Eight Things Congress Doesn’t Get About Facebook

April 16, 2018


Last week, Congress held two hearings with Facebook CEO Mark Zuckerberg on the company’s data collection and privacy policies following the Cambridge Analytica scandal. While some lawmakers prepared for the high-profile hearings with well-thought-out questions, others seemed to misunderstand how social networks and the digital economy work.

Here are eight key ways members of Congress misunderstood the reality of Facebook’s technology, business model, and actions.

1.    Facebook Does Not Sell Your Data to Advertisers

Many people mistakenly believe that Facebook sells its customers’ data. It does not.

Yet several members of Congress seemed to be under the impression that advertisers buy data from Facebook. For example, Sen. Dean Heller (R-NV) asked Zuckerberg, “Have you ever drawn the line on selling data to an advertiser?” It is understandable that some members got this wrong—after all, major news outlets like The Atlantic have published completely misleading descriptions of the social network’s data practices.

So how does advertising on Facebook work? Rather than sell personal data to advertisers, Facebook keeps control of this information and only sells advertisers access to users’ newsfeeds. Advertisers pay to reach an audience based on several factors, such as geographic location, particular interests or characteristics, and behaviors—including the use of other online services. For example, an advertiser could pay to reach politically inclined males in the District of Columbia metro area. However, advertisers can only see aggregate details about the users their campaigns reach. For example, advertisers could see that their ad reached 500 males in the DC area, but they would be unable to see individual information, such as who those individuals are or whether any of them are members of Congress.

In no part of this process does Facebook allow advertisers to access personal information on its users.

2.    Users Choose to Put Their Information on Facebook

Some people claim Facebook is engaged in mass surveillance. This charge is clearly false.

For example, Rep. Bobby Rush (D-IL) went so far as to suggest that “Facebook’s methodology” was similar to that of the Federal Bureau of Investigation under J. Edgar Hoover due to its “surveillance” capabilities.

However, sharing data on Facebook is a feature, not a bug. Each time users post pictures or update their news feeds with a pithy observation or life update, they affirmatively choose to upload that information to the site. And they do so because they want to share that information with other people. Moreover, Facebook gives individuals the ability to restrict who sees this information through its granular privacy controls, such as restricted friends lists. And if users no longer wish to store any of that information on the platform, they can delete it. According to Zuckerberg, Facebook does not keep any of this deleted information.

That is the reason social media should not be compared to government surveillance operations: surveillance isn’t voluntary.

3.    Congress Does Not Know More than Facebook About User Interface Design

Some members seemed convinced that they had a simple fix to improve Facebook’s user interface. It’s not so simple.

Sens. Richard Blumenthal (D-CT) and Ed Markey (D-MA) both pressed Zuckerberg on whether he would support a single affirmative consent standard for data sharing on Facebook. Rep. Mimi Walters (R-CA) said, “it seems to me that putting all privacy control options in a single location would be more user-friendly.” And Rep. Frank Pallone (D-NJ) demanded of Zuckerberg, “Yes or No: Will you commit to changing all user default settings to minimize, to the greatest extent possible, the collection and use of users' data?”

Ironically, Facebook often faces complaints both that it offers too many controls and that users do not have enough control over their personal information. Finding the right balance is not simple. The company spends a considerable amount of time and resources testing its user interface design for different users and different types of devices. Facebook continuously iterates on its design for many different aspects of its service, from changing how users see relevant content to simplifying privacy controls.

Some of these changes can add more complexity by giving users more features to control their information, which can be confusing to users familiar with the old features. Facebook certainly can do more to explain changes to users and to help them update their privacy settings, but many companies struggle with this challenge, and Facebook seems to be getting better at addressing it. But given that the platform serves 2.2 billion users worldwide, many with different levels of literacy, computer experience, and types of devices, not to mention different preferences on how they share personal data, finding an optimal solution is a moving target.

4.    Privacy Policies Are for Lawyers

Many lawmakers questioned why Facebook’s user agreement is not easier for the average consumer to understand.

Rep. Michael Burgess (R-TX) asked, “Can the average layperson look at the terms and conditions and make the evaluation, ‘Is this a strong enough protection for me to enter into this arrangement’?” And Rep. Billy Long (R-MO) lamented the fact that people don’t read terms of service agreements, saying “…if we invited everyone that had read your terms of agreement—terms of service, we could probably fit them at that table.”

What lawmakers seem to miss is the fact that these agreements are so complex for two reasons. First, platforms like Facebook are complex businesses. Second, they are written by and for lawyers because doing otherwise exposes companies to penalties for non-compliance. If companies make even a simple mistake or omission, they could be penalized by regulators like the Federal Trade Commission (FTC). For example, when Nomi Technologies, a company that provides in-store retail analytics, innocuously made a partially incorrect statement in its privacy policy, the company was subjected to a 20-year consent decree by the FTC.

Moreover, these user agreements do not need to be simplified because competitors as well as consumer groups, such as the Electronic Frontier Foundation and the Electronic Privacy Information Center, read and monitor companies’ terms of service and highlight any problems they see. If companies do not adhere to their terms, regulators can and do step in and hold companies accountable.

5.    Online Services Do Not Treat Consumers Badly Because They Aren’t Paying

Many people often use the dismissive phrase “if you’re not paying, you are the product” to suggest that free online services treat their users badly.

This concept popped up a number of times throughout the hearings. For example, Rep. Paul Tonko (D-NY) claimed that in Facebook’s business model “users are the product.” Similarly, Sen. Ron Johnson (R-WI) asked a question about changing Facebook’s business model to a monthly subscription model.

The implication of these lines of questioning is that paying for a service somehow aligns a company’s interests with those of the consumer more than a service whose primary revenue comes from a third party, such as an advertiser. That is not always the case. (See, for example, the airline industry.) Whether a company’s business model is subscription-based, ad-based, or a combination of the two has no bearing on how it treats its consumers.

Moreover, Facebook generates value because it serves a two-sided market, connecting consumers and advertisers. Users get access to a free service, and advertisers get access to an audience for their ads. But this model only works if Facebook creates a platform that keeps consumers engaged. So Facebook has a strong incentive to meet the needs of consumers.

6.    Facebook Shouldn’t Be Responsible for Everything Its Users Post

Some lawmakers think Facebook should be responsible for everything users post on its platform.

For example, Rep. Cathy McMorris Rodgers (R-WA) asked, “So, Mr. Zuckerberg, given the extensive reach of Facebook and its widespread use as a tool of public expression, do you think Facebook has a unique responsibility to ensure that it has clear standards regarding the censorship of content on its platform?” Similarly, Rep. David McKinley (R-WV) questioned the CEO about illegal pharmacies advertising over Facebook, asking “So my question to you is, when are you going to stop and take down these posts that are on illegal digital pharmacies? …if you got all these 20,000 people [reviewing these ads]—you know that they're up there.”

Right now, takedowns occur when users on the website report a post to Facebook, but this process will get better as Facebook develops artificial intelligence tools to automatically identify and remove prohibited content. The problem is that automatically identifying the correct information to take down is not easy. However, over time and with lots of trial and error, platforms will be able to more effectively use algorithms to take down some prohibited content and prioritize how items are displayed in news feeds. For example, Facebook is very effective at automatically flagging and removing terrorist content. But this process can only work if platforms’ minimal legal liability allows a wide margin for error, which leaves open the opportunity to gradually improve these tools.

If lawmakers increased their liability, platforms would likely remove content whenever there is any doubt at all that it is legal. Moreover, companies like Facebook are “damned if they do, and damned if they don’t.” If they take down too little content, they are criticized for enabling all sorts of social harms. But if they take down too much, they are attacked for infringing on free speech.

7.    Sharing Data Outside Facebook is Not Inherently Bad

Many lawmakers wrongly suggested that sharing personal data is always a bad thing.

For example, Rep. Dave Loebsack (D-IL) asked, “Is it possible for you to be in business without sharing the data? Because that's what you have done, whether it was selling or not—sharing the data, providing it to Cambridge Analytica and other folks along the way.”

While data should not be misused—and clearly Cambridge Analytica misused the data it obtained without permission from Facebook—a blanket restriction on data sharing could hurt legitimate research.

Facebook is incredibly useful to scientific research due to its large user base and granular data about social interactions and activity. To support these academic efforts, the company launched a research program in 2009, and since that time, has supported increasing amounts of such research. The benefits of this research are substantial. For example, Colorado State University researchers used Facebook data to improve traditional methods for tracking human smoke exposure from wildfires.

Restricting access to data on that platform would have a significant impact on research. For example, the company recently halted a program that could have shared anonymized user profiles with hospitals to improve medical outcomes in order to focus on “doing a better job of protecting people’s data.” Presumably, this decision is in part because of misconceptions surrounding how Facebook shares data in support of science.

8.    Facebook Does Have Competitors

Several members of Congress argued that Facebook was a monopoly.

For example, Rep. Fred Upton (R-MI) said, “And we know, of course, that you're the dominant social media platform without any true competitor, in all frankness.” Similarly, Sen. Lindsey Graham (R-SC) asked a series of pointed questions about Facebook’s competitors, asking, “If I buy a Ford, and it doesn't work well, and I don't like it, I can buy a Chevy. If I'm upset with Facebook, what's the equivalent product that I can go sign up for?”

Unfortunately, arguments about Facebook’s level of competition miss the point. Market competition is about more than one firm doing the exact same thing as another firm. For example, Blockbuster Video directly competed with Hollywood Video in the market for renting movies. However, it failed to see its real threat: Netflix.

Facebook wears many different hats and competes in several markets. In the ad market—the only relevant market from an antitrust perspective given that Facebook is a free service—Facebook is absolutely not a monopoly. Facebook competes with Google, Microsoft, Amazon, Twitter, Yelp, Snapchat, and more, controlling 20.9 percent of the U.S. digital ad market in 2017. And the digital ad market is just a subset of the larger advertising market. In each of its ventures, Facebook competes against other businesses. For example, Rep. Greg Walden (R-OR) cited a series of videos that Facebook produced and asked the CEO if Facebook is a media company. If this is indeed the case, then Facebook competes for user attention with dozens of services, such as Amazon, Netflix, and YouTube.

Conclusion

Members of Congress shouldn’t feel too bad, if for no other reason than that many others misunderstand or misrepresent how social networks and the digital economy work. Regardless, bad data lead to bad decisions. Congress should get all its facts straight about Facebook before it decides what, if anything, it should do next. To do otherwise would risk unintended and probably harmful consequences from policy proposals that misunderstand the fundamentals of the digital economy.