Facebook Study Says Users Control What They See, But Critics Disagree

In the end, the slant of the news you see may be the slant of the news you seek. Even if most Facebook users still don’t know that their news feeds are filtered by an algorithm, they may have more influence over their feeds than they realize.

That was the finding of a controversial peer-reviewed study published last week in the journal Science by three Facebook data scientists, who analyzed 10.1 million accounts to determine how often people were exposed to political views different from their own. The data scientists concluded that if people on the world's largest social network mostly see news and updates from friends who support their own political ideology, it's primarily because of their own choices -- not the company's algorithm.

Although Facebook’s News Feed algorithm “surfaces content that is slightly more aligned with an individual's own ideology (based on that person's actions on Facebook)," the authors wrote in a blog post outlining their findings, "who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter.”

The scientists' conclusion -- and the way they reached it -- caused an uproar, with academics criticizing the methodology and cautioning that the risk of polarized "echo chambers" on Facebook remains real. Facebook's algorithm has long been scrutinized by political scientists and open government advocates who worry that Facebook and other modern communications technologies increasingly enable people to insulate themselves from news and views different from their own.

One of the scientists behind the study defended the research against its critics in an interview with The Huffington Post.

“What we do show, very definitively, is that individuals do have diverse social networks,” said Eytan Bakshy, who wrote the study along with Lada Adamic and Solomon Messing. “The majority of people do have more friends from the other side than many have speculated. We're putting facts behind this.”

According to the researchers' analysis of Facebook users who self-report a liberal or conservative affiliation (more on that later):

  • "On average, 23 percent of people's friends claim an opposing political ideology.
  • Of the hard news content that people's friends share, 29.5 percent of it cuts across ideological lines.
  • When it comes to what people see in the News Feed, 28.5 percent of the hard news encountered cuts across ideological lines, on average.
  • 24.9 percent of the hard news content that people actually clicked on was cross-cutting."

The following figure from the study illustrates the authors' conclusions:

Source: Science Magazine. "Fig. 3 (A) Illustration of how algorithmic ranking and individual choice affects the proportion of ideologically cross-cutting content that individuals encounter. Gray circles illustrate the content present at each stage in the media exposure process. (B) Average ideological diversity of content (i) shared by random others (random), (ii) shared by friends (potential from network), (iii) actually appeared in users' News Feeds (exposed), (iv) users clicked on (selected)."
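
To make that funnel concrete, here is a rough back-of-the-envelope sketch (in Python) comparing the pooled averages quoted in the list above. The stage labels follow the figure caption, and the arithmetic is illustrative only -- the study reports these effects separately for liberals and conservatives, not just as averages.

```python
# Rough comparison of the funnel stages described in the figure caption,
# using the pooled averages quoted in the list above. Illustrative
# arithmetic only -- the study breaks these effects out separately for
# liberals and conservatives.

stages = [
    ("potential (shared by friends)", 29.5),  # % of hard news that is cross-cutting
    ("exposed (shown in News Feed)", 28.5),
    ("selected (clicked by users)", 24.9),
]

for (prev_name, prev_pct), (curr_name, curr_pct) in zip(stages, stages[1:]):
    drop = 1 - curr_pct / prev_pct
    print(f"{prev_name} -> {curr_name}: ~{drop:.1%} relative reduction")

# Approximate output:
#   potential -> exposed:  ~3.4% relative reduction (attributable to ranking)
#   exposed -> selected:  ~12.6% relative reduction (attributable to users' clicks)
```

On these pooled averages, the reduction that comes from users' own clicking is larger than the reduction that comes from ranking -- the comparison the authors lean on. The critics' objection, taken up below, is that the algorithmic reduction is still real, and it is the part Facebook controls.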

The controversy over the study reflects a political and social reality in the U.S.: The electorate has grown more politically polarized over recent decades, that polarization drives more partisan media consumption, and partisan media consumption in turn reinforces polarized views. If using Facebook exacerbates or accelerates polarization in the electorate, skeptics believe, "hyper-partisanship" will get worse, further hindering the ability of lawmakers to compromise.

[Chart: Political polarization among politically engaged Americans]

Although consumers have long preferred newspapers, talk radio and cable TV networks that share their political views, the rise of digital platforms like Facebook gives people far more tools to rank and personalize what they see. The platforms themselves use algorithms to shape what users see on screens big and small.

On the one hand, better software can give users powerful tools to navigate a firehose of pictures, videos and text updates from friends and the never-ending stream of news. Facebook, for its part, emphasizes users' ability to make their own choices about how to handle information overload.

"Part of this is somewhat orthogonal, in terms of what users are seeing,” said Bakshy. “We are showing that they have agency within the context of what they see. Individuals don't have time and attention. If something is 'filtered out,' it's because they're not spending enough time on Facebook."

On the other hand, Facebook's power to influence what information people are exposed to has put the social networking site in the spotlight. Indeed, Facebook itself has an interest in convincing people it doesn't contribute to polarization. The social network has well over a billion users worldwide. Its vast reach and the engagement of its users (American users now spend an average of 40 minutes every day on the site) give it immense power -- and that power has some people worried.

Nathan Jurgenson, a sociologist and researcher for Snapchat, argues that whether or not Facebook is fair and balanced in what it shows users is crucially important.

"Facebook's ideological push to dismiss the very real ways they structure what users see and do is the company attempting to simultaneously embrace control and evade responsibility," he wrote after the Facebook study came out. "Their news team doesn't need to be competent in journalism because they don't see themselves as doing journalism. But Facebook is doing journalism, and the way they code their algorithms and the rest of the site is structuring and shaping personal, social, and civic life."

"Like it or not," Jurgenson continued, "we've collectively handed very important civic roles to social media companies, the one I work for included, and a real danger is that we can't hope for them to be competent at these jobs when they wont even admit to doing them."

Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill, says that if Facebook's algorithm is modestly suppressing content diversity, it matters how and for whom that's happening.

"Facebook researchers are not studying some neutral phenomenon that exists outside of Facebook’s control. The algorithm is designed by Facebook, and is occasionally re-arranged, sometimes to the devastation of groups who cannot pay-to-play for that all important positioning," she wrote on Medium. "I’m glad that Facebook is choosing to publish such findings, but I cannot but shake my head about how the real findings are buried, and irrelevant comparisons take up the conclusion."

Tufekci also authored a column for HuffPost in which she took the tech press to task for its coverage of the study, highlighting findings in the report that she says support the argument that Facebook's algorithm contributes to a problematic "echo chamber" effect.

"The study found that the algorithm suppresses the diversity of the content you see in your feed by occasionally hiding items that you may disagree with and letting through the ones you are likely to agree with," she wrote. "The effect wasn't all or nothing: for self-identified liberals, one in 13 diverse news stories were removed, for example. Overall, this confirms what many of us had suspected: that the Facebook algorithm is biased towards producing agreement, not dissent."

Even simple features can have far-reaching effects, like improving voter turnout. As Harvard Law School professor Jonathan Zittrain wrote last year, Facebook could decide an election without anyone knowing it by notifying some users that their friends had voted but not others. While no evidence of “digital gerrymandering” has ever come to light, the risk is enough that people are keeping a close eye on how Facebook implements its efforts to register voters and encourage them to participate in the democratic process.

As more and more news discovery occurs on Facebook, what people see or share has a growing effect on public knowledge and society writ large. That means that what happens there is worth watching carefully. If Facebook opens up more of its data to independent researchers and is transparent about the conditions its own scientists work under, observers believe the public interest value of the work will be enhanced.

To their credit, Adamic, Bakshy and Messing published the data and code used in their study online so that other researchers can replicate it. But some academics have been critical of how the scientists conducted, framed or published their research.

David Lazer, a professor at Northeastern University, gave Facebook points for conducting the study -- but warned that there is a need for independent analysis of social networks' data.

“It is laudable that Facebook supported this research and has invested in the public good of general scientific knowledge. Indeed, the information age hegemons should proactively support research on the ethical implications of the systems that they build,” Lazer wrote in Science.

“Facebook deserves great credit for building a talented research group and for conducting this research in a public way," he continued. "But there is a broader need for scientists to study these systems in a manner that is independent of the Facebooks of the world. There will be a need at times to speak truth to power, for knowledgeable individuals with appropriate data and analytic skills to act as social critics of this new social order."

Eszter Hargittai, a professor in the Communication Studies Department and Faculty Associate of the Institute for Policy Research at Northwestern University, questioned why the methodology was not more prominently featured, noting that the paper’s supplementary materials contained the key detail that the research was not representative of all Facebook users -- just frequent users who identified their political leanings.

"This means that only about 4% of FB users were eligible for the study," she said, "but it's even less than that, because the user had to log in at least '4/7 days per week', which removes approximately 30% of users."

Bakshy addressed this criticism in the interview with HuffPost.

"Yes, approximately 4% of all Facebook users have identified their ideology, but the thing to keep in mind is that we were not interested in effects for all users," he said. "We wanted to measure [the] extent to which people are exposed to viewpoints. People need to have viewpoints to do that. Furthermore, this was only a study on 18-year-olds and older. Furthermore, we were interested in people active on Facebook, or the people who have the potential to scroll down. Facebook will show you all stories from friends as long as you scroll down the way. Content is only in the newsfeed for some amount of time. If people are only logging in once a week, you can't even consider them. This is why we considered people logging in more than once a week. Once we consider this population, then the percentage of people is much higher."

In addition, Bakshy expressed surprise that publishing information in supplemental materials would be viewed as “hiding" them. The study's supplemental notes indicate that the research was not representative of all users of the network. It's also just about Facebook: We shouldn't extend its conclusions to other social networks, like Twitter, Pinterest or Tumblr.

“This is one of the most visible journals on the planet," Bakshy said. "One summarizes the result in the paper and puts the evidence in the supplemental material. The dataset contains all sorts of facts that are not in the paper. We would like other researchers to engage in this work.”

Bakshy added that a lot of the critics have taken the goals of the paper out of context.

“We're very clear about what the population is,” he said. “We have distributed the paper to many colleagues inside and outside of Facebook. Nobody has criticized the population before. A lot of the criticism has come from a small number of academics who are not necessarily political scientists. If you read the blog post, that's what's intended for public audience or members of the press. We're very clear about the meaning: It's about what content people are exposed to."

Bakshy strongly emphasized the independence of the research and its conclusion. “A lot of this is an extension of work I started when a Ph.D student at the University of Michigan,” he said, referring to a paper he wrote on strong and weak social ties and information diffusion. “We are more likely to be exposed to information through weak ties that's more dissimilar to our views. That stands in opposition to the echo chamber idea. This project was started independently. The study was not commissioned by Facebook.”

Christian Sandvig, professor of communications at the University of Michigan, was also sharply critical of the way the study was presented, calling it Facebook’s "It's Not Our Fault" Study.

"Facebook is a private corporation with a terrible public relations problem. It is periodically rated one of the least popular companies in existence," he wrote. "It is currently facing serious government investigations into illegal practices in many countries, some of which stem from the manipulation of its news feed algorithm."

"In this context," Sandvig continued, "I have to say that it doesn’t seem wise for these Facebook researchers to have spun these data so hard in this direction, which I would summarize as: the algorithm is less selective and less polarizing. Particularly when the research finding in their own study is actually that the Facebook algorithm is modestly more selective and more polarizing than living your life without it."

This criticism isn't just academic. We need to understand how these networks work and how to use them in ways that are healthy for us and our communities. More and more of how we live, work, play and govern is mediated by software: Algorithms are powerful.

"They mediate more and more of what we do," Eli Pariser, the author of "The Filter Bubble" and co-founder of Upworthy.com, wrote on Medium. "They guide an increasing proportion of our choices -- where to eat, where to sleep, who to sleep with, and what to read. From Google to Yelp to Facebook, they help shape what we know." This gives the humans who create, modify, and maintain those algorithms enormous power. Humans built and maintain Facebook's newsfeed algorithm, just as humans built and maintain Google's search algorithms."

Bakshy insisted that the power remains with users. “People have choices in [the] set of stories that they see,” he said. “If they want to consume more from opposite side, they can."
