On Facebook, you control the slant of the news you choose

Site’s social algorithms play second fiddle when it comes to exposure to opposing political views


NEWS SEARCH  Facebook’s news ranking system causes only a slight decline in users’ exposure to opposing political views, a new study finds. Users’ own decisions play a bigger role. 

Tony Webster/Flickr (CC BY 2.0)

Don’t totally blame Facebook for worsening political divisions between liberals and conservatives. Those rifts have more to do with the news you — and your online friends — choose.

The social media site’s news-filtering program shuts out some opposing points of view, but not as much as its users do on their own, researchers report online May 7 in Science.   

Facebook users often click on news links that confirm their political leanings. Yet they frequently bypass posts given high priority and visibility in their news feeds by the site’s social algorithms, data scientists at Facebook in Menlo Park, Calif., report. As a result, only about one-fifth of posts seen by liberals and less than one-third of posts seen by conservatives express opposing political views. Mathematical rules at the heart of social algorithms — used to anticipate what posts each Facebook member most wants to see based on past clicks — play a modest role in shielding people from political outlooks other than their own, the scientists conclude.
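To make that mechanism concrete, here is a minimal sketch, in Python, of the kind of engagement-based ranking the paragraph describes. It is not Facebook's actual algorithm; every name, score and number in it is hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    slant: float  # hypothetical score: -1.0 (very liberal) to +1.0 (very conservative)

def predicted_interest(post, click_history):
    """Score a post by how closely its slant matches the average slant of
    posts the user clicked on before (illustrative, not from the study)."""
    if not click_history:
        return 0.0
    avg = sum(click_history) / len(click_history)
    return 1.0 - abs(post.slant - avg)  # closer to past clicks -> ranked higher

def rank_feed(posts, click_history):
    return sorted(posts, key=lambda p: predicted_interest(p, click_history), reverse=True)

# A user whose click history leans liberal sees congenial stories ranked first.
feed = rank_feed(
    [Post("Tax-cut op-ed", 0.8), Post("Climate report", -0.6), Post("Local news", 0.0)],
    click_history=[-0.7, -0.5, -0.9],
)
for post in feed:
    print(post.title)

Note that a ranker of this kind only reorders what friends have already shared; in the study's framing, the larger filter is which of those links a user then chooses to click.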

That finding challenges the idea that the use of social algorithms leads to politically divided, poorly informed citizens, say Eytan Bakshy and his colleagues.

“People do seem to live in ‘echo chambers’ of like-minded opinion, but Facebook’s algorithms are not to blame,” says computational social scientist Sharad Goel of Stanford University.

Bakshy’s group analyzed the online activity of 10.1 million U.S. Facebook users who had reported their political affiliation on a five-point scale ranging from “very liberal” to “very conservative.” Politically like-minded people tended to be Facebook friends, although an average of 23 percent of an individual’s online friends had opposing viewpoints, the researchers found.

From July 7, 2014, to January 7, 2015, these Facebook users shared 7 million Web links. About 13 percent of these links concerned news and political opinions. The researchers focused on 226,310 stories that had been shared by at least 20 Facebook members.

Facebook’s news ranking resulted in users seeing an average of about 1 percent fewer politically challenging posts than their friends shared. Users’ own choices cut deeper: they clicked on an average of around 4 percent fewer politically clashing posts than their friends shared, Bakshy says.

That doesn’t mean that Facebook users completely shield themselves from opposing views. The researchers found that 24 percent of the links shared by liberals’ friends took conservative stands and 35 percent of such links shared by conservatives’ friends made liberal arguments. Readers clicked on almost as much politically challenging news as their friends provided — 22 percent for liberals and 33 percent for conservatives.
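As a rough check on those figures, the arithmetic can be laid out directly. The short script below simply restates the percentages reported above; the variable names are mine, not the researchers’.

# Cross-cutting share of links among what friends shared vs. what users clicked,
# using the percentages reported in the article (illustrative restatement only).
exposure = {
    "liberals":      {"shared_by_friends": 0.24, "clicked": 0.22},
    "conservatives": {"shared_by_friends": 0.35, "clicked": 0.33},
}

for group, e in exposure.items():
    gap = (e["shared_by_friends"] - e["clicked"]) * 100
    print(f"{group}: friends shared {e['shared_by_friends']:.0%} cross-cutting "
          f"links; users clicked {e['clicked']:.0%}, a gap of {gap:.0f} points")

The roughly two-point gap for each group matches the article’s framing: ranking and personal choice together trim, but do not eliminate, exposure to the other side.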

The data “give us a much more detailed picture of how news is consumed on Facebook than was previously available,” says economist Matthew Gentzkow of the University of Chicago Booth School of Business. As in face-to-face groups, compatible people congregate on Facebook and share politically congenial news with each other, Gentzkow says.

Facebook’s news rankings may become more influential as the company tweaks its social algorithm, cautions political scientist David Lazer of Northeastern University in Boston. Facebook announced changes on April 21 in how it ranks shared news links, including a new formula for identifying friends that each user cares most about. If those friends are particularly in tune politically with users, exposure to links to opposing views will become rarer, Lazer says in a comment also published online May 7 in Science to accompany the new study.

Researchers outside Facebook also need to examine how people interact on the site, Lazer says. Yet Facebook privacy changes instituted last month will make it harder for independent scientists to collect data from the site, he predicts.

Bruce Bower has written about the behavioral sciences for Science News since 1984. He writes about psychology, anthropology, archaeology and mental health issues.
