On Facebook, you control the slant of the news you choose | Science News





Site’s social algorithms play second fiddle when it comes to exposure to opposing political views

2:00pm, May 7, 2015

NEWS SEARCH Facebook’s news ranking system causes only a slight decline in users’ exposure to opposing political views, a new study finds. Users’ own decisions play a bigger role.

Don’t totally blame Facebook for worsening political divisions between liberals and conservatives. Those rifts have more to do with the news you — and your online friends — choose.

The social media site’s news-filtering program shuts out some opposing points of view, but not as much as its users do on their own, researchers report online May 7 in Science.   

Facebook users often click on news links that confirm their political leanings. Yet they frequently bypass posts given high priority and visibility in their news feeds by the site’s social algorithms, data scientists at Facebook in Menlo Park, Calif., report. As a result, only about one-fifth of posts seen by liberals and less than one-third of posts seen by conservatives express opposing political views. Mathematical rules at the heart of social algorithms — used to anticipate what posts

