On Facebook, you control the slant of the news you choose
Site’s social algorithms play second fiddle when it comes to exposure to opposing political views
By Bruce Bower
Don’t totally blame Facebook for worsening political divisions between liberals and conservatives. Those rifts have more to do with the news you — and your online friends — choose.
The social media site’s news-filtering program shuts out some opposing points of view, but not as much as its users do on their own, researchers report online May 7 in Science.
Facebook users often click on news links that confirm their political leanings. Yet they frequently bypass posts that the site's social algorithms rank highly and display prominently in their news feeds, data scientists at Facebook in Menlo Park, Calif., report. As a result, only about one-fifth of posts seen by liberals, and less than one-third of posts seen by conservatives, express opposing political views. The mathematical rules at the heart of those algorithms, which predict what posts each Facebook member most wants to see based on past clicks, play only a modest role in shielding people from political outlooks other than their own, the scientists conclude.
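The kind of click-driven ranking described above can be illustrated with a toy sketch. This is not Facebook's actual algorithm (which is proprietary and far more complex); the function, data, and scoring rule here are invented purely to show the general idea of prioritizing posts from sources a user has clicked before:

```python
# Toy illustration only, NOT Facebook's real news feed algorithm:
# rank posts by how often the user clicked each source in the past.
from collections import Counter

def rank_feed(posts, click_history):
    """Order posts so sources the user clicked most appear first.

    posts: list of (source, headline) tuples
    click_history: list of source names the user previously clicked
    """
    clicks = Counter(click_history)
    # Higher past-click count -> earlier position in the feed.
    # sorted() is stable, so ties keep their original order.
    return sorted(posts, key=lambda post: clicks[post[0]], reverse=True)

feed = [
    ("SiteA", "Liberal take on the news"),
    ("SiteB", "Conservative take on the news"),
    ("SiteA", "Another liberal take"),
]
history = ["SiteA", "SiteA", "SiteB"]  # user clicked SiteA twice, SiteB once

print(rank_feed(feed, history))
# SiteA's posts surface first because the user clicked that source more often
```

Under this kind of rule, a user who mostly clicks like-minded sources sees ever more of them, which is the feedback loop the study set out to measure.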