Main result of Facebook emotion study: less trust in Facebook

A new study that manipulated emotional messages on Facebook gets a big thumbs-down from Facebook users and may also amplify public distrust of behavioral research, a distrust fed by decades of deceptive laboratory studies.

Psychologists secretly toyed with Facebook users’ emotions in 2012, published their findings last month and got scorched by a social media firestorm they never saw coming.

What goes around comes around.

The researchers wanted to see if emotions spread through online social networks. Apparently, negative emotions spread really fast on the Internet. Congratulations, guys, you’re on to something.

Public anger has appropriately focused on scientists’ ethical breach in covertly trying to manipulate people’s moods. A team led by social psychologist and Facebook data scientist Adam Kramer altered the emotional content of postings in daily news feeds — the primary forums for seeing what one’s Facebook friends have posted — during one week for 0.04 percent of users. That’s 689,003 individuals.

When friends’ positive posts were surreptitiously weeded out of news feeds to varying extents, people wrote slightly fewer positive posts and more negative ones. The reverse occurred when friends’ negative posts were unknowingly removed. In both groups, people produced an average of one fewer emotional word per thousand words over the following week. The effect was statistically small but could have big consequences across the many interpersonal connections in a massive social web, Kramer’s team concluded in the June 17 Proceedings of the National Academy of Sciences.

By creating an account, Facebook users agree to an online statement that gives the site permission to use their personal information for research. To the researchers, that agreement constituted informed consent for the study.

There’s much to be disturbed by here. No one knows how many people read the online statement or, if they did, understood its implications. The definition of informed consent for members of online communities has barely been addressed by ethicists and scientists.

Academic panels that assess the ethics of research on people, called institutional review boards, offer no easy answers for digital investigators. IRBs have yet to develop guidelines for obtaining informed consent in online studies. And it’s not clear whether university IRBs can regulate collaborations between university scientists and commercial enterprises. Even the journal that published the new study agrees that it is “a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.”

An unannounced change to the digital code controlling what appears in Facebook users’ news feeds may be an “implicit violation” of the site’s contract with users who expect something else entirely, says psychologist Ralph Hertwig of the Max Planck Institute for Human Development in Berlin. Some users may view news feeds as random collections of recent posts. Others might regard them as collections of the “best” new posts. That would be worth studying.

Hertwig has long criticized social psychologists’ penchant for deceiving college students and others in the name of science (SN Online: 10/22/10). The most infamous such experiment occurred more than 50 years ago, when volunteers administered what they thought were real electrical shocks to an unseen person who wasn’t really shocked but could be heard screaming in mock agony. In a 2010 study, experimenters had participants complete a sham questionnaire, falsely told them they had expressed a preference for counterfeit products and then gave them expensive sunglasses labeled as counterfeits. The researchers’ aim was to show that people who wear knockoff items feel phony and become more likely to cheat.

Students who are tricked in lab experiments and debriefed afterward — as required by the American Psychological Association — frequently lose their trust in researchers and spread the word to other potential research subjects about psychologists’ devious ways, Hertwig argues. Psychologists end up not knowing whether they’re manipulating study participants or getting played by them.

Trust in researchers gets tainted on a much larger scale by well-intentioned ruses pulled on massive online communities, Hertwig says. Fallout from the new Facebook study may lead users to drop their accounts, monitor the content of their posts, refuse to participate in future studies, even those involving no trickery, and otherwise make life difficult for investigators. Kramer’s paper doesn’t address this issue.

Concerns that the researchers actually altered Facebook users’ emotions are misplaced, Hertwig adds. The statistical effect of their manipulation on the number of positive and negative words in posts is “ridiculously small.” And there is no evidence that what they did changed social networks or altered anyone’s emotional state — at least until people realized that they had been hoodwinked.

At a time when psychologists are admirably trying to improve their statistical and research practices, the new Facebook study hammers home the need to think carefully about how deceptive investigations can corrode public trust in science.

Follow me on Twitter: @Bruce_Bower

Bruce Bower has written about the behavioral sciences for Science News since 1984. He writes about psychology, anthropology, archaeology and mental health issues.
