
Facebook let shrinks change users’ news feeds in creepy experiment

Jun 30, 2014 07:25 AM EDT

Facebook apparently allowed researchers to manipulate what users see in their news feeds to study how it influences what they post.

If you thought that your Facebook news feed just showed random stuff based on your preferences or settings, you may have been very wrong. It has just come to light that back in 2012 Facebook let psychology researchers manipulate what nearly 700,000 users could see in their news feeds. More specifically, the researchers hid either positive or negative posts in order to determine which would be more emotionally contagious: the cheerful posts or the gloomy ones.

Facebook apparently told researchers that manipulating users' news feeds was routine, and that's how the project got the green light despite the obvious ethical concerns.

The only consent the creepy research required was users clicking "Agree" to Facebook's terms and conditions, which hardly anyone reads in full anyway.

"The researchers reduced the amount of either positive or negative stories that appeared in the news feed of 689,003 randomly selected Facebook users, and found that the so-called 'emotional contagion' effect worked both ways," reads an article published by Cornell University.

"People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates. When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words used in peoples' status updates," explains John Hancock, professor of communication at Cornell's College of Agriculture and Life sciences and co-director of its Social Media Lab.

The study, published in the Proceedings of the National Academy of Sciences (PNAS), describes the methodology and the outcome of the experiment.

"We test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of negative content in the News Feed. When positive expressions were reduced, people produced fewer positive posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

The researchers further highlighted that the whole experiment was in line with Facebook's data use policy, which all users agree to before creating an account on the social network. Consequently, users allegedly gave their informed consent for research. Clicking "Agree" on a lengthy policy, however, hardly amounts to informed consent, but it seems that Facebook was banking precisely on that. How many Facebook users would actually agree to having their news feeds modified and their subsequent social media behavior analyzed by shrinks if they were fully informed of the experiment?
