Backlash over Facebook’s ‘unethical’ secret study

Facebook secretly manipulated the feelings of 700,000 users to understand “emotional contagion” in a study that prompted anger and forced the social networking giant onto the defensive.

For one week in 2012 – and without the explicit consent or knowledge of users – Facebook tampered with the algorithm used to place posts into users’ news feeds in order to study how this affected their mood.

The researchers wanted to see if the number of positive or negative words in messages they read affected whether users then posted positive or negative content in their status updates.

The study, conducted by researchers from Facebook, Cornell University and the University of California, San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.

Results of the study spread when the online magazine Slate and The Atlantic website wrote about it on Saturday.

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study’s authors wrote.

“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

While other research has used metadata to study trends, this study appears to be unique because it manipulated what users saw to see if there was a reaction.

Methods ‘consistent with data use policy’: researchers

Facebook, which says that it has more than 1 billion active users, said in a statement: “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account.”

“We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.

“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.

“We carefully consider what research we do and have a strong internal review process.”

In the paper, the researchers said the study “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook”.

David Vaile, co-convenor of the Cyberspace Law and Policy Community at the University of New South Wales law faculty, says that while the experiment was not illegal, it was unethical.

“Any sort of university or established researcher would typically have to take this to an ethics panel and get ethics clearance and first thing they’d ask is ‘where’s the informed individual consent?’,” Mr Vaile told ABC’s AM program.

“So the issue here as well as not being informed consent for the individuals, is also the question of research ethics – that you know they’re messing with real people’s lives.

“And let’s not forget that some negative experience[s] online have real consequences for people.

“At worst case people get desperate or … end up feeling depressed; people have committed suicide from terrible things that have happened to them online.”

Users angered, unsurprised by Facebook experiment

Some Facebook users took to social media to express their reactions to the experiment, using the hashtag #FacebookExperiment.

ABC/AFP