
1 Facebook’s Mood Manipulation Experiment Darren King

2 The Facts In January 2012, Facebook researchers carried out a week-long experiment on 689,003 randomly selected users. The experiment was conducted without prior consent from these users: Facebook altered the information that appeared in their newsfeeds.

3 The Experiment In January 2012, 689,003 randomly selected users had their Facebook newsfeeds altered. Some users had more negative information placed in their timelines while others received more positive information. These users’ Facebook posts were then monitored for one week by the Linguistic Inquiry and Word Count software (LIWC 2007).

4 LIWC 2007 counted positive and negative words in the users’ Facebook posts. “Posts were determined to be positive or negative if they contained at least one positive or negative word” [Kramer, Guillory and Hancock (2014)]. Each positive word received a +1 score on the positive scale and each negative word received a +1 score on the negative scale.

5 Flaws in the experiment There were some limitations in the scoring system used for the experiment. The software could not understand sarcasm or the context of a statement; it could only count the positive or negative words. E.g. “I am not having a great day”. To a human reader this statement is clearly negative. However, LIWC would give a +1 score on the positive scale for ‘great’ and a +1 on the negative scale for ‘not’. The two scores cancel each other out and the statement is classed as neutral, which is incorrect.
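The word-count scoring described above, and the cancellation flaw, can be sketched in a few lines of Python. This is an illustrative toy, not the actual LIWC 2007 implementation; the tiny word lists here are placeholder assumptions standing in for LIWC's real dictionaries.

```python
# Placeholder word lists -- LIWC 2007 uses much larger dictionaries.
POSITIVE_WORDS = {"great", "happy", "love", "good"}
NEGATIVE_WORDS = {"not", "sad", "hate", "bad"}

def score_post(post):
    """Return (positive_score, negative_score): +1 per matching word."""
    words = post.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    return pos, neg

# The flaw in action: a clearly negative sentence scores as mixed,
# because 'great' (+1 positive) and 'not' (+1 negative) cancel out.
print(score_post("I am not having a great day"))  # (1, 1)
```

A pure word count like this has no model of negation, so "not having a great day" and "having a great day" both contribute one positive hit for 'great'.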

6 What was the reason for the experiment? The main purpose of this experiment was to investigate whether “emotional contagion” [Kramer, Guillory and Hancock (2014)] was possible through the medium of social networks, i.e. whether a person’s mood could be manipulated by the content that they saw in their Facebook newsfeed.

7 The Results In total over three million Facebook posts were analysed by researchers. The study found that “more negative newsfeeds led to more negative status messages [and] more positive newsfeeds led to more positive statuses” [Meyer (2014)]. It also found that when researchers reduced the amount of positive or negative information in the selected users’ newsfeeds, those people reduced the number of words that they posted on Facebook.

8 The Results The research paper was published in March 2014. Title: Experimental evidence of massive-scale emotional contagion through social networks. Authors: Adam D. I. Kramer (Facebook Data Scientist), Jamie E. Guillory (Centre for Tobacco Control Research) and Jeffrey T. Hancock (Departments of Communication and Information Science at Cornell University).

9 Legal Implications The experiment divided opinion when its results were published. Some were fascinated by the idea of emotional contagion while others were upset that Facebook had invaded their privacy. Several legal and ethical issues were raised surrounding the experiment, the major legal issue being whether Facebook was allowed to manipulate user timelines without prior user consent.

10 User Consent Facebook defended the experiment by claiming that users gave consent by agreeing to the site’s Data Use Policy when they first signed up. This policy states that Facebook “may use the information that we receive about you…for internal operations, including troubleshooting, data analysis, testing, research and service improvement” [Solove (2014)]. However, opponents of the experiment argue that the Data Use Policy allows observation of existing user behaviour but does not allow manipulation of the data that Facebook users see.

11 Defence of the experiment While many in the general public were opposed to the experiment, a few came out in its defence. “It's not clear what the notion that Facebook users' experience is being 'manipulated' really even means, because the Facebook news feed is, and has always been, a completely contrived environment.” Tal Yarkoni (Research Assistant Professor at the University of Texas) [Solove (2014)]. Yarkoni also points out that the “items you get to see are determined by a complex and ever-changing algorithm” [Solove (2014)]. Yarkoni’s comments show that the data in users’ newsfeeds is constantly being manipulated, and so the public outcry over the experiment would appear to be quite harsh on Facebook.

12 Ethical Implications An ethical question was also raised surrounding Facebook’s experiment: had Facebook been given permission by an independent ethics review board to carry it out? The academic paper was met with strong disapproval from many people when it was published in March 2014, mainly because the experiment involved an invasion of Facebook users’ privacy. Many people began to question how Facebook had managed to get approval for the study, or whether it had gotten any approval at all.

13 It later emerged in June 2014 that “the experiment was conducted before an IRB [Institutional Review Board] was consulted” [Meyer (2014)]. The researchers submitted the experiment to the Cornell University Institutional Review Board for ethical approval only after they had conducted it.

14 However, Cornell University concluded that, since the researchers could not see individual users’ confidential data, the experiment “was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required” [Meyer (2014)]. Co-author of the paper Jamie Guillory confirmed in an email to Meyer that “none of the data used was associated with a specific person’s Facebook account” [Meyer (2014)]. Thus it would appear that Facebook did not break any ethical rules. However, the experiment is currently under investigation in the UK to determine whether any laws were broken and whether legal action against Facebook is required [Webber & Reilly (2014)].

15 Conclusion The results of Facebook’s experiment showed that emotional contagion was indeed possible through the medium of social networks. However, the experiment provoked widespread criticism of Facebook and its Data Use Policy. It also raised both legal and ethical issues which may yet lead to legal action against Facebook. Co-author of the paper Adam D. I. Kramer commented that “In hindsight, the research benefits of the paper may not have justified all of this anxiety” [Meyer (2014)].

16 References
Ref 1: Adam D. I. Kramer, Jamie E. Guillory and Jeffrey T. Hancock (2014). Experimental evidence of massive-scale emotional contagion through social networks. http://www.pnas.org/content/111/24/8788.full
Ref 2: Robinson Meyer (2014). Everything we know about Facebook’s secret mood manipulation experiment. http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/
Ref 3: Daniel J. Solove (2014). http://www.huffingtonpost.com/daniel-j-solove/facebook-psych-experiment_b_5545372.html
Ref 4: Harrison Webber, Richard Byrne Reilly (2014). Facebook’s mood manipulation experiment under investigation in UK. http://venturebeat.com/2014/07/02/facebooks-mood-manipulation-experiment-may-be-under-investigation-in-u-k-ireland/

