Researchers from Facebook, Cornell, and UCSF published a paper describing a massive-scale experiment in which 689,000 Facebook users’ news feeds were manipulated to see whether this could induce and spread certain emotional states. They argue it was legal to do this without consent because Facebook’s terms of service require you to consent to, basically, anything.
The affected users had posts from their contacts filtered to suppress either positive or negative content, skewing their feeds toward one emotional tone or the other.
Legal scholar James Grimmelmann points out that there’s a federal law prohibiting universities from conducting this kind of experiment without explicit, separate consent (none of this burying-consent-in-the-fine-print bullshit). Two of the three researchers who worked on the study were at federally funded universities with institutional review boards, and the project received federal funds.
The Federal Policy for the Protection of Human Subjects, a/k/a the Common Rule, requires that informed consent include:
(1) A statement that the study involves research, an explanation of the purposes of the research and the expected duration of the subject’s participation, a description of the procedures to be followed, and identification of any procedures which are experimental;
(2) A description of any reasonably foreseeable risks or discomforts to the subject; …
(7) An explanation of whom to contact for answers to pertinent questions about the research and research subjects’ rights, and whom to contact in the event of a research-related injury to the subject;
(8) A statement that participation is voluntary, refusal to participate will involve no penalty or loss of benefits to which the subject is otherwise entitled, and the subject may discontinue participation at any time without penalty or loss of benefits to which the subject is otherwise entitled.
PNAS – Experimental evidence of massive-scale emotional contagion through social networks
Michelle Meyer has an extended discussion of whether the Common Rule applies to this research.
Even if you don’t buy that Facebook regularly manipulates users’ emotions (and recall, again, that it’s not clear that the experiment in fact did alter users’ emotions), other actors intentionally manipulate our emotions every day. Consider “fear appeals”—ads and other messages intended to shape the recipient’s behavior by making her feel a negative emotion (usually fear, but also sadness or distress). Examples include “scared straight” programs for youth warning of the dangers of alcohol, smoking, and drugs, and singer-songwriter Sarah McLachlan’s ASPCA animal cruelty donation appeal (which I cannot watch without becoming upset—YMMV—and there’s no way on earth I’m being dragged to the “emotional manipulation” that is, according to one critic, The Fault in Our Stars).
Lesson – Just remember that Facebook, advertisers, and anyone else can filter, spin, and lie to you for any reason they want. Also, your Facebook feed is subject to the whims of Facebook.
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.