[As Facebook gobbles up more & more of folks' time, this article hints at the possible scope of widespread cultural manipulation bought & paid for by your friendly PTB — chris]
By Robinson Meyer/The Atlantic
The Atlantic Article HERE:
Facebook’s News Feed—the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone—is not a perfect mirror of the world.
But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.
We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
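To make the mechanics concrete, here is a minimal, hypothetical sketch of the kind of word-list classification the paragraph above describes: counting positive and negative words in a post and labeling it accordingly. This is not the researchers' actual code; the word lists, function name, and the filtering note at the end are illustrative assumptions only.

# Hypothetical sketch: label a post "positive" or "negative" by counting
# words against fixed word lists. The lists below are illustrative, not
# the ones the study actually used.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "lonely"}

def classify_post(text: str) -> str:
    """Label a post by whichever word list it matches more often."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# In an experiment like the one described, posts labeled "negative" could
# then be withheld from a user's News Feed with some probability (or posts
# labeled "positive", in the opposite condition).
print(classify_post("So happy and excited about this wonderful day!"))  # positive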
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
The experiment is almost certainly legal. In the company’s current terms of service, Facebook users relinquish the use of their data for “data analysis, testing, [and] research.” Is it ethical, though? Since news of the study first emerged, I’ve seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.
We’re tracking the ethical, legal, and philosophical response to this Facebook experiment here. We’ve also asked the authors of the study for comment.
Author Jamie Guillory replied and referred us to a Facebook spokesman. Early Sunday morning, a Facebook spokesman sent this comment in an email:
“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
What did the paper itself find? The study found that by manipulating the News Feeds displayed to 689,003 Facebook users, it could affect the content those users posted to Facebook. More negative News Feeds led to more negative status messages, and more positive News Feeds led to more positive statuses.
As far as the study was concerned, this meant that it had shown “that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” It touts that this emotional contagion can be achieved without “direct interaction between people” (because the unwitting subjects were only seeing each other’s News Feeds). The researchers add that at no point during the experiment could they read individual users’ posts. [Yeah, sure—whatever you say...]
Two interesting things stuck out to me in the study.
REST OF THE COVERAGE OF THE FB PSYOP HERE: