"The ethical problem with Facebook's emotional experiment"
will start after this message from our sponsors.
The ethical problem with Facebook's emotional experiment
Life is just one big experiment called Facebook.
I'm Bridget Carey, and this is your CNET Update.
The world has yet another reason to distrust Facebook.
More than 680,000 Facebook users were subjected to a psychological experiment.
And it was done without the knowledge of the participants.
The network set out to see if it could manipulate users' emotions, and it worked with academic researchers to conduct a weeklong test back in January of 2012.
Those chosen for the experiment either saw only positive posts in their feed, or only posts with negative words.
The study showed that if you saw mostly happy words, you were a little more positive in what you shared in your status updates.
And you were slightly more negative after a week of doom and gloom.
Now, we learned of this because the study was published in a science journal.
And adding to the ethical problems, the study was funded by the government.
Facebook was toying with people's mental health to learn if advertisements could take advantage of our moods.
If you weren't posting much, well, Facebook cranked up the kittens to get you happy again.
With any study, the typical ethical practice is to get informed consent from your subjects, so people know they're part of an experiment.
Facebook says it was fair game to test us because of one word, hidden deep in the data use policy.
You know, that document that none of us read.
It states that your information may be used for internal operations and research.
When I think of research in that context, I think of Facebook wanting to study where I click, not that I want Facebook to use me as a lab rat to see if it can make me sad.
I wouldn't be surprised if an organization, perhaps the Federal Trade Commission, goes after Facebook for not being clear about how this research data was being used.
But going forward, if enough people speak out and complain, or shy away from using Facebook, it could take a hint and be more transparent about future experiments.
The network is already on shaky ground when it comes to public trust.
And even if you don't mind being tested on, you should care about this.
Because Facebook is collecting health information.
Facebook owns the fitness tracking app called Moves, and as fitness tracking bands and smart watches grow in popularity, we have to wonder, what sort of health information will Facebook have on us, and can we trust what it does with that data?
Well let's switch gears to Google's social network.
No, not Google Plus, not that Buzz thing.
I mean Google's first social network.
It's called Orkut, and after ten years, Google is shutting it down.
Orkut was really big in Brazil and India, but it'll close on September 30th.
Google says it would rather focus its energy on YouTube, Blogger, and Google Plus.
That's your tech news update.
You can always get more details on these stories at cnet.com.
From our studios in New York, I'm Bridget Carey.