We're hardly ever fast to wake up to what might be going on with our data. But once in a while, we're suddenly roused and make a noise.
Such has been the case with the revelation that Facebook manipulated the News Feeds of almost 700,000 people in order to see if it would make them happier or sadder, depending on the content presented.
Now, a former member of Facebook's Data Science team has revealed that, for much of its existence since 2007, the team operated with seemingly little supervision.
Andrew Ledvina, who was on Facebook's team from February 2012 to July 2013, told the Wall Street Journal: "There's no review process, per se. Anyone on that team could run a test. They're always trying to alter people's behavior."
This, if true, might make for a profound surprise to those who somehow believed their data was, indeed, their data.
Ledvina suggested that tests were conducted with such regularity that some scientists worried that the same people's data was being analyzed more than once.
Since the controversial study on human emotions, Facebook has reportedly stiffened its procedures. However, since 2007, the Data Science team has reportedly run hundreds of experiments without users' consent or even knowledge.
In 2012, the company created a 50-person panel of experts in areas such as data security and privacy. (The company won't release the names of these experts.) From the beginning of this year, members of this panel have reviewed all research beyond standard product testing.
A Facebook spokesperson said: "We are taking a very hard look at this process to make more improvements."
Clearly some see great benefits in attempting to understand human behavior better through such constant and everyday activity as Facebook posting.
However, after COO Sheryl Sandberg's expressions of regret and reassurance during a TV interview in India, many questions remain.
During the interview, she said: "Facebook cannot control emotions of users. Facebook will not control emotions of users."
However, my understanding of the results of the experiment, conducted by Facebook and researchers at Cornell and UC San Francisco, is that they showed Facebook can manipulate people's moods.
Indeed, the research report said that though the mood changes seemed small, they still mattered, because "given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences."
Sandberg also insisted that Facebook does research "in a privacy-protected way." But if you have no idea it's going on, how can you be sure your privacy is being protected?
The pace at which social behavior has changed and moved online has inevitably caused enormous amounts of data to be amassed, often in the hands of very few. Facebook isn't alone in seeking to find truths in that data.
But the potential dangers, economic (putting people in a bad mood and then showing them ads for a pick-me-up) and political (skewing news or even moods toward one political side or another), are still evident, even if only theoretical.
The impression given by Ledvina's comments is of a morass of data so inviting to scientists that they paid little regard to the feelings of the people who generated that data.
Perhaps now, though, there might be a greater debate about whether protections need to be far greater than they seem to have been.