Facebook said Thursday it's increasing scrutiny over future research work it does on the social-media site, following a rash of criticism this past summer over one of its experiments.
Facebook, the biggest social-media company in the world, faced a backlash after researchers from the company and two universities published a paper earlier this year on their 2012 study, which altered what some users saw in their news feeds to find out whether the changes would affect those users' emotions. They found that if someone sees happy posts from friends, that person's posts in turn tend to be happier, while the opposite was true if someone saw friends writing more negative posts.
While Facebook, which hosts over 1 billion users, has terms of service stating it may gather user information on the site for research, some privacy advocates and users claimed the company crossed a line and altered its users' emotions without first disclosing what it was doing.
"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Facebook Chief Technology Officer Mike Schroepfer said in a post published Thursday. "It is clear now that there are things we should have done differently."
A new panel will do an "enhanced review" for any research on its site relating to content "that may be considered deeply personal (such as emotions)" or if the work focuses on "studying particular groups or populations (such as people of a certain age)," Schroepfer said. Additional review also will be necessary if the research involves a collaboration with someone in the academic community.
The new panel will include senior-level researchers, along with people from Facebook's engineering, research, legal, privacy and policy teams. This panel is in addition to an existing privacy review for products and research. The changes will also include more education and training, as well as a research website to provide one location for Facebook's published academic research.
In Thursday's post, Schroepfer said his company failed to communicate clearly why and how it did the mood study, which he explained stemmed from prior studies suggesting that when people saw positive posts from friends on Facebook, they in turn felt bad. "We thought it was important to look into this, to see if this assertion was valid and to see if there was anything we should change about Facebook," he said.
However, he added, Facebook should have considered non-experimental ways to conduct its research, and the work could have been given more rigorous review.
"We're committed to doing research to make Facebook better, but we want to do it in the most responsible way," Schroepfer said.