Facebook has announced changes to its research methods, following a broadly condemned emotions study from earlier this year.
That study, which Facebook published in June, triggered backlash from Facebook users and the scientific community alike. Critics said that by experimenting on users without their consent, Facebook violated U.S. ethical guidelines for human subject research.
In a blog post Thursday, Facebook said it is changing the way it conducts research. It will implement stricter, clearer guidelines, subject to more thorough review by a panel of Facebook's research and privacy staff. The company will also provide more specific training for researchers and engineers.
But critics around the web still worry Facebook's measures are insufficient. The New York Times quotes Marc Rotenberg, president of the Electronic Privacy Information Center:
“Tightening up research practices is a step in the right direction. But human subject research requires consent and independent review. It does not appear that Facebook has taken those steps.”
Users also aren’t given a clear method to opt in or out of research, writes PCWorld, “which is essential when you're manipulating a user's experience. The network should make participation in research optional—a privacy setting you can change like any other.”
Instead, ReadWrite suspects Facebook still derives consent from its existing user agreement.
“If people continue to agree to the data use policy, which few people even read, it appears that simply using Facebook is enough of a consent for future research.”
In any case, consent and ethics will likely stay a hot-button issue for the social network as it moves forward, especially if reports such as this one from The Verge are to be believed:
Facebook is reportedly planning its own suite of health-monitoring apps to compete with those of Apple and Google. It's not clear how the network will handle privacy, though the service is expected to debut "quietly."