As the small uproar over the recent Facebook experiment subsides, there will be an unexpected victim. The nascent public/private research cooperative, in which academic researchers work together with private firms, will probably die. At the very least it will go into hibernation and not be seen again for a long time, if ever. That is a terrible tragedy.
Experiments will still go on. Firms always need to know which marketing strategies work and which practices make users more engaged, so they will continue to experiment on how customers react to changes. All of this has always happened and will continue to happen; only, like the NSA data collection, we won’t hear about the results. And because of that, we all lose.
This is truly sad, because as a social scientist I can fully appreciate how hard it is for us to collect data. We have to go through a complex IRB process and make sure that the interests of our subjects are highly protected, that they are informed about an experiment and fully debriefed afterwards. This protects the rights of the subjects, but it also makes it more and more difficult to find participants, since many people have grown wary of any research, thinking that some entity is only trying to “sell” them something. Also, once they are informed, their behavior changes: they become alert to what is happening. Therefore, it is almost impossible to achieve true external validity in any type of laboratory experiment.
However, online behavior presents an interesting space where social scientists can observe and learn. It is a space that can be analyzed (and anonymized) in ways that allow social scientists to observe real behaviors and real interactions in a depth and quantity that was never imagined possible only a few years ago. Working collaboratively with major companies, using their wealth of data, has the potential to truly advance science and help us understand computer-mediated communication in new ways. This can be done in ways that protect the privacy of individuals, but the process must be handled carefully, since metadata can reveal far more about an individual than most people imagine, often more than a name or other personally identifying information would.
This particular experiment, however, crossed a line, and that line is what caused the furor. Facebook purposely manipulated the newsfeed algorithms to measure changes in the emotional tone of the subjects’ posts.
If you haven’t noticed, there are already algorithms in place that manipulate your feed. People whose posts you frequently “like” show up more often. These manipulations make you enjoy reading your newsfeed and bring you back to Facebook. The goal is that you will see the ads, which are also carefully tailored to match your interests. Imagine what would happen if, every time you came to FB, every posting from every person you have friended appeared in your newsfeed…overload. Or imagine that irritating co-worker (whom you don’t dare unfriend) constantly showed up on your wall. You probably wouldn’t come back to check things so much. On the other hand, your best friend always posts great stuff; you can hardly wait to see what witty comment they will make next. You like all of their postings, and surprise! Guess who shows up most often on your newsfeed.
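To make that concrete, here is a minimal sketch of what like-based feed ranking could look like. The data, weighting, and function names are hypothetical illustrations of the idea, not Facebook’s actual algorithm, which is far more complex and not public.

```python
from collections import Counter

# Hypothetical data: how often the viewer has "liked" each friend's posts.
# These counts and weights are illustrative assumptions, not Facebook's.
like_counts = Counter({"best_friend": 42, "cousin": 7, "coworker": 0})

posts = [
    {"author": "coworker", "text": "Another Monday..."},
    {"author": "best_friend", "text": "Witty comment of the day!"},
    {"author": "cousin", "text": "Vacation photos"},
]

def rank_feed(posts, like_counts):
    """Sort posts so that authors the viewer likes most appear first."""
    return sorted(posts, key=lambda p: like_counts[p["author"]], reverse=True)

for post in rank_feed(posts, like_counts):
    print(post["author"], "->", post["text"])
```

Under even this crude scheme, the best friend dominates the feed and the never-liked co-worker sinks to the bottom, which is exactly the dynamic described above.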
But I digress…
This particular algorithm change was made to measure changes in the posts and comments of those whose newsfeeds were manipulated. The participants didn’t know before, during, or after. This is far different from the little surveys that pop up on FB and ask your opinion of something. It was an actual change, an experiment done to measure users’ responses: no permission, no IRB approval, no informing of participants.
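For a sense of what “measuring emotional tone” can mean in practice, here is a crude, hypothetical word-count scoring sketch. The word lists and the scoring function are illustrative stand-ins; published studies typically rely on validated dictionaries such as LIWC rather than anything this simple.

```python
# Hypothetical word lists -- validated tools (e.g., LIWC) use far larger dictionaries.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "lonely"}

def emotional_tone(post: str) -> float:
    """Return a crude tone score: positive minus negative word rate, in [-1, 1]."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

def average_tone(posts):
    """Average tone across a group of posts, e.g., a treatment or control group."""
    return sum(emotional_tone(p) for p in posts) / len(posts)

# Comparing a group shown a manipulated feed against a control group:
control = ["Had a great day!", "Love this weather."]
treated = ["Feeling sad and lonely today.", "What a terrible week."]
print(average_tone(control), average_tone(treated))
```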
If FB had kept the experiment quiet, no one outside the organization would have known the difference. As people find out about it, there is a growing uproar over being treated like lab rats.
Also, everyone in academia is scrutinizing the findings and the study design, and pretty much every word will be picked apart (I am SO glad I am not on that research team). I’m sure the FB liability lawyers are pulling out their hair and deciding that any future research will face major hurdles if researchers EVER want to publish anything. That is why we all lose.
There must be better ways to inform participants while still allowing them to behave naturally. For instance, researchers could offer users $5 in FB credits to participate in research that may or may not happen in the next few months. This would keep people informed, while the uncertainty about timing would allow behavior that is more natural. After the time frame, participants could be told about the experiment and shown the results if they are interested. This would be a more scientific method that would produce better results AND protect the subjects.
I truly hope that this is not the end of academic/business research alliances on FB or in other computer-mediated communication spaces. But I do hope that it is the end of uninformed experiments. There are better ways to harness this tremendously important resource.