Over the past two weeks, an important debate has taken place about the ethics of a study published in the Proceedings of the National Academy of Sciences by researchers at Facebook Data Science and Cornell University. In the study, researchers manipulated some parameters in news feeds to evaluate how the changes influenced readers’ moods as defined by their subsequent posts. While it is easy to get lost in the weeds of this debate, the controversy has raised significant questions about the role of corporations like Facebook in the production of public science.
For the record, I’m a Facebook partisan. I have written papers with people from the Data Science team, have two Ph.D. students doing internships at Facebook this summer, and believe that Facebook provides a valuable service to people. That’s not to say that it never makes mistakes, or that we shouldn’t question Facebook’s power to construct the architecture of so much human communication.
Among many specific concerns, some commentators worry about the consolidation of so much social-science data in the hands of one corporation. Others have called on Facebook and others to conduct their research in accordance with common techniques in academe.
I’m more concerned, however, that Facebook, burned by sensationalistic media and the hasty condemnations of some scholars, will decide to stop doing public-facing research. That would be a real blow to science.
Why do companies like Facebook participate in the production of public science? Isn’t the safe route to conduct internal tests and not participate in academic publishing? Probably, but here’s the thing: Corporate researchers come out of the same university programs as academics. Publishing is baked into their genetic code, and companies like Facebook know that to attract really smart Ph.D.’s, it helps to provide the opportunity to publish in scientific outlets.
Furthermore, corporations see academic publishing as a way to give back. I know that may seem hopelessly naïve, but in conversations I’ve had with people at Facebook and elsewhere, researchers speak about sharing insights as a way of being good citizens. I’m sure the legal and PR factions in such corporations feel a twinge at the risk versus reward, but many in the private sector regard science as a good thing and believe that contributing to it helps the world. This point has been almost entirely lost in the coverage about Facebook’s emotional-contagion study. Many people assume that any science done by Facebook must have a selfish motivation.
Simply put, corporate participation in public science is hugely valuable. How so?
1. Corporations allow us to test the validity of our theories. There is no analogue to Facebook in the academic world. Whether in our labs or in the limited systems we’ve been able to create to test people in the field, we can’t approach the diversity or validity of interactions among people on sites like Facebook.
2. Public science increases public understanding of social media. Part of the reason some people were upset by this Facebook study was that they weren’t aware of the algorithms already embedded in the news feed. More public understanding about the power of algorithms is a good thing.
3. Corporations provide an opportunity to put social science into effect. Early social scientists believed that understanding humans could help create a better world. Now, except for some of our friends in economics, scholars often eschew the practical outcomes of social-science research. Facebook and similar companies—by directly using social-science research to create systems that mediate and supplement communication—provide opportunities for those scholars who see a value in mixing pure science and practical effects.
4. Corporations provide support for social science when public funding is waning. Companies like Facebook are not only investing in social science but also providing valuable, meaningful jobs for university-trained researchers. (Which is not to say we should give up on public funding of social science.)
For those reasons, I think Facebook should be lauded for participating in public science. But will that resolve hold up to the attacks against it?
Either way, there are several things we should do to clarify the ethics of social-media research. Besides the opt-in panels that allow participants to agree to being studied, which would certainly work in many cases, corporations could employ external review boards, which some already do. We can advocate for journals and proceedings to require something deeper than pro forma statements that authors have met the standard of IRB review, and we can work with our own IRBs to reconsider how we use secondary data.
Some companies are experimenting with user-rights panels, composed of nonemployees who use their services. Facebook has already instituted many changes over the past several years to provide more oversight of the research that is done there. In my own research, several people from different parts of Facebook review new research protocols and provide feedback, independent of my own university’s IRB processes. Often, lawyers and code experts in industry can provide more-expert reviews than university IRBs can.
That said, I think the latest controversy will have a chilling effect not just at Facebook but also at other companies watching this saga unfold. That has happened in the past. In 2006, AOL had a privacy breach related to data shared with researchers. The breach was used as a reason for multiple companies to stop sharing data with academic researchers.
From Facebook’s perspective, the attacks from some academics have been particularly disturbing. Part of that is culture clash. Academics thrive on, and are oriented to, critique. We are independent operators who delve into conflict as a way of shedding light on issues. Corporate environments are based much more on consensus and collaboration; some groups in corporations can see the culture of critique as destructive rather than constructive.
I hope that Facebook responds to this controversy by continuing to reflect on its research practices, and by continuing its commitment to public science. If Facebook and other companies abandon scholarly publishing, we will have lost a remarkable opportunity to advance social science. Rather than moving the debate on ethics forward, we will have shut it down.
Author Bio: Clifford Lampe is an associate professor of information at the University of Michigan at Ann Arbor.