The broken escalator; or, can you ever really retract a paper?


It’s a clear, curious, irresistible finding. In a study published in March of last year in the Journal of Experimental Social Psychology, researchers tracked donations to the Salvation Army from mall shoppers who had just taken the up escalator versus those who had just stepped off the down. They found that more than twice as many of the recently elevated gave money (16 percent compared with 7 percent).

Articles about the study appeared in Scientific American, New Scientist, and multiple other outlets, each with the obligatory escalator stock photo. Even though the finding is fairly recent, it has shown up in several books, including Get Lucky: How to Put Planned Serendipity to Work for You and Your Business and Brainfluence: 100 Ways to Persuade and Convince Consumers With Neuromarketing.

The lead author of the study is Lawrence Sanna, who resigned in May from the University of Michigan at Ann Arbor after another social psychologist, Uri Simonsohn, raised questions about the escalator study and other papers by Sanna. According to Nature, Sanna has requested that the escalator paper and two others be retracted.

But retracting the paper doesn’t mean it will go away. It will probably continue to pop up in Google Scholar searches, just like the papers of the disgraced psychologist Diederik Stapel. Those news articles will still get stumbled on, forwarded, tweeted. Unsuspecting readers will pick up those books and read aloud the passage about the incredible escalator trick to their spouses. There may be Salvation Army bell ringers standing hopefully at the tops of escalators next Christmas season, counting on the magic of science.

Maybe that isn’t a big deal in this case. It was a harmlessly interesting study. But it goes to show how once a study leaves journal-land for the wider world, there’s really no erasing it. A nifty finding takes on a life of its own, even if it’s flawed or fraudulent.

Simonsohn, who is at the University of Pennsylvania, exposed Sanna and another researcher, Dirk Smeesters, of Erasmus University Rotterdam. Smeesters also resigned after statistical irregularities in his papers were identified. I asked Simonsohn a few questions by e-mail about his investigations:

You said, regarding Sanna’s elevation study, that every result was “super-significant” and that the standard deviations seemed too similar to be believable. You seemed to be saying that there were obvious red flags. Is this something that reviewers should have picked up on?

One of the commonalities in these cases is that one researcher controlled the data. You contacted three of Sanna’s grad students and they said they weren’t involved in collecting data. Stapel also kept the data to himself. Is one of the lessons here that co-authors need to demand access to the data?

Simonsohn said that the response he has received to his investigative work, even from colleagues he thought might be critical, has been entirely supportive.

The English professor Margaret Soltan, on her blog University Diaries, has repurposed a term in Simonsohn’s honor to refer to a researcher who gets caught cooking data: Simonized. Let’s hope that catches on.
