Journals requiring more transparency
A bogus University of Pennsylvania study helped jump-start a movement toward additional data disclosure.
The focus on so-called “p-hacking” and problems with research methodology in psychology, medicine and other sciences has grown significantly in recent years, jump-started, in part, by an intentionally bogus University of Pennsylvania study published three years ago.
“The attention that it’s received has been gigantic,” said Psychological Science editor Eric Eich.
“Not just in psychology by any means, but in cancer research, pharmaceuticals, particle physics, a whole variety of other areas,” Eich said.
We reported on the original study in December 2012; it demonstrated common methodological problems by showing how the researchers arrived at the bogus finding that listening to “When I’m 64” made listeners younger.
The participants who listened to the Beatles song during the experiment happened to be younger than a group that listened to a second song, but the researchers manipulated the study data until the “p-value,” a common measure of statistical significance, suggested the song itself had caused the age difference. (For a fuller explanation of the p-value and how this study worked, go back to our original story.)
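To see the mechanics at work, here is a minimal simulation, written for this story rather than taken from the study itself, of the pattern the researchers demonstrated: two groups with no real difference, several measured variables, and a “researcher” who reports whichever comparison happens to dip below the conventional p < 0.05 threshold. The variable counts and sample sizes are illustrative assumptions.

```python
# A sketch of p-hacking by cherry-picking variables; illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_group, n_variables, n_studies = 20, 5, 10_000

false_positives = 0
for _ in range(n_studies):
    # No true effect: both groups are drawn from the same distribution
    # for every measured variable (age, mother's age, father's age, ...).
    group_a = rng.normal(size=(n_per_group, n_variables))
    group_b = rng.normal(size=(n_per_group, n_variables))
    p_values = [stats.ttest_ind(group_a[:, i], group_b[:, i]).pvalue
                for i in range(n_variables)]
    # The p-hack: count the study a success if ANY variable "works".
    if min(p_values) < 0.05:
        false_positives += 1

# One honest test comes out "significant" about 5% of the time by chance;
# picking the best of five pushes that past 20%.
print(f"False-positive rate: {false_positives / n_studies:.1%}")
```

Run it and roughly one simulated study in five produces a “significant” finding despite there being no effect at all, which is exactly the kind of inflation the disclosure rules described below are meant to expose.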
Now, more journals are adopting data disclosure requirements suggested by the University of Pennsylvania researchers, in an attempt to discourage cherry-picking of “significant” findings.
This January, Psychological Science began requiring authors to provide additional data when submitting articles for peer review, including excluded observations and independent and dependent variables. If the “When I’m 64” study had been subjected to these standards, for example, the fact that study participants heard “Hot Potato” would have been reported, as would questions about their mothers’ ages, fathers’ ages, and whether they would go to an early bird special, all of them discarded variables that helped produce a “significant” finding.
Psychological Science authors are also being asked to explain how they chose their sample size, and are encouraged to pre-register their research methodology, open up data to other researchers, and move away from using the p-value as a measure of significance in favor of so-called “new statistics” that provide a fuller picture of study findings.
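For a sense of what those “new statistics” look like in practice, here is a rough sketch, with made-up numbers and not drawn from the journal’s guidance, that reports an effect size and a confidence interval for a hypothetical song-and-age comparison instead of a bare p-value:

```python
# Reporting an effect size and confidence interval; numbers are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
beatles = rng.normal(loc=20.0, scale=3.0, size=15)  # hypothetical ages
control = rng.normal(loc=21.5, scale=3.0, size=15)

diff = beatles.mean() - control.mean()
pooled_sd = np.sqrt((beatles.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = diff / pooled_sd  # standardized effect size

# 95% confidence interval for the mean difference (Student's t).
se = pooled_sd * np.sqrt(1 / len(beatles) + 1 / len(control))
df = len(beatles) + len(control) - 2
ci = diff + np.array([-1.0, 1.0]) * stats.t.ppf(0.975, df) * se

print(f"Difference: {diff:.2f} years, d = {cohens_d:.2f}, "
      f"95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

The appeal of the interval is that it conveys how large an effect might plausibly be, not just whether it cleared an arbitrary cutoff.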
Eich said article submissions plummeted during the first month his journal required the additional data reporting, but they “zoomed back up,” and this April submissions tied the journal’s highest level ever.
“People say they specifically are sending (their research) to us now because of these changes that were made,” Eich said, because the extra disclosure requirements add credibility.
Elsewhere in academia, the University of Virginia launched the Center for Open Science in the spring of 2013; the center incentivizes efforts to reproduce psychology studies and provides an open platform for sharing research data.
Wharton professor Joe Simmons, an author of the original “When I’m 64” paper, named a handful of other social science journals that have developed reporting checklists based on the recommendations he and his team published, similar to those in Psychological Science. He thinks others will follow suit.
“You don’t want to be known as the false journal, you don’t want to be known as the journal that has lower standards,” Simmons said. “So I think it’s going to keep improving.”
Outside of psychology, the influential journal Nature introduced a data disclosure checklist last April, and this summer Science announced it is adding a layer of statistical checks to its peer review process. The National Science Foundation has also launched a committee on believability and replicability.