Harvard Scholar Who Studies Honesty Is Accused of Fabricating Findings
(www.nytimes.com)
The high rate of failure to replicate is not, in and of itself, evidence of fraud. It's primarily a problem of low statistical power to detect plausible effects (i.e. small sample sizes). That's not to say there isn't deliberate fraud or p-hacking going on; there's far too much of it. But the so-called replication crisis was entirely predictable without needing to assume any wrongdoing. It happened primarily because most researchers don't fully understand the statistics they are using.
There was a good paper published on this recently: Understanding the Replication Crisis as a Base Rate Fallacy
And this is a nice simple explanation of the base rate fallacy for anyone who can't access the paper: The p value and the base rate fallacy
tl;dr: p < 0.05 does not mean what most researchers think it means. It is the probability of seeing data this extreme if there were no real effect, not the probability that the effect is real.
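To make the base rate argument concrete, here is a minimal sketch of the arithmetic. The numbers (10% of tested hypotheses true, 35% power) are illustrative assumptions, not figures from the linked paper:

```python
# Base rate fallacy sketch: even with no fraud, many "significant"
# (p < 0.05) results are false positives when true effects are rare
# and studies are underpowered.

def false_discovery_rate(base_rate, power, alpha=0.05):
    """Fraction of p < alpha results that are false positives.

    base_rate: prior probability a tested hypothesis is actually true
    power:     probability of detecting a real effect (1 - beta)
    alpha:     significance threshold
    """
    true_positives = base_rate * power          # real effects, detected
    false_positives = (1 - base_rate) * alpha   # null effects, "significant" by chance
    return false_positives / (true_positives + false_positives)

# With a 10% base rate and 35% power, over half of all
# "significant" findings are false positives:
fdr = false_discovery_rate(base_rate=0.10, power=0.35)
print(f"{fdr:.0%} of significant results are false positives")
```

So a field full of honest but underpowered studies will still fail to replicate a large fraction of its published findings, which is the point the paper makes.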
The Harvard scholar is being accused of deliberately fabricating study results by changing data in a spreadsheet in at least one of the studies.
I think the other commenter mentioned lack of replicability because that's often one of the first indications that the original research results were fraudulent. Inability to reproduce will cause people to go digging through the original data, which is how this stuff gets found in many cases.