Ep 165 (video): Psychological Research Under Fire: What Can We Do About It?

Michael | Critical Thinking, Research and Stats | 7 Comments

What’s going on with psychology? There have been a number of reports about poorly conducted or completely fraudulent research in the field, such as this one that appeared in the New York Times. There is bad research in every field, but psychology, which throughout its history has struggled for scientific credibility, is particularly sensitive to this issue, and many psychologists have come out with strong recommendations to ensure that our research is of the highest quality.


In this episode I look at how research can be conducted poorly and what to watch out for when you either conduct or read about the results of research.

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366.

Simmons et al.’s Requirements for Authors

  1. Authors must decide the rule for terminating data collection before data collection begins and report this rule in the article.
  2. Authors must collect at least 20 observations per cell or else provide a compelling cost-of-data-collection justification.
  3. Authors must list all variables collected in a study.
  4. Authors must report all experimental conditions, including failed manipulations.
  5. If observations are eliminated, authors must also report what the statistical results are if those observations are included.
  6. If an analysis includes a covariate, authors must report the statistical results of the analysis without the covariate.
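Requirement 1 targets "optional stopping": peeking at the data and halting collection as soon as a test comes out significant. A minimal simulation (not from the episode or the paper; variable names and the simplifying choice of a z-test with known unit variance are my own assumptions) shows how this inflates the false-positive rate even when there is no real effect:

```python
import math
import random

def two_sample_z_p(xs, ys):
    """Two-sided p-value for a difference in means, assuming known unit variance."""
    n1, n2 = len(xs), len(ys)
    z = (sum(xs) / n1 - sum(ys) / n2) / math.sqrt(1 / n1 + 1 / n2)
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run(n_sims=2000, seed=1):
    """Compare a fixed-n test against optional stopping under the null hypothesis."""
    rng = random.Random(seed)
    fixed_fp = optional_fp = 0
    for _ in range(n_sims):
        xs, ys = [], []          # both groups drawn from the same distribution
        stopped = False
        for n in range(10, 51, 5):   # peek at n = 10, 15, ..., 50 per group
            while len(xs) < n:
                xs.append(rng.gauss(0, 1))
                ys.append(rng.gauss(0, 1))
            if not stopped and two_sample_z_p(xs, ys) < 0.05:
                optional_fp += 1     # "significant" result found by peeking
                stopped = True
        if two_sample_z_p(xs, ys) < 0.05:
            fixed_fp += 1            # single test at the pre-decided n = 50
    return fixed_fp / n_sims, optional_fp / n_sims

if __name__ == "__main__":
    fixed, optional = run()
    print(f"fixed-n false-positive rate:        {fixed:.3f}")
    print(f"optional-stopping false-positive rate: {optional:.3f}")
```

The fixed-n strategy stays near the nominal 5%, while repeatedly testing and stopping at the first significant peek roughly triples the false-positive rate, which is why the rule must be set, and disclosed, before collection begins.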

How scientists fool themselves – and how they can stop

Psychology Rife with Inaccurate Research Findings

www.psychologytoday.com, November 16, 2011

The case of a Dutch psychologist who fabricated experiments out of whole cloth for at least a decade is shining a spotlight on systemic flaws in the reporting of psychological research.

Diederik Stapel, a well-known and widely published psychologist in the Netherlands, routinely falsified data and made up entire experiments, according to an investigative committee.

Fraud Scandal Fuels Debate Over Practices of Social Psychology



7 Comments on “Ep 165 (video): Psychological Research Under Fire: What Can We Do About It?”

  1. I wonder if I’ve understood you correctly. Maybe you could help explain it through an experiment example?

    Say that my name is B. F. Skinner and during my reinforcement studies I’ve just figured out that if I reinforce my pigeons on a variable-interval schedule, they respond at a steadier rate and are highly resistant to extinction. This finding would mean I had to alter my initial conditions in order to study the phenomenon more extensively. How would I proceed to make sure my experiments are done correctly?

  2. André, I can’t really follow what you’re asking. First of all, Skinner didn’t use a standard method with a specific number of subjects, different conditions and statistical analyses between them in a rigorous way. He rather just worked with his animals and basically tried to train them, then reported on this.

    Anyhow, could you maybe elaborate a little or choose a different example?

  3. The elimination of observations is why it took so long to discover the hole in the ozone layer. When the British Antarctic Survey discovered the hole, NASA argued against it, saying the hole had not been detected by their satellites. When NASA re-checked the raw data, they realised that the program had been set to reject any readings outside a certain range, and these were precisely the readings that would indicate a hole. When the restriction was removed, the data agreed with what the British scientists had found.

  4. What are you arguing for here? There is nothing wrong with data mining on your own and then doing research to confirm suspicions. The problem is if this data mining is used as proof of something itself.

  5. AJ: I agree. “Data mining” is one way to discover relationships we might not have thought of. The problem comes when researchers build a publication around this discovery and don’t make it clear that the finding was indeed the result of this “mining” effort.

  6. v good. Raises some important points that are not well known to the public, either in psych and certainly not in stats. There is a fine line between “fishing” for a result and trying to squeeze out what’s important, and it happens too often that the former is the case, for exactly the reason you state about publishing. Maybe it’s my naivety, as I’m not “in the trenches”.

  7. It was a DUTCH psychologist who fabricated evidence, not a DANISH psychologist. And yes, I’m from Denmark 🙁
