Bias in the evidence base

From The British Psychological Society’s Research Digest:

In the last few years the social sciences, including psychology, have been taking a good look at themselves. While incidences of fraud hit the headlines, pervasive issues are just as important to address, such as publication bias, the phenomenon where non-significant results never see the light of day thanks to editors rejecting them or savvy researchers recasting their experiments around unexpected results and not reporting the disappointments. Statistical research has shown the extent of this misrepresentation in pockets of social science, such as specific journals, but a new meta-analysis suggests that the problem may infect the entire discipline of psychology.

A team of psychologists based in Salzburg looked at “effect sizes”, which provide a measure of how much experimental variables actually change an outcome. The researchers randomly sampled the PsycINFO database to collect 1000 psychology articles across the discipline published in 2007, and then winnowed the list down to 395 by focusing only on those that used quantitative data to test hypotheses. For each main finding, the researchers extracted or calculated the effect size.
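
For readers who want the nuts and bolts: one common effect-size measure is the standardized mean difference, Cohen's d, which divides the difference between two group means by their pooled standard deviation. The Python sketch below is purely illustrative (the data are made up and are not from the sampled papers), but it shows the kind of quantity the Salzburg team were extracting or recomputing for each finding.

```python
import numpy as np

def cohens_d(treatment, control):
    """Standardized mean difference (Cohen's d), one common effect-size measure."""
    nt, nc = len(treatment), len(control)
    # Pooled variance across the two groups, weighted by degrees of freedom
    pooled_var = ((nt - 1) * np.var(treatment, ddof=1) +
                  (nc - 1) * np.var(control, ddof=1)) / (nt + nc - 2)
    return (np.mean(treatment) - np.mean(control)) / np.sqrt(pooled_var)

# Illustrative, made-up data: the metric itself, not these numbers, is the point.
rng = np.random.default_rng(0)
control = rng.normal(loc=100, scale=15, size=40)
treatment = rng.normal(loc=107, scale=15, size=40)
print(f"Cohen's d = {cohens_d(treatment, control):.2f}")
```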

. . .

The authors, led by Anton Kühberger, argue that the literature is thin on modest effect sizes thanks to the non-publication of non-significant findings (rejection by journals would be especially plausible for non-significant smaller studies), and the over-representation of spurious large effects, due to researchers retrospectively constructing their papers around surprising effects that were only stumbled across thanks to inventive statistical methods.

Read the rest here.
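
Kühberger and colleagues' point about the missing modest effects can be made concrete with a toy simulation (my own illustration, not the authors' analysis): if many small studies of a genuinely small effect are run but only the statistically significant ones reach print, the published effect sizes will be systematically inflated, because a small study has to overshoot the true effect to clear the significance threshold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_d, n_per_group, n_studies = 0.2, 20, 5000  # small true effect, small studies

published, all_effects = [], []
for _ in range(n_studies):
    control = rng.normal(0.0, 1.0, n_per_group)
    treatment = rng.normal(true_d, 1.0, n_per_group)
    t, p = stats.ttest_ind(treatment, control)
    # Observed standardized mean difference for this simulated study
    pooled_sd = np.sqrt((np.var(treatment, ddof=1) + np.var(control, ddof=1)) / 2)
    d = (treatment.mean() - control.mean()) / pooled_sd
    all_effects.append(d)
    if p < 0.05:  # only "significant" studies get published
        published.append(d)

print(f"True effect:                 d = {true_d}")
print(f"Mean over all studies:       d = {np.mean(all_effects):.2f}")
print(f"Mean over 'published' only:  d = {np.mean(published):.2f}")  # inflated
```

Run as written, the mean effect across all simulated studies sits near the true value of 0.2, while the mean across the "published" subset comes out substantially larger.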

2 thoughts on “Bias in the evidence base”

  1. It’s about time psychological research, which might be the area of science that affects our lives most, tried to meet the basic standards of evidence expected of a third-year undergraduate in any other social science.


  2. Intriguing. Even the term “evidence-based” is slippery. It doesn’t mean proven or verified; it means there is evidence that supports the approach. Not much of a statement. Science demands study replication by disinterested third parties.

