Practical Tools and Strategies for Researchers to Increase Replicability

This publication provides an overview of some practical tools and strategies that researchers can implement in their own workflow to increase replicability and the overall quality of psychology research.


Are Open Data Actually Reusable?

Many efforts are underway to promote data sharing in psychology; however, it is currently unclear whether the in-principle benefits of data availability are being realized in practice. In a recent study, we found that a mandatory open data policy introduced at the journal Cognition led to a substantial increase in available data, but a considerable portion of those data were not reusable. For data to be reusable, they need to be clearly structured and well documented. Open data alone will not be enough to achieve the benefits envisioned by proponents of data sharing.

Where Do the Numbers Published in Scientific Articles Come From?

Attempts to reproduce values reported in 35 articles published in the journal Cognition revealed analysis pipelines peppered with errors. The article outlines elements of a reproducible workflow that may help mitigate these problems in future research.

Data Sharing in PLOS ONE: An Analysis of Data Availability Statements

Only about 20% of statements indicate that data are deposited in a repository, which the PLOS policy states is the preferred method. More commonly, authors state that their data are in the paper itself or in the supplemental information, though it is unclear whether these data meet the level of sharing required in the PLOS policy.

Plan to Replicate 50 High-Impact Cancer Papers Shrinks to Just 18

An ambitious project that set out nearly 5 years ago to replicate experiments from 50 high-impact cancer biology papers has gradually scaled back its plans and now expects to complete just 18 studies.

Philip Zimbardo Defends the Stanford Prison Experiment, His Most Famous Work

What’s the scientific value of the Stanford Prison Experiment? Zimbardo responds to the new allegations against his work.

Can Auditing Scientific Research Help Fix Its Reproducibility Crisis?

New research predicts that audits would reduce the number of false positive results from 30.2 per 100 papers to 12.3 per 100.

Before Reproducibility Must Come Preproducibility

Most papers fail to report many aspects of the experiment and analysis that cannot safely be omitted: details crucial to understanding the result and its limitations, and to repeating the work. Instead of arguing about whether results hold up, we should strive to provide enough information for others to repeat the experiments.

Failures Are Essential to Scientific Inquiry

Reproducibility failures occur even in fields such as mathematics or computer science that do not have statistical problems or issues with experimental design. Suggested policy changes ignore a core feature of the process of scientific inquiry that occurs after reproducibility failures: the integration of conflicting observations and ideas into a coherent theory.


A Remedy for Broken Science, or an Attempt to Undercut It?

Reproducibility issues pose serious challenges for scientific communities. But what happens when those issues get picked up by political activists? A report from the National Association of Scholars takes on the reproducibility crisis in science. Not everyone views the group’s motives as pure.

Science's 'Irreproducibility Crisis' Is a Public Policy Crisis Too

Congress will have to pay for some steps to ensure greater reproducibility in the sciences. In the end, those steps will save the enormous amounts now spent building blind alleys and mirages. What's needed are standardized descriptions of scientific materials and procedures, standardized statistics programs, and standardized archival formats.

The Irreproducibility Crisis of Modern Science: Causes, Consequences, and the Road to Reform

This study by the National Association of Scholars examines different aspects of the reproducibility crisis in modern science and offers a series of policy recommendations, both scientific and political, for alleviating it.

The Oxford Reproducibility School

A series of talks on robust research practices in psychology and the biomedical sciences, held in Oxford in 2017. Organized by Dorothy Bishop, Ana Todorovic, Caroline Nettekoven and Verena Heise.

An Empirical Analysis of Journal Policy Effectiveness for Computational Reproducibility

New guidelines at many journals requiring authors to provide data and code post-publication upon request are found to be an improvement over no policy, but currently insufficient for reproducibility.

Online Forums Give Investors an Early Warning of Shady Scientific Findings

Scientists around the globe now regularly take to the internet to scrutinize research after it has been published, including by running their own analyses of the data to spot mistakes or fraud.