The Four-Step Robustness Check: Assessing and Improving Robustness of Psychological Science


Psychological findings underlie important policy decisions concerning, for example, health, well-being, and education. It is therefore imperative that we can trust these findings. However, increasing evidence shows that many published psychological effects might not be robust, but overestimated or even false. Many researchers have suggested replication studies in new samples as a way to assess the robustness of published results, but these come at considerable cost in time and money. Instead, I propose an efficient four-step robustness check that aims to verify reported results before collecting new data. The strategy consists of the following steps:

1. Check the internal consistency of the reported statistical results.
2. Reanalyze the data using the original analytical strategy to see if the reported conclusions hold.
3. Check whether the original result is robust to alternative analyses of the original data.
4. Perform a replication study in a new sample.

The more steps a result passes, the more robust it is. This implies that to assess the robustness of a result, it might initially suffice to check the consistency of the statistics in the paper. The strength of this approach is that earlier steps are easier and cheaper to execute than later ones.

This project has two main objectives. First, we will develop an open-source, interactive protocol to efficiently assess the robustness of a result. We will apply this protocol to a set of studies that have already been replicated, to assess to what extent replication failure is linked to failures in steps 1-3 of the protocol. Second, we will use the protocol to design efficient interventions to improve the robustness of results. Specifically, we will conduct two experiments in collaboration with the publisher PLOS and the journal Cortex. This project could provide an efficient strategy to assess and improve robustness in psychology, and possibly in related fields.
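Step 1 of the check can often be automated: for common test statistics, the p-value can be recomputed from the reported statistic and compared, at the reported rounding precision, to the reported p-value. As a minimal illustrative sketch (not the project's actual protocol), the snippet below does this for a two-sided z test using only the Python standard library; the function names and the rounding rule are assumptions for the example:

```python
import math

def two_sided_p_from_z(z: float) -> float:
    """Two-sided p-value for a standard-normal test statistic z."""
    # P(|Z| >= |z|) = erfc(|z| / sqrt(2)) for Z ~ N(0, 1)
    return math.erfc(abs(z) / math.sqrt(2.0))

def p_is_consistent(z: float, reported_p: float, decimals: int = 3) -> bool:
    """Check whether a reported p-value matches the recomputed one.

    The reported value is treated as consistent if the recomputed
    p-value, rounded to the same number of decimals, agrees with it.
    """
    return round(two_sided_p_from_z(z), decimals) == round(reported_p, decimals)

# Example: z = 1.96 recomputes to p ≈ .050, so a reported "p = .05"
# is internally consistent, while a reported "p = .02" is not.
print(p_is_consistent(1.96, 0.05))  # True
print(p_is_consistent(1.96, 0.02))  # False
```

A full consistency checker would also cover t, F, chi-square, and correlation tests and parse the statistics directly from the text of a paper; this sketch only shows the core recomputation idea.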


  • Dr. Michèle Nuijten

  • Cas Goos (PhD student)