Robbie van Aert Wins a Veni Grant for his Research on Meta-Analysis, Preregistration, and Replication

We are very happy to announce that Robbie van Aert was awarded an NWO Veni grant (€280,000) for his research entitled “Empowering meta-analysis by taking advantage of preregistered and replication studies”.


Below you can find a short description of the proposed research:


An important threat to the validity of meta-analyses is publication bias. Replication and preregistered studies are deemed less susceptible to publication bias. I will develop a novel meta-analysis methodology that optimally synthesizes conventional with replication/preregistered studies and corrects for publication bias. This new methodology yields more accurate conclusions in meta-analyses.

Young eScientist Award to Improve "statcheck"

The Netherlands eScience Center awarded our team member Michèle Nuijten and our colleague from social psychology Willem Sleegers the Young eScientist Award for their proposal to improve statcheck. The prize consists of €50,000 worth of expertise from the eScience Center, which will be used to expand statcheck’s search algorithm with more advanced techniques such as Natural Language Processing.

‘We are […] very excited to collaborate with the eScience Center to improve statcheck’s searching algorithm to make it ‘smarter’ in recognizing statistical results so that it can also spot errors in other scientific fields. We are confident that by collaborating with the eScience Center, we can expand statcheck to improve scientific quality on an even larger scale.’
— Michèle & Willem

NWO Veni Grant for the 4-Step Robustness Check

We are happy to announce that a €250,000 NWO Veni Grant was awarded to Michèle Nuijten for her proposed 4-Step Robustness Check.

She describes the project on her website:

To check the robustness of a study, you could replicate it in a new sample. However, in my 4-Step Robustness Check, you first verify whether the reported numbers in the original study are correct. If they’re not, they are not interpretable and you can’t compare them to the results of your replication.

Specifically, I advise researchers to do the following:

  1. Check if there are visible errors in the reported numbers, for example by running a paper through my spellchecker for statistics: statcheck

  2. Reanalyze the data following the original strategy to see if this leads to the same numbers

  3. Check if the result is robust to alternative analytical choices

  4. Perform a replication study in a new sample
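Step 1, checking the internal consistency of reported statistics, is what statcheck automates. statcheck itself is an R package; the sketch below is a minimal, illustrative Python analogue that handles only z-tests, and the example sentences are hypothetical:

```python
# Illustrative sketch of statcheck-style internal consistency checking.
# statcheck is an R package; this minimal Python analogue covers only
# reported z-tests, and the example strings are hypothetical.
import math
import re

# Matches reported results of the form "z = 2.05, p = .04"
PATTERN = re.compile(r"z\s*=\s*(-?\d+\.?\d*)\s*,\s*p\s*=\s*(\.\d+)")

def check_consistency(text):
    """Extract (z, p) pairs and flag reported p-values that do not
    match the p-value recomputed from the test statistic."""
    results = []
    for match in PATTERN.finditer(text):
        z = float(match.group(1))
        reported_p = float(match.group(2))
        # Two-tailed p-value from the standard normal distribution.
        computed_p = math.erfc(abs(z) / math.sqrt(2))
        decimals = len(match.group(2)) - 1  # digits after the decimal point
        tolerance = 0.5 * 10 ** -decimals   # allow for rounding in the report
        consistent = abs(computed_p - reported_p) <= tolerance
        results.append({"z": z, "reported_p": reported_p,
                        "computed_p": round(computed_p, 5),
                        "consistent": consistent})
    return results

print(check_consistency("The effect was significant, z = 2.05, p = .04."))
print(check_consistency("The effect was significant, z = 2.05, p = .02."))
```

For z = 2.05 the recomputed two-tailed p is about .040, so a reported p = .04 is flagged as consistent while p = .02 is not. The real statcheck covers more test types (t, F, χ², r) and accounts for rounding and one-tailed reporting in a more principled way.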

The 4-Step Robustness Check can be used to efficiently assess robustness of results


This 4-step check provides an efficient framework for assessing whether a study’s findings are robust. Note that the first steps take far less time than a full replication and may already be enough to conclude that a result is not robust.

The proposed framework can also be used as an efficient checklist for researchers to improve the robustness of their own results:

  1. Check the internal consistency of your reported results

  2. Share your data and analysis scripts to facilitate reanalysis

  3. Conduct and report your own sensitivity analyses

  4. Write detailed methods sections and share materials to facilitate replication

Ultimately, I aim to create interactive, pragmatic, and evidence-based methods to improve and assess robustness, applicable to psychology and other fields.

I would like to wholeheartedly thank my colleagues, reviewers, and committee members for their time, feedback, and valuable insights. I’m looking forward to the next three years!

Awarded a Campbell Methods Grant


We are honored to announce that Michèle Nuijten was awarded a $20,000 methods grant from the Campbell Collaboration, together with meta-analysis expert Joshua R. Polanin. They were awarded the grant for the project “Verifying the Accuracy of Statistical Significance Testing in Campbell Collaboration Systematic Reviews Through the Use of the R Package statcheck”. The grant is part of the Campbell Collaboration’s program to support innovative methods development in order to improve the quality of systematic reviews. For more information about the grant and the three other recipients, see their website here.

ERC Consolidator Grant for Jelte Wicherts


Jelte Wicherts has been awarded a prestigious €2 million Consolidator Grant from the European Research Council (ERC). The grant will be used to expand the meta-research group with two postdocs and two PhD students.

The project, entitled “IMPROVE: Innovative Methods for Psychology: Reproducible, Open, Valid, and Efficient”, will start in the second half of 2017.

Leamer-Rosenthal Prize for statcheck

Michèle Nuijten and Sacha Epskamp are two of the nine winners of the 2016 Leamer-Rosenthal Prize for Open Social Science for their work on statcheck. The prize is an initiative of the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and comes with $10,000. They will receive the award at the 2016 BITSS annual meeting, along with seven other researchers and educators.

Read more here.