Open Science

Updated: Apr 23, 2019

Reproducibility, Replicability & Reliability made easy (hopefully!)


After attending a series of workshops at the British Neuroscience Association conference, I realised how little I really knew about Open Science. I have decided to curate a list of resources to help me (and hopefully others) learn more about how to make our research more open. This list is a work in progress (and by no means exhaustive), and suggestions for resources to add are more than welcome (email me or tweet @AleLautarescu).

Why open science?


A good starting point is Munafò et al. (2017), "A manifesto for reproducible science": https://www.nature.com/articles/s41562-016-0021.pdf

"Open science refers to the process of making the content and process of producing evidence and claims transparent and accessible to others."


Threats to reproducible science (Munafò et al., 2017)



What can you do?


It is important to understand that open science is not an all-or-nothing approach. You can start by implementing only one or two of the suggestions below.


You can join your local UK Reproducibility Network hub. A list of hubs can be found at https://www.bristol.ac.uk/psychology/research/ukrn/networks/


Planning your research project

  • Conduct a systematic review when formulating your hypotheses. Systematic reviews can reveal uncited studies in a way that reading reference lists may not.

  • Conduct a sample size estimation (low statistical power increases the likelihood of obtaining both false-positive and false-negative results!). You can also find and reuse existing datasets (https://toolbox.google.com/datasetsearch).

  • Preregister your study design, primary outcome, and analysis plan. You can preregister on the Open Science Framework (http://osf.io/) or AsPredicted.org. Journals have also started adopting Registered Reports (i.e. peer review before the results are known). This helps eliminate the bias against negative results, and prevents HARKing (i.e. hypothesising after results are known) and p-hacking (i.e. making analysis decisions with knowledge of the observed data). You can find a list of journals accepting Registered Reports at https://cos.io/rr/. Lastly, there are other options for publishing your peer-reviewed protocol (such as https://bio-protocol.org/default.aspx).
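To make the sample size estimation step concrete, here is a minimal sketch for a two-group comparison. It uses Python's statsmodels library, which the post does not mention, and hypothetical input values: you supply the expected effect size, the alpha level, and the desired power, then solve for the number of participants per group.

```python
# A minimal sketch of an a priori sample size estimation for an
# independent-samples t-test. The library (statsmodels) and all input
# values are illustrative assumptions, not taken from the post.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Hypothetical inputs: medium effect size (Cohen's d = 0.5),
# alpha = .05, desired power = .80, two-sided test
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                   alternative="two-sided")
print(f"Participants needed per group: about {n_per_group:.0f}")
```

If you work in R instead, the pwr package (e.g. pwr.t.test) offers the equivalent calculation.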


Infographic from osf.io


Analysing your data

  • Attend training courses in methodology and statistics (this will help you understand what p-values actually mean, the importance of statistical power, effect sizes, etc.). There are some good free online courses on Coursera (such as https://www.coursera.org/learn/statistical-inferences), as well as other tools you can use, such as statcheck, which checks for errors in statistical reporting (http://statcheck.io/).

  • Use open-source software (such as R) for data analysis. Because the analysis takes the form of scripts, you automatically have a full, reproducible record of your analysis path (which you can then share). For guidance on making your analyses reproducible with R, see the online courses on Open Science & Reproducibility (https://github.com/cbahlai/OSRR_course) and rmarkdown (https://github.com/libscie/rmarkdown-workshop).

  • Make sure you are explicit about which analyses are hypothesis-based and which are exploratory (preregistering helps with this).
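To illustrate why scripted analyses are a reproducible record, here is a minimal sketch (in Python rather than R, with simulated data; every name and number is an illustrative assumption). Because each step, from the random seed to the effect-size formula, is written down, rerunning the script reproduces the exact same result.

```python
# A minimal, illustrative sketch of a fully scripted analysis: simulated
# data, a fixed seed, and an effect size computed step by step.
import math
import random
import statistics

random.seed(42)  # fixing the seed makes the script rerun identically

# Simulated scores for two hypothetical groups (n = 30 each)
group_a = [random.gauss(100, 15) for _ in range(30)]
group_b = [random.gauss(108, 15) for _ in range(30)]

# Cohen's d with a pooled standard deviation, written out explicitly
mean_diff = statistics.mean(group_b) - statistics.mean(group_a)
pooled_sd = math.sqrt((statistics.variance(group_a)
                       + statistics.variance(group_b)) / 2)
cohens_d = mean_diff / pooled_sd
print(f"Cohen's d: {cohens_d:.2f}")
```

Sharing this script alongside the data means anyone can rerun the entire analysis path and obtain the identical number, which is the point the bullet above makes about R scripts.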


Publishing / Sharing your data



Additional resources:



Copyright:

Image by https://commons.wikimedia.org/wiki/File:Knowledge-sharing.jpg








CONTACT:

Email                        alexandra.lautarescu@kcl.ac.uk

Twitter                     @AleLautarescu 

Correspondence      Centre for the Developing Brain 

                                  King's College London 

                                  1st floor, South Wing, St Thomas' Hospital 

                                  London, SE1 7EH
