Voluntary commitment to transparency, openness,
and reproducibility in scientific research

Rationale

Transparency, openness, and reproducibility are fundamental scientific values that most researchers subscribe to. Nevertheless, these values are not always reflected in current scientific practice.

There is a lack of transparency:

  • the publication record is biased towards positive and surprising findings, whereas null findings and direct replications are almost absent;
  • questionable research practices, such as p-hacking and hypothesizing-after-results-are-known, are prevalent;
  • important study information, such as who contributed in which roles, how sample size was determined, and the full list of measures that were collected, often remains undisclosed.

There is a lack of openness:

  • methods and data are largely unavailable for reanalysis, replication, or reuse;
  • many papers are inaccessible because they are locked behind publishers' paywalls.

There is a lack of reproducibility:

  • too many careless errors slip into scientific articles;
  • best practices in reporting methods and results are not followed, and important details are missing;
  • backup and version control systems are not consistently used, limited or no metadata is created, and there is no long-term preservation strategy.

This needs to change. We strongly believe that transparency, openness, and reproducibility are key ingredients of good science and that following these principles in our daily work increases the impact, information value, and credibility of our research.

Here, we express a voluntary commitment about how we do research. The guidelines and checklists below apply to all empirical research projects that we lead or manage. They were inspired by a set of resources listed at the end of this document.

Note that there can be justified exceptions to every guideline. However, whenever we deviate from these guidelines, we will document an explicit, written justification in a public place (e.g. the preprint/publication, the README file enclosed with the shared data, or the lab website). Also, this is a living document, which means we will refine it as we use it.

As signatories, we commit to following these guidelines from the day of signature onward.

 

Guidelines

Transparency:
Reflecting our commitment to transparency, we:

  • specify contributor roles at the beginning of the project and update them if necessary
  • register every aspect of a project we want to get credit for before data collection starts and provide a link to the preregistration in the manuscript
  • include the "21-word solution" in the manuscript, describing how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study (* make available all results, including null findings, through published journal article or through public repositories, or both)

Openness:
Reflecting our commitment to openness, we:

  • deposit all source code underlying our publications in a public repository
  • share the data underlying the main findings, and the raw data whenever possible, through a public repository (see the sketch after this list)
  • provide open access to our publications, either through self-archiving (green open access) or through open access publishing (gold open access)
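One lightweight way to support reuse of shared data is to deposit a checksum manifest alongside the files, so that anyone who downloads them can verify that their copies are intact. The sketch below is a minimal illustration in Python; the shared_data/ directory and the MANIFEST.txt file name are placeholders for this example, not part of the commitment.

    # Sketch: write a SHA-256 manifest for the files we deposit in a public
    # repository. "shared_data" and "MANIFEST.txt" are illustrative names.
    import hashlib
    from pathlib import Path

    def sha256sum(path: Path, chunk_size: int = 1 << 20) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def write_manifest(data_dir: str = "shared_data", out_file: str = "MANIFEST.txt") -> None:
        """Write one '<digest>  <relative path>' line per file in data_dir."""
        root = Path(data_dir)
        lines = [
            f"{sha256sum(p)}  {p.relative_to(root)}"
            for p in sorted(root.rglob("*"))
            if p.is_file()
        ]
        Path(out_file).write_text("\n".join(lines) + "\n")

    if __name__ == "__main__":
        write_manifest()

The resulting manifest can be uploaded together with the data and mentioned in the accompanying README.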

Reproducibility:
Reflecting our commitment to reproducibility, we:

  • backup and keep all human-generated research output (e.g. code, graphics, manuscripts) under version control
  • automate as many data analysis steps as possible (see the sketch after this list)
  • encourage code reviews
  • follow best practices in coding, project organization, and study reporting
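To make the guideline on automation concrete, here is a minimal sketch of a one-command analysis step in Python: raw data in, derived results out, nothing edited by hand in between. The file names (data/raw/trials.csv, results/summary.csv) and column names (rt, accuracy) are assumptions made for the illustration only.

    # Sketch of an automated analysis step: every number that ends up in the
    # manuscript is regenerated by rerunning this script on the raw data.
    import csv
    import statistics
    from pathlib import Path

    RAW_FILE = Path("data/raw/trials.csv")   # raw data, never edited by hand
    RESULTS_DIR = Path("results")            # derived output, regenerated on each run

    def load_trials(path):
        """Read the raw trial-level data as a list of dictionaries."""
        with path.open(newline="") as handle:
            return list(csv.DictReader(handle))

    def summarise(trials):
        """Compute illustrative summary statistics for correct trials."""
        rts = [float(row["rt"]) for row in trials if row["accuracy"] == "1"]
        return {
            "n_trials": len(trials),
            "mean_rt_correct": statistics.mean(rts),
            "sd_rt_correct": statistics.stdev(rts),
        }

    def main():
        RESULTS_DIR.mkdir(exist_ok=True)
        summary = summarise(load_trials(RAW_FILE))
        with (RESULTS_DIR / "summary.csv").open("w", newline="") as handle:
            writer = csv.DictWriter(handle, fieldnames=list(summary))
            writer.writeheader()
            writer.writerow(summary)

    if __name__ == "__main__":
        main()

Because the whole path from raw data to reported summary is a script, it can be kept under version control, rerun after corrections, and handed to a colleague for review.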

 

Checklists

To monitor compliance with these guidelines, we use checklists at three stages of the research cycle: before beginning data collection, after completing data analysis, and before submitting the manuscript for publication.

Checklist 1: before beginning data collection

  • Checked the transparency, openness, and reproducibility requirements set by the research institution and funding agency, if any
  • Assigned contributor roles (independent of authorship, using the taxonomy of Brand et al., Learned Publishing, 2015)
  • Registered all relevant study information (e.g. hypotheses, data collection stopping rule, experimental conditions, collected measures, exclusion criteria, analysis plan) in a study plan on an established online platform (e.g. Open Science Framework, AsPredicted); a sketch of a machine-readable study plan follows this checklist
  • Verified that backup and version control systems are in place for all human-generated research output
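One way to keep the registered study information close to the project is to store it in a small machine-readable file under version control, mirroring what is entered on the registration platform. The sketch below uses placeholder values throughout; only the field names suggest a possible structure.

    # Sketch: a machine-readable copy of the registered study plan.
    # All values are placeholders to be replaced with the registered content.
    import json
    from pathlib import Path

    study_plan = {
        "hypotheses": ["H1: <hypothesis as registered>"],
        "stopping_rule": "<data collection stopping rule>",
        "conditions": ["<condition A>", "<condition B>"],
        "measures": ["<primary measure>", "<secondary measure>"],
        "exclusion_criteria": ["<prespecified exclusion criterion>"],
        "analysis_plan": "<planned statistical analysis>",
        "registration_url": "<link to the preregistration>",
    }

    Path("study_plan.json").write_text(json.dumps(study_plan, indent=2) + "\n")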

Checklist 2: after completing data analysis

  • Verified that the study plan has been followed and provided clarifications for any departures
  • Automated the data analysis pipeline
  • Updated contributor roles, if necessary
  • Verified that best practices for coding and project file organization have been followed
  • Invited one or more colleagues to review the code (see the sketch below)
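Before inviting a code review, it can help to confirm that the automated pipeline actually regenerates the committed results from the raw data. The sketch below assumes the pipeline is started with "python run_analysis.py" and writes its output into a results/ directory; both names are placeholders for this illustration.

    # Sketch: rerun the analysis pipeline and check whether the regenerated
    # outputs match the ones already in the repository.
    import hashlib
    import subprocess
    from pathlib import Path

    def digest_dir(directory):
        """Map each file's relative path to its SHA-256 digest."""
        return {
            str(p.relative_to(directory)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(directory.rglob("*"))
            if p.is_file()
        }

    results = Path("results")
    before = digest_dir(results)                                # committed outputs
    subprocess.run(["python", "run_analysis.py"], check=True)   # rerun the pipeline
    after = digest_dir(results)                                 # regenerated outputs

    changed = [name for name in sorted(set(before) | set(after))
               if before.get(name) != after.get(name)]
    print("Reproducible: outputs unchanged." if not changed
          else f"Outputs differ after rerun: {changed}")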

Checklist 3: before submitting the manuscript for publication

  • Included the "21-word solution" in the manuscript
  • Verified that the transparency, openness, and reproducibility requirements set by the research institution, funding agency, and journal, if any, are met
  • Deposited all source code (e.g. for stimulus presentation, modeling, and analysis) and associated documentation and metadata in a public repository
  • Shared the data, figures/tables, and plotting scripts underlying the main findings in a public repository
  • Shared the raw data and metadata through a public repository
  • Self-archived the manuscript at a public repository or preprint server
  • Disclosed all essential study information in the manuscript, including funding, competing interests, contributor roles, and links to preregistration, code, and data
  • Clarified any departures from study preregistration, if any, in the manuscript
  • Reported Methods and Results according to journal requirements and best practices
  • Made all findings not reported in the manuscript available elsewhere

 

Resources

This lab policy and guidelines were heavily influenced by the following documents:

 

These guidelines were written by Bram Zandbelt (2017).