This file collects information related to the reproducibility and replicability of research outcomes. Since the two concepts are often mixed up, here are brief definitions:
- reproducibility means retracing a particular study: obtaining the same outcomes from the same data and methodology, whereas
- replicability means independently verifying research outcomes using a different setup.
- Analyze all Jupyter notebooks mentioned in PubMed Central
- has numerous links to further materials on the matter
- Reproducibility crisis: Blame it on the antibodies
- TOP guidelines
- Data Carpentry's Reproducible Research Curriculum
- NIH plans to enhance reproducibility
- "Ensure rigor and reproducibility" is part of the Framework for the NIH Strategic Plan
- Rigor and Reproducibility
- related page
- blog post: Enhancing Reproducibility in NIH-supported Research through Rigor and Transparency
- NIH notices
- Enhancing Reproducibility through Rigor and Transparency
- Consideration of Sex as a Biological Variable in NIH-funded Research
- Offline: What is medicine's 5 sigma?
- quote: "Can bad scientific practices be fixed? Part of the problem is that no-one is incentivised to be right. Instead, scientists are incentivised to be productive and innovative. Would a Hippocratic Oath for science help? Certainly don’t add more layers of research red-tape. Instead of changing incentives, perhaps one could remove incentives altogether. Or insist on replicability statements in grant applications and research papers. Or emphasise collaboration, not competition. Or insist on preregistration of protocols. Or reward better pre and post publication peer review. Or improve research training and mentorship. Or ..."
- Research Compendia
- The Recomputation Manifesto
- Virtual machines considered harmful for reproducibility
- Replicated Computational Results
- Experimental Reproducibility Has Always Been Hard But Cooperation Could Make It Easier
- quote from page 2: "Another useful shift would be for the scientific community, and especially funders, to recognize replication as an essential step of the knowledge-building process, and to provide both incentives and support for efforts to reproduce published results. As long as “important” scientific work is viewed narrowly — be the first across the finish line to a brand new discovery — efforts to reproduce already-reported findings will be viewed as lower priority. If it can be seen as the valuable intellectual labor it is (because, given all the ways it can go wrong, replication is hard), more scientists may see it as labor worthy of their time, effort, and creativity."
- What about a funding call specific to reproducibility?
- either by identifying a reproducibility partner per study or by twinning labs to reproduce each other's results
- Journal of Visualized Experiments as an approach to reproducibility
- In science, irreproducible research is a quiet crisis
- Studies show many studies are false
- On radical manuscript openness
- How can we perpetuate reproducibility?
- NWO to start a pilot project that will fund replication research
- Trials and errors: The evidence base for new medicines is flawed. Time to fix it
- The first imperative: Science that isn’t transparent isn’t science
- Self-correction in science at work
- Developer of Robot Scientist Wants to Standardize Science
- Registered clinical trials make positive findings vanish
- To understand the replication crisis, imagine a world in which everything was published
- quotes earlier post: "It’s a strange view of science in which a few referee reports is enough to put something into a default-believe-it mode, but a failed replication doesn’t count for anything."
- From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics
- Irreproducibility: A $28B/Year Problem with some Tangible Solutions
- Robust research: Institutions must do their part for reproducibility
- Metascience could rescue the ‘replication crisis’
- How scientists fool themselves – and how they can stop
- How open data can improve medicine
- The natural selection of bad science
- Reproducible Research: Citing your execution environment using Docker and a DOI
- Tools for reproducibility
- Veritasium: Is Most Published Research Wrong?
- Five selfish reasons to work reproducibly
- Reproducible practices are the future for early career researchers
- Reproducibility and reliability of biomedical research
- Which mistakes do we actually make in scientific code?
- Critical Assessment of protein Structure Prediction (CASP)
- infrastructure around making protein structure prediction reproducible
- Reproducible and replicable CFD: it's harder than you think
- Quantifying Reproducibility in Computational Biology: The Case of the Tuberculosis Drugome
- Social Media and Suicide: A Critical Appraisal
- comments on a reanalysis of Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time based on a different dataset
- Avoidance of sharing code was at the core of a long debate around metastability of supercooled water (seen here)
- Team of researchers challenge bold astronomical prediction
- essentially about wrong information in a source paper leading to bold claims in a paper that could be rejected by checking the information in the source papers
- Taking Reproducible Research in Robotics to the Mainstream: Building on the IEEE RAM R-Articles
- Commitment to Research Transparency
- Peer Reviewers’ Openness Initiative
- LIGO resources on analyzing gravitational wave data
- LIGO Binder of initial gravitational wave detection
- Public Gravitational Wave Alerts
- quote: "LIGO and Virgo will publicly announce candidate gravitational wave triggers with a high likelihood of astrophysical origin within minutes of the waves arriving in the three detectors (LIGO Hanford, LIGO Livingston, and Virgo)"
- Gravitational Wave Open Science Center
- LIGO Data Management Plan
- Data
- Data analysis guide
- BIG Data, BIG responsibility (reproducible paper template for research data management)
- "Michel Lalande wirkte bei der Vorbereitung eines Sternkatalogs mit und beobachtete Neptun jeweils am 8. und 10. Mai 1795. Er hielt den Leuchtpunkt für einen Stern und trug ihn zunächst in eine Karte ein. Zwei Tage später korrigierte er die Position, da er sich über den Eintrag nicht mehr sicher war. Dadurch nahm er sich die Möglichkeit, diese Positionsänderung als Zeichen einer Planetenbewegung zu erkennen, so dass ihm die Entdeckung entging." — if only he had known about version control