Reproducibility & Replicability In Science


Reproducibility & replicability in science
Perspective from a cross-disciplinary journal editor

Véronique Kiermer
Executive Editor, PLOS (Public Library of Science)

NASEM Committee on Reproducibility and Replicability of Science, 1st meeting, December 2017

Agenda
1. Measures of R&R challenges across disciplines
2. Cross-disciplinary approaches by journals:
   1. Editorial policy interventions to improve reporting: two experiments at Nature and PLOS
   2. Open science: the PLOS data availability policy as an example
3. The incentive system as an underlying issue

R&R challenges and measures across disciplines and in multi-disciplinary research
- Documented evidence of R&R challenges is discipline-specific
- Increased interest in R&R in many disciplines
- Normative variations between disciplines
- Challenges of assembling editorial and peer-review expertise for multi-disciplinary studies

Reproducibility studies across the sciences
[Figure: reproducibility studies by field, spanning multi-disciplinary work; social sciences and psychology; biological and biomedical research; clinical medicine; chemistry; physics; and space science]
Goodman, Fanelli, Ioannidis, Sci. Transl. Med. 2016, doi: 10.1126/scitranslmed.aaf5027

Examples of differences that affect the approach to reproducibility in distinct scientific domains:
- Degree of determinism
- Signal-to-measurement-error ratio
- Complexity of designs and measurement tools
- Closeness of fit between hypothesis and experimental design or data
- Statistical or analytic methods to test hypotheses
- Typical heterogeneity of experimental results
- Culture of replication, transparency, and cumulating knowledge
- Statistical criteria for truth claims
- Purposes to which findings will be put and consequences of false conclusions
Goodman, Fanelli, Ioannidis, Sci. Transl. Med. 2016, doi: 10.1126/scitranslmed.aaf5027

Editorial policy interventions: two experiments with ‘checklists’ to improve reporting
- Implementation challenges for journals
- Tension between comprehensive, specialist requirements and non-exhaustive, generalist requirements

Editorial policy interventions to improve reporting of in vivo research
- PLOS ONE: mandate of the ARRIVE checklist
  - Normal editorial process
  - Randomized controlled trial
- Nature journals: mandate of their own reporting checklist, applicable across all life sciences
  - Part of a larger policy change, increased scrutiny
  - Retrospective, controlled cohort study
Impact measured independently by the research teams of Emily Sena and Malcolm Macleod, University of Edinburgh, CAMARADES collaboration.

ARRIVE guidelines
- Developed by the UK NC3Rs
- 20-point checklist specific to in vivo research
Kilkenny et al., PLOS Biology, June 2010, https://doi.org/10.1371/journal.pbio.1000412

Developed following a NINDS stakeholder meeting, June 2012
- 4 critical elements (aka “Landis 4”):
  - Randomization
  - Blinding
  - Sample-size estimation
  - Data handling
Landis et al., Nature, October 2012, doi: 10.1038/nature11556

Nature journals checklist
1. Checklist of reporting standards
   - 18-point checklist across all life sciences
   - Including the “Landis 4”
2. Eliminated length limits for methods sections
3. Increased scrutiny of statistics
4. Re-emphasized data sharing
nature.com/authors/policies/checklist.pdf

PLOS ONE submissions
- n = 845 intervention; n = 844 control
- March 2015 – June 2015
- Authors, academic editors, and peer reviewers were blinded to the study and allocation
Results:
- No published article achieved full compliance
- Significant improvement in reporting of only 2 sub-items
(A minimal sketch of this kind of two-arm randomized allocation appears below.)
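The sketch below illustrates, in Python, the general shape of a two-arm randomized allocation like the one used in this trial; the function name, the fixed seed, and the even split are illustrative assumptions, not the study's actual procedure.

```python
# Hypothetical two-arm randomized allocation (illustrative only; not the
# actual PLOS ONE trial code). The fixed seed just makes the demo repeatable.
import random

def allocate(submission_ids, seed=42):
    """Shuffle submissions and split them into intervention and control arms."""
    rng = random.Random(seed)
    ids = list(submission_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"intervention": ids[:half], "control": ids[half:]}

# Example: 1689 submissions yield arms of 844 and 845, matching the trial's sizes.
arms = allocate(range(1689))
print(len(arms["intervention"]), len(arms["control"]))  # 844 845
```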

n = 394 intervention (NPG); n = 353 matching control (non-NPG)

Substantial improvement in reporting of risk of bias in in vivo research
Macleod and the NPQIP Collaborative Group, bioRxiv, Sep 2017, doi: 10.1101/187245

What have we learned?
- Simply asking authors to fill out a checklist is not sufficient to improve reporting.
- Focused attention on fewer key items appears more effective.
- Even engaged editorial attention does not lead to full compliance.
- Compliance improves over time.
- Reporting improvements ≠ study design improvements.
- Substantial ambiguity in checklist formulation.
- Ongoing next steps: editorial collaboration between journals to establish a minimal standard checklist as a foundation.

Open Science as a cross-disciplinary approach
Open Science, as defined by the availability of key elements of study design, raw results, and analysis methods, fosters transparency and facilitates replication:
- Data
- Code
- Methodologies
- Reagents

An example: the PLOS Data Availability Policy
- PLOS journals require authors to make all data underlying the findings described in their manuscript available at publication.
- Since March 2014, PLOS has published 87,000 articles with a Data Availability Statement describing compliance with this policy.
- Exceptions to public availability are made occasionally for ethical or legal reasons.
- 0.1% of submissions are rejected for unwillingness or inability to share data.

FAIR data principles
- Findable – unique persistent identifier, rich metadata
- Accessible – retrievable (incl. authorization where necessary)
- Interoperable – broadly applicable language, qualified references
- Reusable – metadata, usage license, provenance, domain-relevant community standards
The FAIR principles matter because open access to data is not always sufficient, and not always necessary, to address R&R challenges. (A sketch of a FAIR-aligned metadata record follows.)
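As a concrete illustration, here is a minimal sketch of what a FAIR-aligned metadata record for a deposited dataset might contain. The field names loosely follow common repository conventions (e.g., DataCite-style fields), and every identifier and value is hypothetical.

```python
# Hypothetical FAIR-aligned metadata record for a deposited dataset.
# Field names and values are illustrative, not a prescribed PLOS schema.
dataset_metadata = {
    # Findable: unique persistent identifier plus rich metadata
    "identifier": "doi:10.5061/dryad.xxxxx",  # hypothetical DOI
    "title": "Raw measurements underlying Figure 2",
    "keywords": ["reproducibility", "in vivo", "reporting"],
    # Accessible: retrievable via a standard protocol; authorization
    # requirements noted where access must be restricted
    "access_url": "https://repository.example.org/datasets/xxxxx",
    "access_rights": "open",
    # Interoperable: broadly applicable formats and qualified references
    "format": "text/csv",
    "related_identifiers": [
        {"relation": "IsSupplementTo",
         "identifier": "doi:10.1371/journal.pxxx.xxxxxxx"},  # hypothetical
    ],
    # Reusable: usage license, provenance, community standards
    "license": "CC0-1.0",
    "provenance": "Collected March-June 2015; see Methods for protocol",
}
```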

Underlying issues affecting reproducibility across disciplines

“Given finite resources, the importance placed on novel findings, and the emphasis on a relatively small number of publications, scientists wishing to accelerate their career progression should conduct a large number of exploratory studies, each of which will have low statistical power.”
Higginson and Munafo, PLOS Biology, 2016, doi: 10.1371/journal.pbio.2000995
(A worked power calculation after this slide illustrates this point.)

“As competition for jobs and promotions increases, the inflated value given to publishing in a small number of so-called “high impact” journals has put pressure on authors to rush into print, cut corners, exaggerate their findings, and overstate the significance of their work.”
Alberts, Kirschner, Tilghman and Varmus, PNAS 2014, vol. 111 (16), 5773, doi: 10.1073/pnas.1404402111
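To make the first quote concrete, the sketch below runs statsmodels' standard power calculation for a two-sample t-test; the effect size and per-group sample size are assumptions chosen to represent a typical small exploratory study.

```python
# Worked illustration of the Higginson & Munafo argument: a small
# exploratory study has low statistical power. Effect size and sample
# size are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().solve_power(
    effect_size=0.3,  # small-to-medium standardized effect (Cohen's d)
    nobs1=20,         # 20 subjects per group
    alpha=0.05,       # conventional two-sided significance threshold
)
print(f"Power: {power:.2f}")  # ~0.15: most true effects of this size are missed
```

With roughly 15% power, a researcher running many such studies will mostly produce false negatives and, among the "hits", inflated effect estimates, which is the career-versus-reliability trade-off the quote describes.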

Limitations of the scientific literature
- Insufficient distinction between exploratory and hypothesis-testing research, in particular in broadly defined biomedical research.
- Publication bias affects the ability to draw reliable conclusions about phenomena (the small simulation below illustrates the resulting inflation of published effects).
Publishing solutions exist to counter these limitations:
- Broad-scope, inclusive journals with selection criteria focused on rigor instead of perceived importance (PLOS ONE, Scientific Reports, …)
- Increasing momentum of preprint servers.
Better incentives are needed to promote best practices and prevent damaging behaviors.
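The following sketch simulates the publication-bias point above: when only statistically significant positive results get published, the published record overstates the true effect. The true effect, group size, and number of studies are all illustrative assumptions.

```python
# Simulate many small two-group studies and "publish" only significant
# positive results; the mean published effect far exceeds the true effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect, n, n_studies = 0.2, 20, 5000  # illustrative assumptions

published = []
for _ in range(n_studies):
    treated = rng.normal(true_effect, 1.0, n)
    control = rng.normal(0.0, 1.0, n)
    t, p = stats.ttest_ind(treated, control)
    if p < 0.05 and t > 0:  # selective publication of positive findings
        published.append(treated.mean() - control.mean())

print(f"True effect: {true_effect}")
print(f"Mean published effect: {np.mean(published):.2f}")  # roughly 0.7-0.8
```

A reader of only the published studies would estimate an effect several times larger than the truth, which is why venues that also surface null results, such as broad-scope journals and preprint servers, matter.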

In conclusion
- Open Science is a cross-disciplinary approach to tackle reproducibility and replicability at the publication level.
- Reporting transparency is a prerequisite across disciplines, but standards need to be defined with domain expertise.
- Journals' editorial policy interventions can help, but implementation is challenging.
- Better incentives are needed to promote open science, facilitate and reward replication and validation, counter publication bias, and improve rigor and reproducibility.
