
Data Management for Wits: Debate on reproducibility crisis

The following is general advice; data varies hugely between types of research and projects.

Reproducibility crisis

The reproducibility crisis:

Scientific reproducibility is critical for biomedical research as it enables us to advance science by building on previous results, helps ensure the success of increasingly expensive drug trials, and allows funding agencies to make informed decisions. However, there is a growing "crisis" of reproducibility, as evidenced by a recent Nature journal survey of more than 1,500 researchers that found that 70% of researchers were not able to replicate results from other research groups and more than 50% were not able to reproduce their own research results.


Is science really facing a reproducibility crisis, and do we need it to?

Efforts to improve the reproducibility and integrity of science are typically justified by a narrative of crisis, according to which most published results are unreliable due to growing problems with research and publication practices. This article provides an overview of recent evidence suggesting that this narrative is mistaken, and argues that a narrative of epochal changes and empowerment of scientists would be more accurate, inspiring, and compelling. Click here.


Is There a Reproducibility Crisis in Science?

Nature asked 1,576 scientists for their thoughts on reproducibility. Most agree that there's a 'crisis', and over 70% said they'd tried and failed to reproduce another group's experiments. Click here.


Principles and Guidelines for Reporting Preclinical Research

NIH held a joint workshop in June 2014 with the Nature Publishing Group and Science on the issue of reproducibility and rigor of research findings, with journal editors representing over 30 basic/preclinical science journals in which NIH-funded investigators have most often published. The workshop focused on identifying common opportunities in the scientific publishing arena to enhance rigor and further support research that is reproducible, robust, and transparent. Click here.


Reproducibility crisis in science or unrealistic expectations?

Science appears to be in a crisis caused by the failure to replicate published results, which is undermining confidence in the scientific literature. This reproducibility crisis is not only evident in large-scale replication efforts to evaluate studies from various laboratories, but also within laboratories themselves. The problem has been extensively discussed among the scientific community, as many scientists have had trouble with replication themselves. A recent survey of 1,576 researchers found that around 90% agreed that the reproducibility crisis is real. Click here.


Scientific productivity: An exploratory study of metrics and incentives

Competitive pressure to maximize the current bibliometric measures of productivity is jeopardizing the integrity of the scientific literature. Efforts are underway to address the 'reproducibility crisis' by encouraging the use of more rigorous, confirmatory methods. However, as long as productivity continues to be defined by the number of discoveries scientists publish, the impact factor of the journals they publish in, and the number of times their papers are cited, they will be reluctant to adopt high-quality methods and consistently conduct and publish confirmatory/replication studies. This exploratory study examined a sample of rigorous Phase II-IV clinical trials, including unpublished studies, to determine if more appropriate metrics and incentives can be developed. Click here.

Transparency in reporting NIH and Medical Journal Guidelines


 Extensive methods: hit a checklist; use a standard (community-based standards).

 Replicates:
  • Report how often each experiment was performed and under what range of conditions; distinguish between independent biological data points and technical replicates.
 Statistics:
  • Fully reported in the paper: N, definition of center, dispersion, and precision.
 Randomization:
  • Yes – state how; No – state why not.
 Blinding:
  • Yes – for group assignment and for outcome assessment.
 Sample-size estimation:
  • State whether it was computed, when, and by what statistical method; if NOT, state how the sample was arrived at. NO MAKING STUFF UP AFTERWARD (a minimal sketch follows this list).
 Inclusion and exclusion criteria:
  • State the criteria; include negative results; note any supplemental measurements that were not reported.
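The statistics and sample-size items above are the ones most often invented after the fact. Below is a minimal sketch, in Python, of what "computed up front" can look like for a simple two-group comparison: a power calculation with statsmodels, a recorded randomization, and reporting of N, center (mean), dispersion (SD), and precision (SEM). The effect size, alpha, power, and placeholder data are illustrative assumptions only, not recommendations; a biostatistician supplies the real values.

```python
# Minimal sketch: pre-specified sample-size estimation and full descriptive reporting
# for a two-group design. Effect size, alpha, and power are illustrative assumptions;
# get the real values from your biostatistician, FIRST.
import numpy as np
from statsmodels.stats.power import TTestIndPower

# 1. Sample-size estimation, computed BEFORE the experiment, with the method recorded.
power_analysis = TTestIndPower()
n_per_group = power_analysis.solve_power(effect_size=0.5,   # assumed Cohen's d
                                         alpha=0.05,        # two-sided significance level
                                         power=0.80)        # desired power
n_per_group = int(np.ceil(n_per_group))
print(f"Planned N per group (two-sample t-test): {n_per_group}")

# 2. Randomized group assignment, with the method stated (simple randomization here)
#    and the seed recorded so the assignment itself is reproducible.
rng = np.random.default_rng(seed=20240101)
subjects = [f"S{i:02d}" for i in range(1, 2 * n_per_group + 1)]
assignment = rng.permutation(["control", "treatment"] * n_per_group)

# 3. Fully reported statistics: N, center (mean), dispersion (SD), precision (SEM).
#    'measurements' stands in for independent biological replicates, not technical ones.
measurements = rng.normal(loc=10.0, scale=2.0, size=len(subjects))   # placeholder data
for group in ("control", "treatment"):
    values = measurements[assignment == group]
    sd = values.std(ddof=1)
    print(f"{group}: N={values.size}, mean={values.mean():.2f}, "
          f"SD={sd:.2f}, SEM={sd / np.sqrt(values.size):.2f}")
```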

SOPs / Good Clinical Practice / Ethics

 Yes, actually, you already do all this
 BUT
 No, really: disorganized researchers don't get published, because you won't finish your research.
 THINGS HAVE BECOME COMPUTATIONAL
 And you are not being illegal or unethical
 And practices have lagged behind
 You are using REDCap or other clinical data systems
 1. Guidelines
 You are using Zotero, and you are using a lab management system and/or electronic lab notebooks, or the OPEN SCIENCE FRAMEWORK
 2. Teaching modules
 3. Consult with a biostatistician FIRST

A side word on publication bibliometrics: THEY DON'T WORK

  But we keep using them anyway.
  So publish early and often, and please expect rejections. It's normal to be rejected from up to 5 journals, and review can often take up to a year.

Why WE hate supplemental material


 What does it MEAN? It varies between journals; supplemental material may:
 1. Not help, but give context.
 2. Help with understanding.
 3. Be extra, but not necessary for understanding the paper.
 4. Or be the data itself.
 The problems:
  • References, acknowledgements, and important methodologies are buried there.
  • It is not indexed, and its citations are not counted.
  • It is not copy-edited.
 PLEASE, AT LEAST PUT ALL REFERENCES IN A REFERENCE LIST.

Is there a crisis?

Maybe there is no crisis?

 Reproducibility is actually NOT what we do
 SCOOPED
 REUSE
 DATA PARASITES
 SHARING is CARING = preprint
 Some people are just criminal (especially people in developing countries)
 But it seems like maybe more, and NO, it is not the people; there are just fewer controls in developing-country science
 You just want to destroy me / my work / my field
 Data Scientist
 Science does not work that way

Well, it COULD.