Best practice for reproducible research

Reproducibility in research is the ability of a researcher to duplicate the results of another study with a degree of concordance that makes the original findings believable. The impact of irreproducible results was discussed in my last blog post.

Global investment in biomedical research exceeds US $100 billion annually and leads to major breakthroughs that pay dividends to human health. But not all research translates into benefit, however noble the researcher’s intention. Many studies legitimately test valid research hypotheses that might shed light on disease processes, yet prove unactionable. It’s been estimated that ~85% of research investment is wasted annually because of problems that could be corrected. More on reducing waste and increasing research value can be found in a Lancet series of five articles published in 2014.

Although deliberately falsifying or fabricating data is the most damaging form of misconduct, it is far less common than subtler misbehaviors that often fly under the radar. The ‘most worrying misbehaviors’ cited in a survey of ~1,300 scientists attending international research integrity conferences were protocol deviations, selective reporting of positive results, insufficient reporting of study flaws and limitations, quality assurance failures, poor mentoring of junior scientists, and turning a blind eye to the misconduct of co-workers.

While there are many drivers of the scientific culture that generates ‘sloppy science’, it is quite possible to remedy this by introducing practices that have worked for some scientific disciplines. For example, guidelines developed by the NCI-NHGRI Working Group on replication in genetic association studies have led to large-scale collaborations that have transformed genetic and molecular epidemiology from a field plagued by spurious associations into a highly credible one.

Many industries have standards that define best practice for the systems that ensure consistency, quality and integrity, together with procedures to ensure compliance.

The following list is by no means exhaustive, but outlines some of the ways to ensure reproducibility.

  • Schedule routine and informal ‘catch-ups’ between lab personnel. If done in the spirit of collaboration and good scientific citizenship, this will go a long way to encouraging honesty and admission of errors without fear of retribution. A major failure of lab heads is a ‘hands-off’ approach to problems, which fosters poor morale and can lead to ‘sloppy science’.
  • Develop a system of sample labeling, logging and storage organization. The aim should be accuracy rather than convenience or quick retrieval. Where multiple users access the same samples, a log should be maintained of where each sample came from, how old it is, and who did what to it (a minimal sketch of such a log follows this list). An open and transparent lab culture, good citizenship, and collaboration go a long way to ensuring quality research.
  • Standardize protocols. The old adage ‘if it ain’t broke don’t fix it’ has no better place than in lab-based research. Protocols should be standardized and followed to the letter. If there are deviations, they should be documented and reasons given. In the interest of reproducibility, it should be standard practice that another individual in the group repeats the experiment. Senior research staff with many years of valuable experience and ‘magical hands’ should be encouraged to do the final replication before publication. 
  • Don’t skimp on equipment maintenance. Equipment and key lab instruments such as pipettors should be calibrated on a regular schedule and documentation of this kept.
  • Good technique is everything. Correct use of instruments should be an integral part of training for junior lab members. I once could not understand why one person’s 50 μl volume was consistently ~10–15 μl more than it should be; it turned out he was a bit ‘heavy-handed’ with his pipettor. The simple ‘eye-ball’ test by experienced lab personnel will quickly identify sources of variability and experimental error.
  • Get rid of old and outdated reagents. Most research institutes and organizations have clear policies on this. Where there are budget constraints it can be tempting to push ‘old’ reagents beyond their ‘best by’ date to save money. This can cost more in the long term. 
  • Lab records and notebooks should be carefully and correctly filled out. At most institutions these are legal documents and cannot be removed from the lab. Electronic lab notebooks are on the rise, and allow protocols and methods to be aligned with the final published data when institutional audits are necessary. While random audits and spot-checks are commonplace in industry, they have yet to be widely implemented in academic institutions.
  • Report methods in detail. Where journals impose a word count, additional details of protocols and methods can be included in Supplementary Information, on which most journals place no word limits. This will go a long way to helping others replicate your experiments.
  • Publish negative findings. A considerable amount of blame for the bias in the scientific literature can be laid at the feet of journals that favor positive findings. It should also be the responsibility of investigators to adequately address unexpected negative findings, without trying to put a positive spin on them. One respondent to a recent survey by Nature reported that he had expected rejection of a manuscript outlining why a technique had failed, and suggested that the reviewers accepted his paper because he offered a solution to the problem. One of the best papers I wrote was a replication study using well-curated data from the largest multi-centre study to date, and a very robust analysis. We made good use of the Supplementary Information to detail all methods and results. Our findings refuted other positive claims, and an editorial by the journal confirmed that the study was sufficiently robust to conclude that our negative findings settled the question of clinical relevance.
  • Pre-register a priori hypotheses. This has been one of the most publicized recommendations for improving reproducibility. It involves lodging an a priori hypothesis and analysis plan with a third party before undertaking experiments, to guard against the temptation to cherry-pick potentially false-positive results that were not part of the pre-specified plan.
  • Use robust study designs and analysis approaches. Much has been said about this, and it continues to be one of the most vexing issues in both lab and computational research. Approximately 90% of respondents to the Nature survey ranked “more robust experimental design”, including blinding and experimental controls where possible, and “better statistics” above institutional incentives as ways to improve reproducibility (a small blinding sketch also follows this list).
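
To make the sample-logging idea concrete, here is a minimal sketch in Python of an append-only sample log. The file name and field names (sample_id, source, action, and so on) are my own illustrative assumptions, not a standard; adapt the schema to whatever your lab actually needs to track.

```python
# Minimal sketch of an append-only sample log.
# All file and field names below are illustrative, not a standard.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("sample_log.csv")
FIELDS = ["sample_id", "source", "date_received",
          "action", "performed_by", "date_of_action"]

def log_sample_event(sample_id, source, date_received, action, performed_by):
    """Append one event to the shared sample log; never overwrite history."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # header written once, on first use
        writer.writerow({
            "sample_id": sample_id,
            "source": source,
            "date_received": date_received,
            "action": action,
            "performed_by": performed_by,
            "date_of_action": date.today().isoformat(),
        })

# Example: record that a serum aliquot was thawed for an assay run.
log_sample_event("S-0042", "Clinic A, visit 2", "2015-03-10",
                 "thawed for ELISA", "J. Smith")
```

The append-only design matters: mistakes are corrected by adding a new entry rather than editing history, so the audit trail stays intact.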
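Blinding can also be made concrete in code. Below is a small sketch of blinded randomization under a fixed seed: samples are shuffled and allocated to groups, the person scoring the experiment sees only opaque codes, and the unblinding key is held by a third party until the pre-specified analysis is complete. All sample IDs, group labels and file names here are hypothetical.

```python
# Minimal sketch of blinded randomization; all names are hypothetical.
import csv
import random

samples = [f"S-{i:04d}" for i in range(1, 21)]   # 20 hypothetical samples
groups = ["treatment", "control"]

rng = random.Random(42)    # fixed seed so the allocation is auditable
shuffled = samples[:]
rng.shuffle(shuffled)

# Balanced allocation: alternate group labels through the shuffled list.
allocation = {s: groups[i % len(groups)] for i, s in enumerate(shuffled)}

# Unique blinded codes for whoever scores the experiment.
code_numbers = rng.sample(range(10000, 100000), len(samples))
codes = {s: f"X{n}" for s, n in zip(samples, code_numbers)}

# The unblinding key goes to a third party and stays sealed until
# the pre-specified analysis has been carried out.
with open("blinding_key.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample_id", "blind_code", "group"])
    for s in samples:
        writer.writerow([s, codes[s], allocation[s]])

# The analyst sees only the codes, with no link back to sample or group.
print(sorted(codes.values()))
```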

Scientific integrity and good practice in academic research are generally assumed to be the responsibility of individual lab heads and lead investigators, whose careers or funding are potentially at stake. Their failure can also tarnish the institute’s reputation, as funding bodies increasingly publicize institution-based summaries of overall funding and achievements. Good scientific practice should therefore be cultivated at the institutional level, and a system of guidance and compliance developed and formalized both within and across institutes. Ultimately there needs to be a complete shift in culture by all stakeholders, including investigators, institutions, funding bodies and journals, toward rewarding best practice.

“Academic metrics need to be devised that distinguish citations of discredited claims so that it is not more advantageous to state and retract a result than to make a solid discovery.” (Jan Conrad, Nature Comment, 2015)

At SugarApple Communications we can help you find the best way to communicate with your intended audience and assist with writing, editing and statistics. Get in touch today and let’s talk.
