Drug Discovery
References
1. Freedman, LP, Cockburn, IM, Simcoe, TS. 2015. The Economics of Reproducibility in Preclinical Research. PLoS Biol 13(6): e1002165. doi:10.1371/journal.pbio.1002165.
2. Errington, TM, Iorns, E, Gunn, W, Tan, FE, Lomax, J, Nosek, BA. 2014. An Open Investigation of the Reproducibility of Cancer Biology Research. eLife 3: e04333. doi:10.7554/eLife.04333.
3. Begley, CG and Ellis, LM. 2012. Drug Development: Raise Standards for Preclinical Cancer Research. Nature 483(7391): 531–33. doi:10.1038/483531a.
4. Prinz, F, Schlange, T, Asadullah, K. 2011. Believe It or Not: How Much Can We Rely on Published Data on Potential Drug Targets? Nat Rev Drug Discov 10(9): 712. doi:10.1038/nrd3439-c1.
5. Chroscinski, D, Cherukeri, S, Tan, F, Perfito, N, Lomax, J, Iorns, E. 2015. Registered Report: The Androgen Receptor Induces a Distinct Transcriptional Program in Castration-Resistant Prostate Cancer in Man. PeerJ 3: e1231. doi:10.7717/peerj.1231.
6. Shan, X, Danet-Desnoyers, G, Fung, JJ, Kosaka, AH, Tan, F, Perfito, N, Lomax, J, Iorns, E. 2015. Registered Report: Androgen Receptor Splice Variants Determine Taxane Sensitivity in Prostate Cancer. PeerJ 3: e1232. doi:10.7717/peerj.1232.
7. Iorns, E, Gunn, W, Erath, J, Rodriguez, A, Zhou, J, Benzinou, M, and The Reproducibility Initiative. 2014. Replication Attempt: “Effect of BMAP-28 Antimicrobial Peptides on Leishmania Major Promastigote and Amastigote Growth: Role of Leishmanolysin in Parasite Survival”. PLoS One 9(12): e114614. doi:10.1371/journal.pone.0114614.
Reports were submitted to eLife for peer review. The average time between initial submission and acceptance of each Registered Report was 92 (±17) days (Table 1). We identified 11 service providers to perform the experimental work for the seven studies. These labs performed between two and five experimental services for each replication study. There was often a delay between the time that the protocols were approved in peer review and when the replicating lab received all of the necessary materials and reagents to begin experimental work. Once experimental work began, projects lasted, on average, 192 (±84) days (approximately 6.5 months). Six of these replication studies included an in vivo experimental component. The greater than six-month experimental period for these seven replications included delays such as optimisation of the timing of IVIS imaging of tumours8 and confirmation of the expression of mutations9. The costs of the first five replications ranged from $11,700 to $65,940, averaging $33,700 (Table 1).
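The summary figures above are means with standard deviations. As a minimal sketch of how such a summary is computed — using hypothetical per-project durations, since the individual study timelines are not given in the text — the calculation is:

```python
from statistics import mean, stdev

# Hypothetical project durations in days (illustrative only; the article
# reports a mean of 192 days with a standard deviation of 84 days).
durations = [95, 140, 180, 210, 250, 310, 160]

avg = mean(durations)
sd = stdev(durations)  # sample standard deviation
print(f"mean = {avg:.0f} days, sd = {sd:.0f} days")
```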
Discussion and conclusions
While other authors have helped to sound the alarm about irreproducible research, the RP:CB project represents the first study of its kind to make replication results open to the scientific community. This open dataset allows each community member to re-analyse the data, evaluate the quality-control checks that were performed and vet the conclusions for themselves in a very tangible way. This degree of open access is unique for published research in the biological sciences.
It is important to highlight that these are the first of several replications to be published. The RP:CB Core Team will perform a meta-analysis of all of the final reports to identify the factors that are associated with both reproducible and irreproducible studies. Preliminary results suggest that reproducibility can be hampered by a lack of detail in the materials and methods of the original studies. For example, exact conditions for compound synthesis, precise protocols for xenograft injection, detailed methods for tissue homogenisation/extraction, conditions for western blotting and detailed conditions for reverse transcription-polymerase chain reaction (RT-PCR), if omitted from the original manuscript, must be left to the discretion of the replicating lab. Solutions to the time and barriers involved in conducting replications include making raw data and full protocols available with the publication, so that authors following up on a finding are not required to request this information. Likewise, depositing unique reagents that are not commercially available in repositories such as Addgene, JAX and ATCC would reduce barriers for follow-on studies as well as replication studies. Lastly, introducing efficiencies that speed up the peer review process can avoid delays during publication.
In addition to the forthcoming meta-analysis of factors contributing to reproducibility, RP:CB will publish a study highlighting the timing, costs and roadblocks encountered during the process for the entire project.
These studies show that the important work of replication needs to be done and should be an automatic response to any exciting new finding. It is clear from the first set of replications that the results were not always aligned with the original publications. Given that most of the high-impact studies included in the RP:CB likely represent many years of trial and error and protocol optimisation, the costs of conducting replications are a small fraction of the grant awards that likely fuelled the original papers. The costs associated with these replications are reasonable estimates for funding agencies to use for studies of this scope as they begin to make replication plans a required component of grant applications.

The NIH issued a notice in 2015 stating that the “NIH and AHRQ plans to require formal instruction in scientific rigour and transparency to enhance reproducibility for all individuals supported by institutional training grants, institutional career development awards, or individual fellowships” in early 2017 (NIH NOT-OD-16-034). This notice falls short of requiring evidence of independent replication for grant applications. For replication studies to become part of the framework of science, funding needs to be allocated so that principal investigators are able to include independent replication studies as part of their ongoing research programmes.

Finally, the ultimate result of irreproducible preclinical studies is the very low follow-on success of clinical trials in which human subjects volunteer to participate. The success rate of company-sponsored, FDA registration-enabling development programmes progressing from Phase I to FDA approval between 2006 and 2015 was only 9.6% across all disease areas, and for oncology the success rate was even lower (5.1%)10. This number is lower than in a previous report by Hay et al11, which measured a 10.6% overall success rate and 7% for oncology drugs.
The important and compelling by-product of more robust and reproducible pre-
Drug Discovery World Winter 2017/18