No raw data, no science: another possible source of the reproducibility crisis

Author : Tsuyoshi Miyakawa

A reproducibility crisis is a situation in which many scientific studies cannot be reproduced. Inappropriate scientific practices, such as HARKing, p-hacking, and the selective reporting of positive results, have been suggested as causes of irreproducibility. In this editorial, I propose that a lack of raw data, or data fabrication, is another possible cause.

As Editor-in-Chief of Molecular Brain, I have handled 180 manuscripts since early 2017 and have made 41 editorial decisions categorized as “Revise before review,” requesting that the authors provide raw data.

Surprisingly, among those 41 manuscripts, 21 were withdrawn without providing raw data, indicating that requiring raw data drove away more than half of the manuscripts. I rejected 19 out of the remaining 20 manuscripts because of insufficient raw data.

Thus, more than 97% of the 41 manuscripts did not present the raw data supporting their results when requested by an editor, suggesting that, at least in some of these cases, the raw data may never have existed.
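As a quick sanity check on the figures above, the editorial's reported counts can be tallied directly (a minimal sketch; the variable names are mine, not the author's):

```python
# Counts reported in the Miyakawa editorial for "Revise before review" decisions.
total_requested = 41   # manuscripts asked to provide raw data
withdrawn = 21         # withdrawn without providing raw data
rejected = 19          # rejected for insufficient raw data (of the remaining 20)

# Manuscripts that ultimately failed to supply adequate raw data.
failed = withdrawn + rejected
share = failed / total_requested
print(f"{failed}/{total_requested} = {share:.1%}")  # → 40/41 = 97.6%
```

This reproduces the "more than 97%" figure: 40 of the 41 manuscripts never supplied adequate raw data, with only a single manuscript surviving the request.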

Any scientific study should be grounded in raw data, and data storage space is no longer a serious constraint. Journals should therefore, in principle, require authors to deposit their raw data in a public database or on the journal's site upon publication, both to increase the reproducibility of published results and to strengthen public trust in science.

URL : No raw data, no science: another possible source of the reproducibility crisis

DOI : https://doi.org/10.1186/s13041-020-0552-2

Reproducible and transparent research practices in published neurology research

Authors : Trevor Torgerson, Austin L. Johnson, Jonathan Pollard, Daniel Tritz, Matt Vassar

Background

The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.

Methods

The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018.

A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols.

In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.

Results

Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Of these, 389 were accessible, and 271 contained empirical data suitable for analysis.

Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered.

A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.

Conclusions

Currently, published neurology research does not consistently provide information needed for reproducibility. The implications of poor research reporting can both affect patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.

URL : Reproducible and transparent research practices in published neurology research

DOI : https://doi.org/10.1186/s41073-020-0091-5


Rethinking the Journal Impact Factor and Publishing in the Digital Age

Authors : Mark S. Nestor, Daniel Fischer, David Arnold, Brian Berman, James Q. Del Rosso

Clinical and experimental literature search has changed significantly over the past few decades, and with it, the way in which we value information. Today, our need for immediate access to relevant and specific literature, regardless of specialty, has led to a growing demand for open access to publications.

The Journal Impact Factor (JIF) has been a long-time standard for representing the quality or “prestige” of a journal, but it appears to be losing its relevance. Here, we define the JIF and deconstruct its validity as a modern measure of a journal’s quality, discuss the current models of academic publication, including their advantages and shortcomings, and discuss the benefits and shortcomings of a variety of open-access models, including costs to the author.

We have quantified a nonsubscribed physician’s access to full articles associated with dermatologic disease and aesthetics cited on PubMed. For some of the most common dermatology conditions, 23.1% of citations (ranging from 17.2% for melasma to 31.9% for malignant melanoma) were available as free full articles, and for aesthetic procedures, 18.9% of citations (ranging from 11.9% for laser hair removal to 27.9% for botulinum toxin) were available as free full articles.

Finally, we discuss existing alternative metrics for measuring journal impact and propose the adoption of a superior publishing model, one that satisfies modern day standards of scholarly knowledge pursuit and dissemination of scholarly publications for dermatology and all of medical science.

URL : https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7028381/

Change in Format, Register and Narration Style in the Biomedical Literature: A 1948 Example

Authors : Carlo Galli, Stefano Guizzardi

Scientific communication has evolved over time and the formats of scientific writing, including its stylistic modules, have changed accordingly.

Research articles from the past were written for a research world not yet shaped by the internet, electronic searches, new media, or today's mass production of science; they reflect a time when scientific publications were designed to be read and appreciated by actual readers.

It is therefore useful to look back at what science looked like in the past and to examine the biomedical literature in older archives, because several features of those publications may harbor vital insights for today's communication.

Maintaining a vivid awareness of how the language and modalities of scientific communication have evolved may help academic writing progress steadily and improve in the years to come.

With this goal in mind, the present commentary set out to review a 1948 scientific report by I.L. Bennett Jr, entitled “A study on the relationship between the fevers caused by bacterial pyrogens and by the intravenous injection of the sterile exudates of acute inflammation”, which appeared in the Journal of Experimental Medicine in September 1948.

URL : Change in Format, Register and Narration Style in the Biomedical Literature: A 1948 Example

DOI : https://doi.org/10.3390/publications8010010

Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

Authors : J Michael Anderson, Andrew Niemann, Austin L Johnson, Courtney Cook, Daniel Tritz, Matt Vassar

Background

Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature.

Objective

This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating the presence of 8 indicators of reproducible and transparent research practices.

Methods

By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018.

After generating a list of eligible dermatology publications, we then searched for full text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form.

Results

After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 88.9%) did not provide the unmodified raw data used for their computations, 124 (97.6%) did not make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts.

Conclusions

The studies in our sample, published in dermatology journals, do not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted.

More robust reporting of key methodological details, open data sharing, and stricter journal-imposed standards for the disclosure of study materials might help improve the climate of reproducible research in dermatology.

URL : Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

DOI : https://doi.org/10.2196/16078

Determinants of Article Processing Charges for Medical Open Access Journals

Author : Sumiko Asai

For-profit subscription journal publishers have recently extended their publishing range from subscription journals to numerous open access journals, thereby strengthening their presence in the open access journal market.

This study estimates the article processing charges for 509 medical open access journals using a sample selection model to examine the determinants of the charges.

The results show that publisher type tends to determine both whether a journal charges an article processing charge and the level of that charge, and that frequently cited journals generally set higher article processing charges. Moreover, large subscription journal publishers tend to set higher article processing charges for their open access journals, even after controlling for other factors.

Therefore, it is necessary to continue monitoring their activities from the viewpoint of competition policy.

DOI : https://doi.org/10.3998/3336451.0022.103

bioRxiv: the preprint server for biology

Authors : Richard Sever, Ted Roeder, Samantha Hindle, Linda Sussman, Kevin-John Black, Janet Argentine, Wayne Manos, John R. Inglis

The traditional publication process delays dissemination of new research, often by months, sometimes by years. Preprint servers decouple dissemination of research papers from their evaluation and certification by journals, allowing researchers to share work immediately, receive feedback from a much larger audience, and provide evidence of productivity long before formal publication.

Launched in 2013 as a non-profit community service, the bioRxiv server has brought preprint practice to the life sciences and recently posted its 64,000th manuscript.

The server now receives more than four million views per month and hosts papers spanning all areas of biology. Initially dominated by evolutionary biology, genetics/genomics and computational biology, bioRxiv has been increasingly populated by papers in neuroscience, cell and developmental biology, and many other fields.

Changes in journal and funder policies that encourage preprint posting have helped drive adoption, as has the development of bioRxiv technologies that allow authors to transfer papers easily between the server and journals.

A bioRxiv user survey found that 42% of authors post their preprints prior to journal submission whereas 37% post concurrently with journal submission. Authors are motivated by a desire to share work early; they value the feedback they receive, and very rarely experience any negative consequences of preprint posting.

Rapid dissemination via bioRxiv is also encouraging new initiatives that experiment with the peer review process and the development of novel approaches to literature filtering and assessment.

URL : bioRxiv: the preprint server for biology

DOI : https://doi.org/10.1101/833400