Open in order to: Optimize the scientific research process

To coincide with the OpenCon London satellite event on 21st November, Know-Center, Digital Science, and ScienceOpen are excited to team up with Authorea to announce an essay competition for short blog posts on this year’s Open Access Week theme of “Open in order to …”. This theme is an invitation to answer the question of what concrete benefits can be realized by making scholarly research outputs openly available. “Open in order to…” serves as a prompt to move beyond talking about openness in itself and focus on what openness enables—in an individual discipline, at a particular institution, or in a specific context; then to take action to realize these benefits. The following essay is my submission to Authorea’s prompt.
Keywords: open access, open peer review, open science, open methods.
Originally posted on Authorea.

Often when we think about Science qua Science, it is an inspirational pursuit of knowledge and the foundation of human progress and prosperity. Looking at the advances in modern technology and medicine, it is truly amazing that this system of exploration has uncovered gravitational waves, decoded the human genome, and made possible the lives we live today.

While maintaining this sense of importance and awe, scientists themselves have a unique perspective that can contrast with that positive aesthetic, because of how research plays out day to day. The research process is a slow one, full of failing experiments, replicated studies, and detailed protocols that need development or optimization. However disheartening, this is an unavoidable part of how scientific discoveries are made. But science is not an isolated system, and researchers rely heavily on past bodies of work to glean insight into how a new system functions, follow up on past findings, or mimic a particular protocol. Researchers frequently read or skim tens to hundreds of articles while planning an experiment or writing a manuscript, so the quality of the published scholarly literature is the cornerstone of doing science. Making the techniques and data analysis procedures explicit, along with the subsequent critique from the larger scientific community (especially peer reviewer comments), would optimize the process of science itself at a granular but compoundingly massive scale.

Online journals have allowed for the acquisition of massive amounts of information and have opened space for extensive detail within a publication (however traditional), along with appended supplemental information, typically found separate from the main article. Despite this, many details and specifics about the scientific procedure remain lacking. Instead, supplemental sections typically contain peripheral experiments that lack the wow-factor to make it into the main body. These are important, but they do not always highlight methodology or data analysis procedures.

Methods sections themselves should describe the actions taken to investigate the system of interest. They need to address research design, experimental protocols, sample preparation, instrumentation, and data analysis procedures. Along with this, the justification and rationale for why these protocols and analyses were chosen should be included to fill in the context and relate them to the whole study.

The content should be as explicit as possible, such that another researcher can follow the instructions and, provided the method was fully transparent and both researchers performed it correctly, reach the same conclusions. If the data from two replicate experiments do not match, that does not signal inherent flaws in the enterprise of science; it means the system of interest may be more complicated than anticipated. In that case, there is an unknown part of the experiment that is not being controlled for, and luckily scientists are trained to be curious and determine what is going on. But the credibility of these further studies rests initially upon the transparency of the methodology.

Similarly, overt and visible critique by the wider scientific community of the methods, data, and analysis within a published product would further bolster the credibility and optimization of the scientific process. When determining whether a study is useful, convincing, and ultimately worthwhile, there is an element of critical thinking required on the reader’s part, but the backing of the rest of the community would further support that judgment. A metric commonly used to gauge the usefulness of a research study is its number of citations. Repeated citation is an inherent, informal form of peer review, as it suggests that the broader community finds the work valuable. But this indirect validation does not address any explicit critiques or impacts of the particular study that could inform subsequent experimentation.

Providing a space alongside the publication is a potential solution that allows for open, honest dialogue about the submitted manuscript. In the current infrastructure, this looks something like open peer review, where, along with the paper itself, readers could also see the experts’ initial evaluations of the manuscript and determine whether their critiques were addressed. This addition, which requires no extra labor on the reviewers’ end, allows scientists to analyze the paper themselves and gain additional insight into the work, further informing the future of their experimentation and of science itself. Exposing and adding transparency to the most fundamental processes of science – the methodology and the analytical critique – will serve key roles in bettering future research practices.

