Simply knowing about open science is not enough to prevent unethical research behavior

  • Open science awareness alone is not enough to create an ethical research climate, especially without the support of an adequate research ecosystem.
  • Publication pressures and quantitative incentives often lead to unethical research practices.
  • A change in research culture based on quality and transparency is needed to create a healthy research ecosystem.

Awareness of open science does not automatically translate into ethical research practice. A collaborative effort from researchers, educational institutions, and scientific publishers is needed to promote open science so that it can be applied more widely.

While researchers recognize the importance of open science, awareness alone is not enough. A 2024 study showed that putting open science into practice is genuinely challenging: many other factors shape whether it is adopted within a research ecosystem.

An analogy is the awareness that corruption is wrong. Simply knowing that corruption is a bad act does not, by itself, reduce corrupt behavior in society.

This awareness must also be accompanied by a change in research culture, one that values collaboration, transparency, and replication as much as publication.

The academic incentive system must likewise shift from judging reputation by quantity to rewarding quality, openness of methods, and verifiable scientific contributions.

Not just data transparency

Open science is not only a matter of pre-registration, research data transparency, open access, and the rejection of irresponsible research practices.

More fundamentally, open science is grounded in the values inherent in science itself. In this view, scientific knowledge rests on four core values: universalism, communalism, impartiality, and skepticism, or structured doubt.

These four values can be realized when researchers build open science practices into their research activities. Skepticism, for example, must be upheld because every theory has to be repeatedly tested through further research before it can be said to accurately reflect realities and dynamics in the field.

A 2011 study, for example, claimed to find evidence of precognition, the ability to perceive future events, in humans. Eight of the study's nine experiments showed statistically significant results (p < 0.05).

However, replication attempts by other researchers failed to confirm the findings, even though they used exactly the same procedure.

The discrepancy between the two studies suggests that the initial findings likely did not reflect a real phenomenon but were false positives in the 2011 study, arising from statistical flukes or weaknesses in the study design.

Initial studies can also contain analytical biases, such as choosing a particular statistical test after seeing the data or stopping data collection as soon as the result becomes significant. Such results create an illusion of validity. A failed replication is therefore not simply a methodological failure, but a signal that the initial findings are unstable and do not reflect empirical reality.
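
A minimal simulation sketch in Python can show how strong this effect is. The code below is hypothetical and not drawn from any of the studies mentioned above; it assumes a researcher studying an effect that is truly zero, who re-runs a t-test after every few new participants and stops as soon as p < 0.05:

    # Hypothetical sketch: how "optional stopping" inflates false positives
    # even when the true effect is exactly zero.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def false_positive_rate(n_sims=2000, n_start=10, n_max=100, step=5, alpha=0.05):
        """Fraction of null-effect experiments declared 'significant' when the
        researcher keeps adding participants and re-testing until p < alpha."""
        hits = 0
        for _ in range(n_sims):
            data = list(rng.normal(0.0, 1.0, n_start))   # true effect is zero
            while len(data) <= n_max:
                _, p = stats.ttest_1samp(data, 0.0)
                if p < alpha:                            # stop as soon as the test "works"
                    hits += 1
                    break
                data.extend(rng.normal(0.0, 1.0, step))  # otherwise collect more data
        return hits / n_sims

    print(f"Nominal alpha: 0.05 | observed false-positive rate: {false_positive_rate():.2f}")

Under these assumptions, the observed false-positive rate typically lands well above the nominal 5 percent. Pre-registering the sample size and analysis plan removes exactly this degree of freedom.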

In other words, open science aims to realize the values of universalism, communalism, impartiality, and skepticism in everyday research practice.

Trapped in the assessment system

Unfortunately, barriers to the implementation of open science also arise from scientific publishing bodies and research institutions, including universities.

In this context, researchers are both “victims” and “perpetrators.” They may understand the importance of open science, but they are not yet fully capable of implementing it consistently.

As victims, researchers are often held hostage by unwritten norms and rules that hinder the implementation of open science. One common ethical violation is data dredging, which involves inflating sample sizes or tweaking statistical models until the results support a hypothesis.

This practice thrives because scientific journals tend to publish research that reports significant findings or confirms a hypothesis, a phenomenon known as publication bias.

As a result, some researchers manipulate sample sizes or statistical models to avoid having their work rejected.
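
A related sketch, again hypothetical and using synthetic data rather than data from any real study, shows why dredging through many outcomes or model specifications almost guarantees a publishable-looking result:

    # Hypothetical sketch: testing many unrelated outcomes and keeping only
    # those with p < 0.05, even though no outcome is truly affected.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_participants, n_outcomes = 50, 20

    treatment = rng.normal(0.0, 1.0, (n_participants, n_outcomes))  # no real effect
    control = rng.normal(0.0, 1.0, (n_participants, n_outcomes))

    p_values = [stats.ttest_ind(treatment[:, k], control[:, k]).pvalue
                for k in range(n_outcomes)]
    significant = [k for k, p in enumerate(p_values) if p < 0.05]

    print(f"Outcomes with p < 0.05 despite zero true effect: {significant}")
    # With 20 independent tests at alpha = 0.05, the chance of at least one
    # false positive is 1 - 0.95**20, roughly 64 percent.

Reporting only the outcomes that happen to cross the threshold, and leaving the rest unmentioned, is precisely the kind of selective analysis that pre-registration and open data are meant to expose.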

Another barrier arises because many research institutions and universities require a certain number of published articles for promotion or incentives.

Promotion schemes and incentives like these push researchers toward quick and easy studies. A focus on research quantity (productivity), especially on speedy completion and concise publication, often comes at the expense of research quality and ethics.

As a result, researchers are expected to produce many studies in a short time, in the form of short, easy-to-digest articles (bite-size science).

The problem here is not the concise format of the articles, but the drive for rapid production behind them. That drive runs counter to the spirit of scientific credibility, which emphasizes thorough, well-documented research that is open to replication, and such research typically progresses more slowly.

What needs to change is not the length of articles but the orientation of the assessment system: away from rewarding many quick publications and toward encouraging more in-depth, transparent, and trustworthy research.

The main focus should not be the quantity of publications, but the contribution and impact of the research.

The importance of an open science research ecosystem

If the research ecosystem, both at the institutional and scientific publishing levels, does not provide adequate space for open science practices, then individual awareness of research ethics alone will not be enough.

We need fundamental changes in research norms and culture, including incentive and academic promotion systems based on research quality and the application of open science, rather than on the quantity of publications or citation metrics alone.

Mary E. Brunkow, a 2025 Nobel laureate in Physiology or Medicine, shows that the number of publications is not always proportional to their scientific value and usefulness.

With just 34 well-researched publications, Mary Brunkow has successfully transformed the way we understand the human immune system.

Unfortunately, the obsession with productivity actually drives some researchers to take shortcuts in pursuit of academic reputation.

Author Bio: Abdul Hadi is a Junior Researcher at Atma Jaya Catholic University of Indonesia
