The cobra effect: Indonesian lecturers' obsession with the Scopus index and questionable practices on the road to world-class universities


President Joko Widodo has often criticized the performance of the Ministry of Research, Technology and Higher Education (Kemristekdikti) regarding the competitiveness of Indonesian universities, which he considers unsatisfactory.

Most recently, last October, Jokowi wondered why only three Indonesian universities (perguruan tinggi, or PTs) had made it into the 2018 Quacquarelli Symonds (QS) world university rankings. The President questioned university management that was unable to respond to global demands.

In the midst of this push for university internationalization, the Research Ministry recently released findings on a number of publication ethics violations committed by researchers, managers of scientific periodicals, and state university (PTN) administrators.

The publication ethics violations uncovered include duplicate publication, unnatural self-citation, and policies of publishing scientific papers without a rigorous review process.

The Research Ministry's Credit Score Assessment Team (PAK), for example, found that one researcher had produced 69 scientific papers and accumulated 239 citations in a single year, raising the suspicion that the researcher was citing his own work in an unnatural way.

Lecturers committed these violations to 'polish' their performance records so they could go international.

Seeing this trend, I believe the Research Ministry's policies on university internationalization need to be re-evaluated.

Problematic ranking system

The QS ranking is actually only one of several global ranking systems the Research Ministry uses as a reference to evaluate PT performance. To date, I have not found any publicly accessible official policy document explaining why the ministry chose QS, rather than the Academic Ranking of World Universities (ARWU), Times Higher Education (THE), or Round University Rankings (RUR), as its performance reference.

QS compiles its ranking using a fairly complex method built on six assessment indicators. The two most heavily weighted are (1) academic peer review and (2) citations per faculty member. The other indicators are (3) the faculty-to-student ratio, (4) employer reputation, (5) the proportion of international students, and (6) the proportion of international faculty.

Even though QS's method is highly controversial and has drawn sharp criticism, the Research Ministry and several PTs use these indicators as the basis for their higher education development strategies.

Academic peer review is a problematic indicator to operationalize. To measure it, QS surveys university reputation by asking academics to name 10 domestic and 30 international universities they consider the most reputable in their own discipline. QS therefore ranks PTs not only overall, but also by specific discipline.

This method is questionable because the non-response rate is quite high. Among academic respondents from Asia, Australia, and New Zealand, for example, the survey response rate does not reach 50%.

In addition, respondents are given no information about the PTs they are evaluating, so the results may be contaminated by the halo effect, a bias in favor of big university names. Once again, this problematic method may reveal little about the real quality of higher education, instead reinforcing the suspicion that the measurement is contaminated with cognitive bias.

Crazy about Scopus

The odd part of this ranking is the citation-count component. QS draws its citation counts from Scopus, Elsevier's commercial bibliographic database. Universities compete to maximize the number of documents indexed and citations recorded in Scopus so their score on this indicator is as high as possible. This is why Indonesian academics have become "infatuated" with Scopus.

Not stopping there, the Research Ministry also created its own research performance and publication measurement system, the Science and Technology Index (SINTA) score. The SINTA score is calculated from the number of documents and citations recorded by Google Scholar and Scopus, but gives far greater weight to documents and citations recorded by Scopus.

Based on SINTA scores, the Research Ministry compiles rankings of individual researchers, PTs, and even scientific periodicals. The ministry's dependence on bibliometrics such as the h-index (a combined productivity and citation index) and SINTA scores may, in the long run, actually cause the development of science and innovation to stagnate.
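The h-index mentioned above has a simple definition: a researcher has index h if h of their papers each have at least h citations. A minimal Python sketch of that calculation (the function name and the citation counts in the example are illustrative, not taken from SINTA or Scopus):

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        # A paper contributes to h only if its citation count
        # is at least as large as its rank in the sorted list.
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts give h = 3,
# because three papers each have at least three citations.
print(h_index([10, 8, 3, 2, 1]))  # → 3
```

Because every citation counts the same toward this number, self-citations raise h just as effectively as citations from independent researchers, which is exactly the loophole discussed below.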

This is because the system does not actually encourage researchers to improve the quality of their work; instead, it breeds an excessive obsession with chasing promotions and incentives at the expense of their own integrity.

No wonder the German Research Foundation (DFG) has prohibited bibliometrics-based evaluation of researchers' performance since as early as 2013.

The cobra effect and unnatural citations

The cobra effect anecdote comes from the period of British colonial rule in India. The colonial government, concerned about the growing population of wild cobras in Delhi, offered a bounty to anyone who hunted one down. The strategy seemed successful at first, as people flocked to turn in dead cobras for the reward.

But the colonial government soon discovered that many residents were cheating, deliberately breeding cobras to collect the bounty, and the program was finally stopped. Shortly afterwards, the wild cobra population climbed dramatically, because no one wanted to hunt cobras anymore.

In the context of scientific publication, the cobra effect resembles the phenomenon of deliberate self-citation, which researchers use to boost their h-index. This makes the h-index give misleading information about a researcher's reputation, because citations should occur naturally.

The internationalization policy also leaves ample room for abuse of power. Managers of scientific periodicals are tempted to require authors to cite manuscripts previously published in their own journal. Lecturers are tempted to pressure students to list them as authors of scientific articles or to cite their writings, so that their h-index rises.

Although such practices are banned by the Committee on Publication Ethics (COPE), the world's largest association of editors and publishers of scientific periodicals, this kind of abuse of power is frequently found in various universities.

Nor is this a problem unique to Indonesia. Robert J. Sternberg, professor of developmental psychology at Cornell University, was recently publicly criticized for much the same problem. Remarkably, in one article Sternberg cited 351 references, 161 of which were his own writings. Some of Sternberg's papers were retracted from publication because he was found to have plagiarized his own previously published work.

As a result, Sternberg was forced to give up his position as editor-in-chief of Perspectives on Psychological Science, a well-known journal published by the Association for Psychological Science (APS). The embarrassing incident cost him face, even though Sternberg is an important figure behind two very popular theories: the triangular theory of love and the triarchic theory of intelligence.

The Research Ministry's findings on self-citation are an early symptom of the cobra effect produced by the ministry's own policies. Efforts to boost publication counts that depend too heavily on bibliometric measurement and incentives, whether promotions or additional income, actually open the door to the overjustification effect.

In social psychology, the overjustification effect occurs when individuals are motivated by extrinsic rewards to perform a task. As a result, they respond to the task with a highly efficient strategy, but the resulting performance tends to be poor. To keep individuals focused on the task, incentives must always be available; when the incentives disappear, so does their interest.

The Research Ministry should not be too surprised, then, if PTs and researchers do everything they can to look good. Too much focus on measurement makes us obsessed with cosmetics while forgetting more substantial matters, such as integrity and academic independence.

Open science as a solution

I believe the credibility and informational value of a study lie in the integrity of the researcher. Integrity shows in a researcher's commitment to conducting research transparently and openly.

For this reason, I propose an open science-based system of researcher performance appraisal, a practice that requires researchers to make all of their materials and research findings public resources. The Center for Open Science (COS), a non-profit organization that advocates open science practices, publishes the Transparency and Openness Promotion (TOP) Guidelines for researchers, universities, and managers of scientific periodicals who wish to adopt open science principles.

Under such a system, researchers who openly report potential conflicts of interest, ethical review results, initial assumptions including research hypotheses, data collection plans, planned sample sizes and stopping rules for data collection, participant recruitment strategies, raw data, and data analysis procedures would be rewarded more than researchers who decline to do so.

Open science-based policies have been a growing trend since the replication crisis became widely discussed in 2011. The European Union has gone even further, drawing up Plan S as a reference for evaluating the research it sponsors.

Several journals with high impact factors also award open science badges to researchers who preregister their studies and share their raw data and analysis procedures with the public.

I believe that for Indonesia's research ecosystem to become progressive and dynamic, researchers and PT managers must adopt the principle of transparency. At present, there are only two choices: open science, or be completely left behind.

Author Bio: Rizqy Amelia Zein is an Assistant Lecturer in Social and Personality Psychology at Airlangga University