Research: “publish or perish”, towards the end of a dogma?


Today, the value of a researcher is judged mainly by the number of citations his or her articles receive. In line with this vision of performance, a set of indices has been established to quantify a researcher's scientific productivity.

The best known of these was developed in 2005 by Jorge Hirsch: the h-index measures both the volume of publications and the number of citations per publication. A researcher has an h-index of h if h of his or her papers have each been cited at least h times.
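As an illustration, here is a minimal sketch of that computation in Python; the citation counts are invented for the example:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher's publications
print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3: three papers each cited at least 3 times
```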

Today it is the benchmark for rating a researcher on the world market. However, we are currently witnessing the proliferation of new tools that are shaking up the established order by looking more broadly at the influence of a publication, beyond the circle of researchers alone.

With these alternative measures (altmetrics), are we leaving the era of “publish or perish”, in which a researcher, to be recognized, must imperatively write in journals read only by peers?

The limits of the h-index

The Hirsch index is in fact criticized for imposing a narrow view of research: it counts as performance only what takes place within a closed circle of scientific journals read exclusively by peers.

Once it is set up as a governance system, the h-index also generates a set of significant biases.

An excellent publication certainly strengthens a researcher's career while reinforcing the prestige of his or her institution.

But the “race for the stars” – that is to say, the injunction made to researchers to publish ever more scientific articles in ever more elite journals – leads above all to an explosion in research budgets, owing to the inflation of teacher-researchers' salaries.

The “race for the stars” also leads to a decoupling between research and teaching (the best teacher-researchers being invested mainly in publication, to the detriment of pedagogy) and to a disconnect with companies (the way of thinking about and doing research in management being better suited to publishers than to managers). Ultimately, a significant number of schools and universities are backtracking, aware of this cost inflation and of the ultimately negligible impact of these scientific publications.

Alternative measures

The exhaustion of a growing number of players faced with the tyranny of the h-index, combined with the opportunities offered today by digital transformation, has favored the emergence of measures based on a more open approach. These altmetrics call for new media to be taken into account, with the aim of promoting research that speaks first and foremost to civil society.

A range of operational tools (Altmetric, ImpactStory, etc.) make it possible to follow the impact of a publication on social media (Twitter, LinkedIn, etc.), online media (The Conversation, etc.), scientific blogs (SciLogs, etc.), bibliographic databases (Google Scholar, Crossref, etc.), online reference managers (Mendeley, Zotero, etc.), and so on.

In short, they mobilize the full potential of Web 2.0 to evaluate the dissemination of a research work: number of views of a publication, downloads, comments, shares, mentions, “likes”, tags. The challenge is to measure the impact on a research institution's many stakeholders: decision-makers, managers, journalists and so on.
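To make this concrete, here is a minimal sketch of how such indicators can be retrieved programmatically for a given publication, assuming the public (rate-limited) Altmetric v1 endpoint; the field names are assumptions based on what the API exposed at the time of writing, and the DOI is only an example:

```python
import json
import urllib.request

def fetch_altmetrics(doi):
    """Query the public Altmetric API for a DOI and return its JSON record."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Example DOI: Hirsch's 2005 paper introducing the h-index
record = fetch_altmetrics("10.1073/pnas.0507655102")

# Field names are assumptions based on the v1 API's documented response
print("Altmetric score:", record.get("score"))
print("Tweets:", record.get("cited_by_tweeters_count"))
print("Mendeley readers:", record.get("readers", {}).get("mendeley"))
```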

The limits of altmetrics

The alternatives to the h-index are not beyond criticism, the main one being their inability to take into account the methodological quality of a scientific publication. Notoriety, engagement, buzz… none of these guarantees the excellence of a contribution.

Altmetrics could, on the other hand, favor a communicator to the detriment of an expert – perhaps not to the point of placing Kim Kardashian above Michael Porter, but the nature of the possible drifts is easy to grasp. Such a pitfall should nevertheless be put into perspective insofar as only content with a DOI (digital object identifier) is eligible for these monitoring measures.

We also know from actor-network theory (Akrich, Callon and Latour, 1988) that the production of scientific facts involves a stage of translation and controversy.

In this respect, the web constitutes an interesting passage point for observing and understanding the interactions that a publication gives rise to. Altmetrics, in other words, offer an alternative to traditional measures by capturing the shock waves that spread beyond academic journals.

Towards societal impact

The evolution of cybermetric statistics, together with the proliferation of articles, symposia and workshops advocating alternative measures, gives a fairly eloquent indication of the shift currently under way in research evaluation.

The large publishers have taken note. Nature, Springer, Wiley, Taylor & Francis and others have already incorporated these new indicators into their scientific journals through more comprehensive dashboards. More than ever, researchers will have to produce relevant scientific content that captures the interest of all stakeholders.

The societal impact of a publication will also become increasingly decisive in the long term. The ambition of the Business School Impact System (BSIS), initiated by the FNEGE and the EFMD, clearly fits into this approach: it is the first tool for assessing the impact of an academic institution that takes into account a set of crucial dimensions (financial, educational, intellectual, territorial, etc.).

The emergence of these new indicators will profoundly modify the process of knowledge creation (choice of themes, means of dissemination, mode of financing, operational purpose, etc.) and, as a result, bring a better understanding of the role and fundamental utility of the teacher-researcher.

Author Bios: Romain Zerbib is a researcher associated with the IMEO Chair, ESSEC, and Olivier Mamavi is a teacher-researcher in management at Propedia.
