
Excellence in science can’t be reduced to just numbers


Last month, Stanford University released a list of the world’s top 2% of most-cited scientists. While media coverage in India focused mainly on the number of Indian scientists (52) who made it to the list, the idea of ranking research output remains a controversial one in the scientific community, as it often degenerates into a debate over the quality of research and creativity.

To understand these rankings, one must understand what they measure and whether they correlate with what is perceived as scientific excellence. The rank assigned to scientists is based on how many other publications cite their work. More citations imply that a particular work was noticed by peers, and serve as a measure of its relevance and possible importance.

American information scientist Eugene Garfield pioneered the use of citations as the basis for ranking scientists when many new scholarly journals began publication after World War II. When journals turned digital in the 1990s, citation data became easily accessible and assigning citation-based ranks gained popularity. One such metric, suggested in 2005 by physicist Jorge Hirsch, is the h-index (Hirsch index), which indicates a scientist’s productivity and citation impact: a scientist has an h-index of h if h of their papers have each been cited at least h times. The Stanford rankings are based on a composite index that considers six different indicators, including the h-index.
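For readers who want to see the mechanics, here is a minimal sketch in Python of how an h-index could be computed from a list of per-paper citation counts; the numbers are illustrative and not drawn from the Stanford data.

# A scientist has an h-index of h if h of their papers have each been
# cited at least h times.
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Example: papers cited 10, 8, 5, 4 and 3 times give an h-index of 4,
# since four papers have at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # prints 4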

But are citations a good indicator of the quality of research? The evidence available is not compelling. The Nobel Prize in the sciences and the Fields Medal in mathematics are widely perceived as representing the highest standards of research excellence. The Stanford-led group examined the citations of 47 Nobel Laureates (2011-2015). Of them, only 15 would get the top rank if the criterion is the total number of citations; 18, if the h-index is used; and 37, if the composite index is used.

Since citation volumes and practices vary widely across fields, any mechanism that uses citations alone can be misleading. For instance, none of the 2022 Nobel laureates or Fields Medal winners secured a rank better than 1,000. Though one Fields Medallist entered the list at 1,023, many other winners did not figure on the list at all. Moreover, the top 500 ranks were primarily occupied by biomedical scientists, an understandable skew given the complete reliance on citation metrics. If this well-meaning attempt at quantifying research is fraught with such pitfalls, then we must be careful not to interpret the ranks as necessarily implying scientific excellence.

This exercise has also unwittingly brought to light other sordid issues, such as scientists artificially inflating citations or riding on excessive self-citations to game the system. The Stanford data set tries to mitigate this problem by providing a second ranking that disregards self-citations. Such fixes help somewhat in acting against unethical practices, but they cannot curb the inappropriate use of citation-based metrics.
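As a rough illustration of what disregarding self-citations involves, the toy sketch below (an assumption for clarity, not the Stanford group’s actual procedure) drops any citation whose citing paper shares an author with the cited paper before counting.

# Toy representation: each citation records the citing paper's authors,
# the cited paper's identifier and the cited paper's authors.
def non_self_citation_counts(citations):
    counts = {}
    for citing_authors, cited_id, cited_authors in citations:
        counts.setdefault(cited_id, 0)
        # Count the citation only if no author appears on both papers.
        if not set(citing_authors) & set(cited_authors):
            counts[cited_id] += 1
    return counts

# Hypothetical example: paper "P1" by A and B is cited twice, once by A
# (a self-citation, excluded) and once by C (counted).
example = [(["A"], "P1", ["A", "B"]), (["C"], "P1", ["A", "B"])]
print(non_self_citation_counts(example))  # prints {'P1': 1}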

Since 2005, many universities and funding agencies worldwide have been accused of using the h-index and journal impact factors to evaluate applicants for academic positions or research grants, instead of relying on critical evaluation by experts. In 2012, a group of journal editors and publishers initiated the San Francisco Declaration on Research Assessment, calling upon the community “to eliminate the use of journal-based metrics... in funding, appointment, and promotion considerations” and emphasised the “need to assess research on its own merits rather than on the basis of the journal in which the research is published”. In 2014, India’s department of science and technology supported this declaration. Quantitative assessment of research output cannot, on its own, measure scientific creativity. It is, at best, one of the many facets that contribute to a researcher’s profile. If statistical indicators were the only criteria for excellence, then Sachin Tendulkar, with a Test batting average of 53.78 and ranked 23rd on the list of highest career batting averages in Test matches, would not be celebrated as one of the greatest cricketers ever.

In science, as in sports, excellence cannot be reduced to just numbers. The Stanford group’s ranking list will be meaningful only if it is read keeping in mind the warning given by its authors: “All citation metrics have limitations and their use should be tempered and judicious”.

(Photo: AP) The Nobel Prize in the sciences and the Fields Medal in mathematics are widely perceived as representing the highest standards of research excellence. The Stanford-led group examined the citations of 47 Nobel Laureates (2011-15). Of them, only 15 would get the top rank if the criterion is the total number of citations.
MS Santhanam
