Misra, Wakhlu, and Agarwal: Letter to the Editor: Individual Researcher and Author Metrics: a Viewpoint from India
We read with interest the article by Gasparyan et al.1 discussing currently available author-level metrics. The authors discussed indices for evaluating the impact of research conducted and published by an individual author. Even today, many academics equate the value of research conducted by an individual with the Impact Factor (IF) of the journal where it has been published. In doing so, they fail to understand the limitations of citation metrics such as the IF,2 which are meant to reflect readership interest in the journal as a whole. Getting one's article into a high-impact journal does not mean that the article matches the level of that journal's average article. Instead, the number of citations garnered by the article over time reflects its true merit (not necessarily its scientific quality).
As the authors rightly pointed out,1 indices such as the h-index, which reflects the number of an author's articles that have each garnered at least that many citations, are increasingly recognized as valuable author-level metrics. The h-index can vary depending on the database from which it is derived, and the index derived from Google Scholar may be misleading.1 Users of Google Scholar may leave their account in the default mode of adding publications automatically, which can result in the unintentional inclusion of publications by other authors who share one or more initials with them, thereby distorting author-level metrics. In this context, however, it is important to understand that reliable subscription databases that provide the h-index, such as Scopus and Web of Science (WoS),1 are of limited accessibility to researchers from economically less-privileged regions of the world. Therefore, authors from these parts of the world often quote an h-index based on the freely available Google Scholar, and they are advised to mention in their CVs the source of the h-index calculation (Scopus, Google Scholar, or other) and the date when it was last updated.
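For illustration, the h-index can be computed from a list of per-article citation counts, as in the minimal Python sketch below; the counts used here are hypothetical and not drawn from any database.

def h_index(citations):
    # Largest h such that at least h articles have h or more citations each
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical example: 3 of these 6 articles have at least 3 citations each,
# but fewer than 4 have at least 4 citations, so the h-index is 3.
print(h_index([10, 5, 3, 2, 1, 0]))  # 3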
Another commonly used scholarly platform is ResearchGate.1 While it is rightly thought of as a social media interaction site for academia, one of the features it provides is author-level metrics. A useful feature is that ResearchGate additionally provides an h-index excluding self-citations, which may help screen out excessive self-citations that artificially inflate one's own h-index. Another important consideration is the ResearchGate Score (RG score), which some academics cite in their CVs. One must exercise caution when using this metric, since the basis for its calculation is proprietary to ResearchGate (i.e., not publicly available) and rests not only on publications and their popularity, but also on involvement in other social media activities such as asking and answering questions on the website.3
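As a minimal sketch of that self-citation-excluded variant (assuming per-article totals and self-citation counts are known; the numbers are hypothetical and this is not ResearchGate's own calculation):

def h_index(citations):
    # Largest h such that at least h articles have h or more citations each
    ranked = sorted(citations, reverse=True)
    return max([rank for rank, count in enumerate(ranked, 1) if count >= rank], default=0)

# (total citations, self-citations) per article -- hypothetical values
articles = [(10, 6), (5, 3), (3, 2), (2, 0), (1, 1)]
with_self = h_index([total for total, _ in articles])
without_self = h_index([total - self_cites for total, self_cites in articles])
print(with_self, without_self)  # 3 2 -- heavy self-citation inflated the index by one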
The use of author-level indices as criteria for academic promotions merits consideration. As an example, the prevalent criteria for the promotion of teachers in medical institutions in India, from assistant professor to associate professor or professor, require a certain number of years of experience as well as at least two publications at each stage as the first or corresponding author in indexed journals.4 There is a lack of clarity as to the required quality of these publications or the indexing agencies with which they should be indexed, and this is a matter of ongoing debate.5 We are of the opinion that, since the actual quality of published work is determined by the number of citations garnered over time, it might be worthwhile to include author-level metrics such as the h-index in such criteria. It might be reasonable to expect a Scopus-based h-index of at least 10 for promotion to associate professor and 15 for promotion to a full professorship. This suggestion, however, needs to be tempered with other considerations. First, there should be a policy at the national level mandating that universities and institutions enable greater access to Scopus, which is the largest citation database. In developing countries like India, the immense clinical patient load, the lack of universally available basic funding and facilities for quality research, and the poor demarcation and prioritization of careers into clinical work, teaching, research, or combinations thereof need to be considered; the pressure of measuring up to certain metrics in medical institutions in developing countries must be tempered by the working realities and priorities of the region. Second, there is an unmet need to raise awareness of such author- and researcher-level indices amongst academics, while warning them about the potential fallacies of a poorly maintained Google Scholar account (and the indices it yields) as well as over-reliance on indices generated by sites such as ResearchGate. Third, academics should be made aware of the need to accurately list the source (and therefore the reliability) of indices such as the h-index on their CVs. Another point worth considering is that not only research publications but also involvement in other scientific activities, such as peer reviewing manuscripts, is important in the overall academic development of a researcher; therefore, it might be worthwhile to devise and mention indices reflecting peer review activity, derived from websites such as Publons.

Notes

Disclosure The authors have no potential conflicts of interest to disclose.

Author Contributions

  • Conceptualization: Misra DP, Wakhlu A, Agarwal V.

  • Data curation: Misra DP, Wakhlu A, Agarwal V.

  • Formal analysis: Misra DP, Wakhlu A, Agarwal V.

  • Investigation: Misra DP, Wakhlu A, Agarwal V.

  • Writing - original draft: Misra DP, Wakhlu A, Agarwal V.

  • Writing - review & editing: Misra DP, Wakhlu A, Agarwal V.

References

1. Gasparyan AY, Yessirkepov M, Duisenova A, Trukhachev VI, Kostyukova EI, Kitas GD. Researcher and author impact metrics: variety, value, and context. J Korean Med Sci. 2018; 33(18):e139.
2. Bornmann L, Marx W, Gasparyan AY, Kitas GD. Diversity, value and limitations of the journal impact factor and alternative metrics. Rheumatol Int. 2012; 32(7):1861–1867.
3. Orduna-Malea E, Martín-Martín A, Thelwall M, Delgado López-Cózar E. Do ResearchGate Scores create ghost academic reputations? Scientometrics. 2017; 112(1):443–460.
4. Medical Council of India. Minimum qualifications for teachers in medical institutions: gazette notification. Updated 2017. Accessed April 22, 2018. https://www.mciindia.org/documents/e_Gazette_Amendments/TEQ-08.06.2017.pdf.
5. Bhaskar SB. The mandatory regulations from the Medical Council of India: Facts, opinions and prejudices. Indian J Anaesth. 2016; 60(11):793–795.
ORCID iDs

Durga Prasanna Misra
https://orcid.org/0000-0002-5035-7396

Anupam Wakhlu
https://orcid.org/0000-0003-4342-9547

Vikas Agarwal
https://orcid.org/0000-0002-4508-1233
