Citations and Gamed Metrics: Academic Integrity Lost

Jaime A. Teixeira da Silva

Jaime A. Teixeira da Silva is an independent researcher in Miki-cho, Japan; [email protected]. The author thanks Prof. Panagiotis Tsigaris, Thompson Rivers University, Canada, for his valuable insights and feedback.


Author-based metrics (ABMs) and journal-based metrics (JBMs), as well as citations, are the dominant currency of intellectual recognition in academia. These metrics and citations continue to be dominated by the Clarivate Analytics Journal Impact Factor (JIF) and are still widely used as surrogates for prestige, scientific credit, and influence.1 However, their dominance needs to be rethought, especially in China, which accounts for approximately 19 percent of scientific research papers globally.2 China, like several other countries, games the JIF for financial compensation in pay-for-publication reward schemes based on JIF-carrying journals.3 A competing metric, CiteScore, is also showing signs of being gamed.4 CiteScore may have evolved, following Goodhart’s Law, as the JIF’s relevance as an “academic” metric began to fray.5 The culture of using metrics, especially the JIF, to measure success, reputation, and productivity, and JIF-based praise platforms such as Clarivate Analytics’ “Highly Cited Researchers” report, inflate the problem because they elevate the perceived excellence of single individuals above all others, when in fact most publishing success is collaborative.6 The survival of many academics and journals in this publish-or-perish culture depends on such metrics, as authors seek journals with higher JIFs in which to publish their findings. In the gold open access model, this can result in higher article processing charges, which may be inflated to meet market demand.7 Journals can even boost their chances of raising their JBMs/JIF by encouraging or selecting the types of contributions that count towards the JIF, by manipulating titles, or by favoring highly cited authors or topics in their reference lists.8 Since federal or public funding is often based on the ranking of journals, the abuse of ABMs and JBMs goes beyond a mere lack of professionalism. Not only does JBM manipulation constitute academic misconduct, but there is also a widening discussion about the need to criminalize journals that game their JIF.9

Under which circumstances is self-citation by authors or journals considered excessive? This question lies at the core of whether the use of a citation is valid. Generally, if a citation is used in a scholarly manner to support a statement or a fact, then it is a valid academic citation. A reference that does not support a stated fact, or that is used frivolously to support an unrelated statement, is an invalid reference because its purpose is not scholarly. Moreover, if it is stacked alongside an excessive number of other references, it is an unscholarly and meaningless reference because its relevance is diluted. In a “vanity” publishing culture whose ideology is based on gamed metrics, however, getting ahead of the competition may involve coercive citation or the manipulation of citations to seek a rank higher than the competition’s, at the individual (ABM) or journal (JBM) level. Such actions may distort metrics within an entire field of study.10 Citation manipulation, which is not a new phenomenon, can be modelled,11 much as co-authorship and citation networks are.12 Such mapping allows author-author, author-editor, editor-editor, author-journal, and editor-journal relationships to be appreciated and quantified.13
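As a rough illustration of the kind of mapping described above, the sketch below builds a small directed author-to-author citation network and tallies how often each author cites every other author (and themselves). It is a minimal sketch using the networkx graph library and invented data; the author labels and counts are hypothetical and serve only to show how such relationships can be quantified.

```python
# Minimal sketch (hypothetical data): quantifying author-to-author citation
# relationships as a weighted, directed network.
from collections import Counter

import networkx as nx

# Each tuple is (citing_author, cited_author) for one reference in one paper.
# These names and counts are invented for illustration only.
citations = [
    ("A", "B"), ("A", "B"), ("A", "A"),   # A cites B twice and self-cites once
    ("B", "A"), ("B", "A"), ("B", "C"),
    ("C", "A"),
]

# Count how many times each ordered pair occurs, then build a weighted digraph.
edge_weights = Counter(citations)
G = nx.DiGraph()
for (citing, cited), weight in edge_weights.items():
    G.add_edge(citing, cited, weight=weight)

# Quantify relationships: citations sent, citations received, self-citations.
for author in G.nodes:
    sent = sum(d["weight"] for _, _, d in G.out_edges(author, data=True))
    received = sum(d["weight"] for _, _, d in G.in_edges(author, data=True))
    self_cites = edge_weights.get((author, author), 0)
    print(f"{author}: gives {sent} citation(s), receives {received}, "
          f"{self_cites} self-citation(s)")
```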

In some cases, author groups collude to cite each other, forming citation rings, also referred to as citation cartels, with the explicit purpose of boosting ABMs, such as the Hirsch (h)-index.14 Fister et al. defined a citation cartel slightly differently, as “groups of authors that cite each other disproportionately more than they do other groups of authors that work on the same subject.” The term “cartel” carries a negative connotation, of an anti-competitive nature, that may imply market control. Such collusion for the purpose of citation manipulation is not restricted to authors. Peer reviewers might ask authors to cite the reviewers’ work excessively, or editors might ask authors to include citations to their journal, in order to boost ABMs or JBMs.15 An analysis of peer review manipulation in Elsevier journals found citation manipulation by 433 reviewers out of a pool of 54,821 (0.79 percent).16 Editors might also engage in author coercion,17 requesting that authors excessively cite the editors’ own papers,18 thereby inflating the h-index, the largest number h such that h of an academic’s papers have each been cited at least h times (commonly calculated from Google Scholar citation counts).19 Ultimately, such actions can also distort (inflate) the metrics of the journals in which that editor’s papers were published.
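To make concrete how a handful of coerced citations can shift such a metric, the sketch below computes the standard h-index from a list of per-paper citation counts and then recomputes it after a few extra citations, hypothetically coerced, are added to papers sitting just below the threshold. The citation counts are invented for illustration.

```python
# Minimal sketch: the h-index is the largest h such that h papers each have
# at least h citations. The counts below are invented for illustration.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

papers = [12, 9, 7, 5, 5, 4, 4, 3, 1, 0]      # hypothetical citation counts
print(h_index(papers))                         # prints 5

# A few coerced citations aimed at papers just below the threshold are
# enough to nudge the metric upward.
inflated = [12, 9, 7, 6, 6, 6, 4, 3, 1, 0]     # three papers gain citations
print(h_index(inflated))                       # prints 6
```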

There is not only a manipulative component to such requests and practices. There is also an unethical component, because they are, or may be perceived to be, unfair, and because excessive self-citation rates—for example, 40-67 percent in twenty-eight dental journals indexed in Clarivate Analytics’ Journal Citation Reports (JCR) from 1997-2016,20 or ranging from 4 to 54 percent in ten plastic surgery journals indexed in JCR from 2009-2015—can bloat JIFs and the perception of a journal’s importance.21 In June 2020, due to excessive self-citation, JCR suppressed the JIFs of several journals and issued editorial expressions of concern for several others,22 highlighting the risks that journals take when they employ JBMs as a performance rating.23
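As a rough illustration of how self-citation can bloat a JIF-style ratio, the sketch below computes a two-year impact factor (citations in one year to items published in the previous two years, divided by the number of citable items from those years) with and without the journal’s own citations. All numbers are invented; the JIF’s actual counting rules (which items count as citable, which citing sources are indexed) are more involved than this.

```python
# Minimal sketch (hypothetical numbers): a two-year impact-factor-style ratio
# computed with and without the journal's citations to its own recent items.
citable_items = 100 + 120          # items published in the two prior years
external_citations = 180           # citations from other journals to those items
journal_self_citations = 90        # the journal citing its own recent items

jif_with_self = (external_citations + journal_self_citations) / citable_items
jif_without_self = external_citations / citable_items

print(f"JIF counting self-citations:  {jif_with_self:.2f}")    # 1.23
print(f"JIF excluding self-citations: {jif_without_self:.2f}")  # 0.82
```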

Some algorithms exist to detect citation manipulation or citation cartels, but these need to be refined and applied to large databases such as PubMed, Scopus, or CNKI. Authors or journals found to engage in citation boosting or in other coercive or manipulative actions to fortify ABMs, JBMs, or citation rates should be punished, whether through publishing bans, blacklisting, removal from editorial boards, or delisting from indexing services or databases, since such actions can constitute academic misconduct.
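In the spirit of the detection algorithms mentioned above, the sketch below flags author pairs whose mutual citations make up a disproportionately large share of each author’s outgoing references, which is one simple signal of a possible citation ring. The data, the 25 percent threshold, and the author labels are hypothetical choices made here for illustration; real detectors operate on far larger networks and richer signals.

```python
# Minimal sketch (hypothetical data and threshold): flag author pairs whose
# reciprocal citations dominate each author's outgoing references.
from collections import Counter
from itertools import combinations

# (citing_author, cited_author) pairs; invented for illustration.
citations = (
    [("A", "B")] * 8 + [("B", "A")] * 7 + [("A", "C")] * 2
    + [("B", "C")] * 1 + [("C", "D")] * 5 + [("D", "E")] * 4
)

edge = Counter(citations)
out_total = Counter(citing for citing, _ in citations)

THRESHOLD = 0.25  # arbitrary: flag if >25% of each author's citations go to the other

authors = set(out_total) | {cited for _, cited in citations}
for a, b in combinations(sorted(authors), 2):
    if out_total[a] and out_total[b]:
        share_ab = edge[(a, b)] / out_total[a]   # share of a's citations going to b
        share_ba = edge[(b, a)] / out_total[b]   # share of b's citations going to a
        if share_ab > THRESHOLD and share_ba > THRESHOLD:
            print(f"Possible citation ring: {a} <-> {b} "
                  f"({share_ab:.0%} and {share_ba:.0%} of their outgoing citations)")
```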

Increasingly, JIF-carrying journals are having papers retracted. Journals that benefitted from citations to their retracted papers need to have their JIF, or other JBMs, adjusted downwards. A self-citation index would adjust the h-index downward to discount the effect of self-citations on ABMs. However, that concept and its application may be fundamentally flawed if it fails to distinguish valid self-citations (i.e., citations that support a statement) from invalid ones (i.e., citations whose only purpose is to inflate an ABM). Currently, ABMs and JBMs can be adjusted downward to account for self-citation, and theoretically it would be possible to adjust these metrics to accommodate retractions as well. What is now needed is a way to adjust ABMs and JBMs downward to accommodate self-citations and retractions while simultaneously emphasizing article-based metrics24 rather than JBMs. If JCR and journals that carry JBMs like the JIF or CiteScore were willing to play the metrics game by these new rules, then perhaps there would be a more conscientious use of metrics, a closer scrutiny (by peer reviewers and editors) of cited references, a more responsible use of ABMs and JBMs without the need for a Declaration on Research Assessment,25 and ultimately more self-conscious and ethical publishing.26
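A minimal sketch of the kind of downward adjustment described above: recompute the h-index after stripping out self-citations and discounting retracted papers entirely. The per-paper records, field names, and values are invented for illustration and do not reflect any established index.

```python
# Minimal sketch (hypothetical records): an h-index recomputed after removing
# self-citations and excluding retracted papers.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    return max((rank for rank, c in enumerate(counts, 1) if c >= rank), default=0)

# Each record: total citations, how many of those are self-citations,
# and whether the paper has been retracted. Values are invented.
papers = [
    {"citations": 30, "self": 6, "retracted": False},
    {"citations": 12, "self": 5, "retracted": False},
    {"citations": 10, "self": 2, "retracted": True},
    {"citations": 7,  "self": 4, "retracted": False},
    {"citations": 5,  "self": 0, "retracted": False},
    {"citations": 4,  "self": 3, "retracted": False},
]

raw = h_index([p["citations"] for p in papers])
adjusted = h_index([p["citations"] - p["self"]
                    for p in papers if not p["retracted"]])

print(f"Raw h-index:      {raw}")        # prints 5
print(f"Adjusted h-index: {adjusted}")   # prints 3
```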

 
 

1 Erin C. McKiernan, Lesley A. Schimanski, Carol Muñoz Nieves, Lisa Matthias, Meredith T. Niles, Juan P. Alperin, “Use of the Journal Impact Factor in Academic Review, Promotion, and Tenure Evaluations,” eLife 8 (2019): e47338.

2 Rob Johnson, Anthony Watkinson, Michael Mabe, “The STM Report: An Overview of Scientific and Scholarly Publishing,” 5th edition, International Association of Scientific, Technical and Medical Publishers, The Hague, The Netherlands, 213 (October 2018). 

3 Jaime A. Teixeira da Silva, “Does China Need to Rethink its Metrics—and Citation-Based Research Rewards Policies?” Scientometrics 112, no. 3 (2017): 1853-1857.

4 Jaime A. Teixeira da Silva, “CiteScore: Advances, Evolution, Applications, and Limitations,” Publishing Research Quarterly 36, no. 3 (2020): 459-468.

5 Michael Fire, Carlos Guestrin, “Over-optimization of Academic Publishing Metrics: Observing Goodhart's Law in Action.” GigaScience 8, no. 6 (2019): giz053.

6 Jaime A. Teixeira da Silva, Sylvain Bernès, “Clarivate Analytics: Continued Omnia Vanitas Impact Factor Culture,” Science and Engineering Ethics 24, no. 1 (2018): 291-297.

7 Shaun Yon-Seng Khoo, “Article Processing Charge Hyperinflation and Price Insensitivity: An Open Access Sequel to the Serials Crisis.” LIBER Quarterly 29, no. 1 (2019): 1-18.

8 Johno Breeze, “Dispatched From the Editor in Chief: Does the Impact Factor Have Any Real Relevance to Our Military Health Journal?” BMJ Military Health 165, no. 5 (2019): 307-309.

9 Charles F. Hickman, Eric A. Fong, Allen W. Wilhite, Yeolan Lee, “Academic Misconduct and Criminal Liability: Manipulating Academic Journal Impact Factors.” Science and Public Policy 46, no. 5 (2019): 661-667.

10 Deepak Juyal et al. “Impact Factor: Mutation, Manipulation, and Distortion,” Journal of Family Medicine and Primary Care 8, no. 11 (2019): 3475-3479.

11 S.R. Goldberg, H. Anthony, T.S. Evans, “Modelling Citation Networks.” Scientometrics 105, no. 3 (2015): 1577-1604.

12 Ludo Waltman, Nees Jan van Eck, Ed C.M. Noyons, “A Unified Approach to Mapping and Clustering of Bibliometric Networks,” Journal of Informetrics 4, no. 4 (2010): 629-635.

13 Aurora González-Teruel et al. “Mapping Recent Information Behavior Research: An Analysis of Co-Authorship and Co-Citation Networks,” Scientometrics 103, no. 2 (2015): 687-705.

14 Christoph Bartneck, Servaas Kokkelmans, “Detecting h-Index Manipulation Through Self-Citation Analysis,” Scientometrics 87, no. 1 (2011): 85-98.

15 Brett D. Thombs et al. “Potentially Coercive Self-Citation by Peer Reviewers: A Cross-Sectional Study,” Journal of Psychosomatic Research 78, no. 1 (2015): 1-6; Jonathan D. Wren, Alfonso Valencia, Janet Kelso, “Reviewer-Coerced Citation: Case Report, Update on Journal Policy and Suggestions for Future Prevention,” Bioinformatics 35, no. 18 (2019): 3217-3218.

16 Jeroen Baas, Catriona Fennell, “When Peer Reviewers Go Rogue: Estimated Prevalence of Citation Manipulation by Reviewers Based on the Citation Patterns of 69,000 Reviewers,” ISSI 2019, September 2-5, 2019, Rome, Italy.

17 Mark Chaplain, Denise Kirschner, Yoh Iwasa, “JTB Editorial Malpractice: A Case Report.” Journal of Theoretical Biology 488 (2020): 110171.

18 Richard Van Noorden, “Highly Cited Researcher Banned from Journal Board for Citation Abuse.” Nature 578 (2020): 200-201.

19 Claudiu Herteliu et al. “Quantitative and Qualitative Analysis of Editor Behavior Through Potentially Coercive Citations,” Publications 5, no. 2 (2017): 15.

20 Konstantina Delli, Christos Livas, Pieter U. Dijkstra, “How Has the Dental Literature Evolved Over Time? Analyzing 20 Years of Journal Self-Citation Rates and Impact Factors,” Acta Odontologica Scandinavica 78, no. 3 (2020): 223-228.

21 Shimpei Miyamoto, “Self-citation Rate and Impact Factor in the Field of Plastic and Reconstructive Surgery,” Journal of Plastic Surgery and Hand Surgery 52, no. 1 (2018): 40-46.

22 Clarivate Analytics’ Journal Citation Reports, http://jcr.help.clarivate.com/Content/title-suppressions.htm and http://jcr.help.clarivate.com/Content/editorial-expression-concern.htm (last accessed: September 10, 2020).

23 Clyde W. Holsapple, “Journal Self-Citation II: The Quest for High Impact: Truth and Consequences?” Communications of the Association for Information Systems 25 (2009): 2.

24 B. Ian Hutchins et al. “Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level.” PLoS Biology 14, no. 9 (2016): e1002541.

25 San Francisco Declaration on Research Assessment, https://sfdora.org/ (last accessed: September 10, 2020).

26 John P.A. Ioannidis, Brett D. Thombs, “A User's Guide to Inflated and Manipulated Impact Factors.” European Journal of Clinical Investigation 49, no. 9 (2019): e13151.


Image: Campaign Creators, Public Domain

Recommended citation: Jaime A. Teixeira da Silva, “Citations and Gamed Metrics: Academic Integrity Lost,” Academic Questions 34, no. 1 (Spring 2021): 96-99.
