Inflated citations and metrics of journals discontinued from Scopus for publication concerns: the GhoS(t)copus Project
The citation counts of journals discontinued from Scopus for publication concerns keep increasing after discontinuation, and predatory behaviors appear to be common among these journals. This paradoxical trend can inflate scholars’ metrics, prompting artificial career advancement, bonuses, and promotions. Countermeasures should be taken urgently to ensure the reliability of Scopus metrics at both the journal and author level for the purpose of scientific assessment of scholarly publishing.
Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations' COCI: A Multidisciplinary Comparison of Coverage via Citations
New sources of citation data have recently become available. Although these have been compared to the Web of Science (WoS), Scopus, or Google Scholar, there is no systematic evidence of their differences across subject categories. In response, this paper investigates citations found by these data sources to English-language highly-cited documents published in 2006 from 252 subject categories, expanding and updating the largest previous study.
Citations Systematically Misrepresent the Quality and Impact of Research Articles: Survey and Experimental Evidence from Thousands of Citers
Citations are ubiquitous in evaluating research, but how exactly they relate to what they are thought to measure is unclear. This article investigates the relationships between citations, quality, and impact using a survey with an embedded experiment.
This paper presents a simple model of the lifecycle of scientific ideas that points to changes in scientist incentives as the cause of scientific stagnation. It explores ways to broaden how scientific productivity is measured and rewarded: academic search engines such as Google Scholar could measure which contributions explore newer ideas, and university administrators and funding agencies could use these new metrics in research evaluation.
Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based changes are necessary to ensure that the data yields an accurate picture. One problematic area is the handling of self-citations.
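The self-citation problem can be made concrete with the h-index, the largest h such that an author has h papers with at least h citations each: excluding self-citations can visibly shift the value. A minimal sketch (the per-paper counts below are invented for illustration, not taken from any real author):

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author: (total citations, self-citations) per paper.
papers = [(25, 6), (18, 5), (12, 4), (9, 4), (7, 3), (4, 2)]

with_self = h_index([total for total, own in papers])
without_self = h_index([total - own for total, own in papers])

print(with_self, without_self)  # → 5 4
```

Even this toy profile drops from h = 5 to h = 4 once self-citations are removed, which is why transparent handling of self-citations matters for any metric built on raw counts.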
Inferring the Causal Effect of Journals on Citations
Articles in high-impact journals are by definition more highly cited on average. But are they cited more often because the articles are somehow "better"? Or are they cited more often simply because they appeared in a high-impact journal?
Recognizing the world's most influential researchers of the past decade, as demonstrated by the production of multiple highly cited papers that rank in the top 1% by citations for field and year in Web of Science.
Citecorp: Working with Open Citations - rOpenSci - Open Tools for Open Science
citecorp is a new (hit CRAN in late August) R package for working with data from the OpenCitations Corpus (OCC). OpenCitations, run by David Shotton and Silvio Peroni, houses the OCC, an open repository of scholarly citation data under the very open CC0 license. The I4OC (Initiative for Open Citations) is a collaboration between many parties, with the aim of promoting "unrestricted availability of scholarly citation data". Citation data is available through Crossref, and available in R via our packages rcrossref, fulltext and crminer.
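Outside R, the same COCI data can be reached over OpenCitations' public REST API. A hedged Python sketch, assuming the COCI citation-count endpoint path as documented by OpenCitations (the DOI is purely illustrative, and the live network call is left commented out):

```python
import json
from urllib.request import urlopen

COCI_API = "https://opencitations.net/index/coci/api/v1"

def citation_count_url(doi):
    """Build the COCI citation-count endpoint URL for a DOI."""
    return f"{COCI_API}/citation-count/{doi}"

def parse_count(payload):
    """COCI returns a one-element JSON array like [{"count": "42"}]."""
    records = json.loads(payload)
    return int(records[0]["count"]) if records else 0

# Live lookup (uncomment to query the actual service):
# with urlopen(citation_count_url("10.1186/1756-8722-6-59")) as resp:
#     print(parse_count(resp.read()))

print(citation_count_url("10.1186/1756-8722-6-59"))
print(parse_count('[{"count": "42"}]'))  # → 42
```

Because the data is CC0, the same counts can be fetched without any API key, which is the practical payoff of the I4OC push for unrestricted citation data.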
A Standardized Citation Metrics Author Database Annotated for Scientific Field
Citation metrics are widely used and misused. This Community Page article presents a publicly available database that provides standardized information on multiple citation indicators and a composite thereof, annotating each author according to his/her main scientific field(s).
OpenCitations is a scholarly infrastructure organization dedicated to open scholarship and the publication of open bibliographic and citation data as Linked Open Data using Semantic Web technologies, to the development of software tools and services that enable convenient access to these open data, and to community advocacy for open citations. This paper describes OpenCitations and its datasets, tools, services and activities.
The results of this study strongly suggest that when male and female authors publish articles that are comparably positioned to receive citations, their publications do in fact accrue citations at the same rate. This raises the question: Why would gender matter “everywhere but here”?
Why (almost) Everything We Know About Citations is Wrong: Evidence from Authors
Although citations and related metrics like the H-index are widely used in academia to evaluate research and allocate resources, the referencing decisions on which they are based are poorly understood. This paper investigates whether authors reference works that influenced them most or those they believe the readers will value most.