Empathy and Grit - Not Just Publication Records - Should Be Considered in Researcher Assessment
Critics of current methods for evaluating researchers’ work say a system that relies on bibliometric parameters favours a ‘quantity over quality’ approach, and undervalues achievements such as social impact and leadership.
What's Wrong with the H-Index, According to Its Inventor
Love it or hate it, the H-index has become one of the most widely used metrics in academia for measuring the productivity and impact of researchers. But when Jorge Hirsch proposed it as an objective measure of scientific achievement in 2005, he didn’t think it would be used outside theoretical physics.
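The metric itself is simple: a researcher has index h if h of their papers each have at least h citations. A minimal sketch of that definition (the function name and sample data are illustrative, not from any particular library):

```python
def h_index(citations):
    """Return the h-index of a publication record.

    citations: iterable of per-paper citation counts.
    The h-index is the largest h such that h papers each
    have at least h citations.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times give h = 4,
# because four papers have at least four citations each.
```

The sort-and-scan form mirrors how Hirsch originally described the index: rank papers by citation count and find where rank overtakes citations.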
Metrics of Inequality: The Concentration of Resources in the U.S. Biomedical Elite
Academic scientists and research institutes are increasingly being evaluated using digital metrics, from bibliometrics to patent counts. These metrics are often framed by science policy analysts, economists of science, and funding agencies as objective and universal proxies for scientific worth, potential, and productivity.
Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based changes are necessary to ensure that the data yields an accurate picture. One problematic area is the handling of self-citations.
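To make the self-citation issue concrete, here is a toy model (the data shape and names are hypothetical, not any real database's API) showing how much a raw citation count can shrink once self-citations are excluded:

```python
def split_self_citations(citations, author):
    """Toy model of self-citation handling.

    citations: list of sets, each set holding the names of the
    authors of one citing paper.
    A citation counts as a self-citation if `author` appears
    among the citing authors.
    Returns (total_count, external_count).
    """
    total = len(citations)
    external = sum(1 for citers in citations if author not in citers)
    return total, external

# Three citing papers, two of which include the cited author:
# the headline count is 3, but only 1 citation is external.
```

Whether a transparent metric should report the total, the external count, or both is exactly the kind of policy choice the article argues needs open debate.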
Proposed Schema Changes - Have Your Say - Crossref
The first version of our metadata input schema (a DTD, to be specific) was created in 1999 to capture basic bibliographic information and facilitate matching DOIs to citations. Over the past 20 years the bibliographic metadata we collect has deepened, and we've expanded our schema to include funding information, license, updates, relations, and other metadata. Our schema isn't as venerable as a MARC record or as comprehensive as JATS, but it's served us well.
A Turning Point is a Time for Reflection - Crossref
Crossref strives for balance. Different people have always wanted different things from us and, since our founding, we have brought together diverse organizations to have discussions, sometimes contentious ones, to agree on how to help make scholarly communications better. Being inclusive can mean slow progress, but we've been able to advance by being flexible, fair, and forward-thinking. We have been helped by the fact that Crossref's founding organizations defined a clear purpose in our original certificate of incorporation, which reads:
A Standardized Citation Metrics Author Database Annotated for Scientific Field
Citation metrics are widely used and misused. This Community Page article presents a publicly available database that provides standardized information on multiple citation indicators and a composite thereof, annotating each author according to his/her main scientific field(s).
Citation Gaming Induced by Bibliometric Evaluation: A Country-level Comparative Analysis
The article proposes a new inwardness indicator designed to gauge the degree of scientific self-referentiality of a country. A comparative analysis of trends for the G10 countries over the years 2000-2016 reveals a net increase in Italian inwardness.
A Call for Funders to Ban Institutions That Use Grant Capture Targets
Grant capture is often used as a formal metric for academic evaluation. The author argues that this practice has led to perverse incentives for researchers and institutions and that research funders have both a responsibility and a significant interest in using their influence to halt this practice.
This paper analyses usage statistics, citation data, and altmetrics from a university press publishing open access monographs. Despite the small sample, the data suggest that authors can substantially influence how their book is discovered by its readership.
How Journals and Publishers Can Help to Reform Research Assessment
It is well established that administrators and decision-makers use journal prestige and impact factors as a shortcut to assess research. But recognizing the problem is not enough; identifying specific approaches that publishers can take to address these concerns is key.
The "Impact" of the Journal Impact Factor in the Review, Tenure, and Promotion Process
The Journal Impact Factor has been widely critiqued as a measure of individual academic performance. However, it is unclear whether these criticisms and high-profile declarations such as DORA have led to significant cultural change.
The Times Higher Education University Impact Rankings assess universities against the United Nations' Sustainable Development Goals. Calibrated indicators are used to provide comprehensive and balanced comparisons across three broad areas: research, outreach, and stewardship. This first edition includes more than 450 universities from 76 countries.