Proposed Schema Changes - Have Your Say - Crossref
The first version of our metadata input schema (a DTD, to be specific) was created in 1999 to capture basic bibliographic information and facilitate matching DOIs to citations. Over the past 20 years the bibliographic metadata we collect has deepened, and we've expanded our schema to include funding information, licenses, updates, relations, and other metadata. Our schema isn't as venerable as a MARC record or as comprehensive as JATS, but it's served us well.
A Turning Point is a Time for Reflection - Crossref
Crossref strives for balance. Different people have always wanted different things from us and, since our founding, we have brought together diverse organizations to have discussions, sometimes contentious, to agree on how to help make scholarly communications better. Being inclusive can mean slow progress, but we've been able to advance by being flexible, fair, and forward-thinking. We have been helped by the fact that Crossref's founding organizations defined a clear purpose in our original certificate of incorporation, which reads:
A Standardized Citation Metrics Author Database Annotated for Scientific Field
Citation metrics are widely used and misused. This Community Page article presents a publicly available database that provides standardized information on multiple citation indicators and a composite thereof, annotating each author according to his/her main scientific field(s).
Citation Gaming Induced by Bibliometric Evaluation: A Country-level Comparative Analysis
The article proposes a new inwardness indicator able to gauge the degree of scientific self-referentiality of a country. A comparative analysis of the trends for the G10 countries in the years 2000-2016 reveals a net increase in Italian inwardness.
A Call for Funders to Ban Institutions That Use Grant Capture Targets
Grant capture is often used as a formal metric for academic evaluation. The author argues that this practice has led to perverse incentives for researchers and institutions and that research funders have both a responsibility and a significant interest in using their influence to halt this practice.
This paper analyses usage statistics, citation data, and altmetrics from a university press publishing open access monographs. Despite the small sample, the data suggest that authors can substantially influence how readers discover their books.
How Journals and Publishers Can Help to Reform Research Assessment
It is well established that administrators and decision-makers use journal prestige and impact factors as a shortcut to assess research. But it is not enough to recognize the problem: identifying specific approaches that publishers can take to address these concerns is key.
The "Impact" of the Journal Impact Factor in the Review, Tenure, and Promotion Process
The Journal Impact Factor has been widely critiqued as a measure of individual academic performance. However, it is unclear whether these criticisms and high-profile declarations, such as DORA, have led to significant cultural change.
The Times Higher Education University Impact Rankings assess universities against the United Nations' Sustainable Development Goals. Calibrated indicators are used to provide comprehensive and balanced comparisons across three broad areas: research, outreach, and stewardship. This first edition includes more than 450 universities from 76 countries.
How to Shine in Indonesian Science? Game the System
Indonesian researchers have inflated their Science and Technology Index (SINTA) scores by publishing large numbers of papers in low-quality journals, citing their own work excessively, or forming networks of scientists who cite each other.
Elsevier Acquires Science-Metrix Inc., Provider of Research Analytics Services and Data
Elsevier, the information analytics business specializing in science and health, has acquired Science-Metrix Inc., a research evaluation firm that provides research evaluation and analytics services for assessing science and technology activities.
Reference Implementation for Open Scientometric Indicators
Within the project "Reference Implementation for Open Scientometric Indicators" (ROSI), new assessments and visualizations of conventional and alternative metrics (altmetrics) will be developed, and their effect on researchers will be investigated. For this purpose, a reference implementation based on the open-source research information system VIVO will be developed, in which various metrics are combined with data from different openly licensed sources. To establish the requirements of the target groups, surveys will be conducted to investigate the effect of scientometric indicators on scientists and their expectations regarding those indicators. The objectives of the project are, first, to evaluate the scientometric needs and concerns of the target groups, and, second, to implement a usable reference toolset that reflects the results of the study and enables transparent, license-free, flexibly adaptable analysis of the output of researchers, contributors, and organisations.
Why (almost) Everything We Know About Citations is Wrong: Evidence from Authors
Although citations and related metrics like the H-index are widely used in academia to evaluate research and allocate resources, the referencing decisions on which they are based are poorly understood. This paper investigates whether authors reference works that influenced them most or those they believe the readers will value most.
An Index to Quantify an Individual's Scientific Leadership
The h-index has gained wide acceptance as a bibliometric indicator of individual scientific achievement. In this paper, J. E. Hirsch proposes replacing the h-index with an alternative, the h-alpha index, to address at least some of its deficiencies.