Open Science Has Spawned a New Wave of Metric-Driven Evaluation
Measures intended to encourage openness are clashing with efforts to reform assessment
eLife Won't Get an Impact Factor, Says Clarivate
Clarivate has decided to continue indexing some content from eLife in Web of Science.
Applying Quantified Indicators in Central Asian Science: Can Metrics Improve the Regional Research Performance? - Scientometrics
This study discusses the implications of research metrics as applied to transition countries, using the ten principles of the Leiden Manifesto as a framework. These principles can guide Central Asian policymakers in creating systems for a more objective evaluation of research performance based on globally recognized indicators.
Bibliometrics at Large - The Role of Metrics Beyond Academia
The role of bibliometrics, such as impact factors and h-indices, in shaping research has been well documented. However, what function do these measures have beyond the institutional contexts in which, for better or worse, they were designed?
Research Evaluation Needs to Change with the Times
The focus on a narrow set of metrics leads to a lack of diversity in the types of leader and institution that win funding.
Mapping the Impact of Papers on Various Status Groups in Excellencemapping.net: a New Release of the Excellence Mapping Tool
Over the past five years, Bornmann, Stefaner, de Moya Anegón, and Mutz (2014b, 2014c, 2015) have published several releases of the www.excellencemapping.net tool revealing (clusters of) excellent institutions worldwide based on citation data. With the new release, a completely revised tool has been published. It is based not only on citation data (bibliometrics) but also on Mendeley data (altmetrics); the tool's institutional impact measurement has thus been expanded to cover additional status groups besides researchers, such as students and librarians. Furthermore, the visualization of the data has been completely updated, improving operability for the user and adding new features such as institutional profile pages. In this paper, we describe the datasets and indicators used in the current excellencemapping.net tool, explain the underlying statistics, and show how the web application is used.
Imaginary Carrot or Effective Fertiliser? A Rejoinder on Funding and Productivity
The question of whether and to what extent research funding enables researchers to be more productive is a crucial one. In their recent work, Mariethoz et al. (Scientometrics, 2021. https://doi.org/10.1007/s11192-020-03855-1 ) claim that there is no significant relationship between project-based research funding and bibliometric productivity measures and conclude that this is the result of inappropriate allocation mechanisms. In this rejoinder, we argue that such claims are not supported by the data and analyses reported in the article.
Unpacking The Altmetric Black Box
Article Attention Scores for papers don't seem to add up, leading one to question whether Altmetric data are valid, reliable, and reproducible.
Do Researchers Know What the H-index Is? And How Do They Estimate Its Importance? - Scientometrics
In this article, we pursue two goals: first, to collect empirical data on researchers' personal estimations of the importance of the h-index, both for themselves and for their academic disciplines; and second, to assess researchers' concrete knowledge of the h-index and how it is calculated.
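For readers checking their own knowledge against the survey's question: the h-index has a simple definition, namely the largest h such that h of a researcher's papers have at least h citations each. A minimal sketch in Python (function and variable names are illustrative, not from the article):

```python
def h_index(citations: list[int]) -> int:
    """Return the largest h such that h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 3, 0]))  # -> 3
```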
Journal Impact Factor Gets a Sibling That Adjusts for Scientific Field
But critics worry the metrics remain prone to misuse.
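The sibling metric (Clarivate's Journal Citation Indicator) rests on field normalization: each paper's citation count is divided by the average for comparable papers in the same field and year, and the journal's score is the mean of those ratios, so 1.0 means "world average for the field". A hedged sketch of that normalization step in Python (the data layout and journal names are invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (journal, field, year, citations).
papers = [
    ("J. Foo", "immunology", 2020, 12),
    ("J. Foo", "immunology", 2020, 4),
    ("J. Bar", "history", 2020, 3),
    ("J. Bar", "history", 2020, 1),
]

# 1. Expected citations per (field, year) cohort.
cohorts = defaultdict(list)
for _journal, field, year, cites in papers:
    cohorts[(field, year)].append(cites)
expected = {key: mean(vals) for key, vals in cohorts.items()}

# 2. Journal score = mean of per-paper normalized citation ratios.
ratios = defaultdict(list)
for journal, field, year, cites in papers:
    ratios[journal].append(cites / expected[(field, year)])
for journal, vals in ratios.items():
    print(journal, round(mean(vals), 2))  # 1.0 == field-average impact
```

In this toy example both journals score 1.0 even though their raw citation counts differ fourfold, which is exactly the point of field adjustment: a history journal is compared with history, not with immunology.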
Introducing a Novelty Indicator for Scientific Research: Validating the Knowledge-Based Combinatorial Approach
In this study, the authors apply a novelty indicator that quantifies the degree of citation similarity between a focal paper and pre-existing same-domain papers across fields in the natural sciences, proposing a new way of identifying same-domain papers using bibliometric data only.
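The core quantity here is citation similarity: how much two papers' reference lists overlap. One common way to operationalize it (a generic sketch under that assumption, not necessarily the authors' exact formula) is the Jaccard index of the two reference sets; a focal paper whose references overlap little with any earlier same-domain paper then counts as more novel:

```python
def jaccard(refs_a: set[str], refs_b: set[str]) -> float:
    """Overlap of two reference lists: |A & B| / |A | B|."""
    if not refs_a and not refs_b:
        return 0.0
    return len(refs_a & refs_b) / len(refs_a | refs_b)

# Novelty as 1 minus the maximum similarity to any earlier
# same-domain paper (an illustrative aggregation, not the
# paper's published indicator).
def novelty(focal_refs: set[str], prior_refs: list[set[str]]) -> float:
    if not prior_refs:
        return 1.0
    return 1.0 - max(jaccard(focal_refs, refs) for refs in prior_refs)

print(novelty({"r1", "r2", "r3"}, [{"r1", "r2"}, {"r9"}]))  # -> 0.33...
```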
The Matthew Effect Impacts Science and Academic Publishing by Preferentially Amplifying Citations, Metrics and Status
The Matthew Effect, which breeds success from success, may rely on standing on the shoulders of others, citation bias, or the efforts of a collaborative network. Prestige attracts resources, which in turn feed prestige, amplifying advantage and rewards and ultimately skewing recognition.
Scientometric Data and OA Publication Policies of Clinical Allergy and Immunology Journals
The scientific merit of a paper and its ability to reach broader audiences are essential for scientific impact. Scientific merit is therefore measured with scientometric indexes, and journals are increasingly publishing papers as open access (OA).
The Official PLOS Blog: A Farewell to ALM, but Not to Article-level Metrics!
PLOS partners with Altmetric.
ISSI Paper of the Year Award
The International Society for Scientometrics and Informetrics (ISSI) is an international association of scholars and professionals active in the interdisciplinary study of the science of science, science communication, and science policy.
Quantitative Quality: a Study on How Performance-based Measures May Change the Publication Patterns of Danish Researchers
Nations the world over are increasingly turning to quantitative performance-based metrics to evaluate the quality of research outputs, as these metrics are abundant and provide an easy way to rank research. In 2010, the Danish Ministry of Science and Higher Education followed this trend and began portioning out a percentage of the available research funding according to how many research outputs each Danish university produces. Not all research outputs are eligible: only those published in a curated list of academic journals and publishers, the so-called BFI list, are included. The BFI list is ranked, which may create incentives for academic authors to target certain publication outlets or publication types over others. In this study we examine the potential effect these relatively new research evaluation methods have had on the publication patterns of researchers in Denmark. The study finds that publication behaviors in the Natural Sciences & Technology and in the Social Sciences and Humanities (SSH) have changed, while the Health Sciences appear unaffected. Researchers in the Natural Sciences & Technology appear to focus on high-impact journals that reap more BFI points. While researchers in SSH have also increased their focus on the impact of the publication outlet, they appear to have altered their preferred publication types as well, publishing more journal articles in the Social Sciences and more anthologies in the Humanities.
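The incentive mechanism the study describes is easy to make concrete: each publication earns points according to its type and the level of its outlet on the BFI list, and the funding pool is split among universities in proportion to their point totals. A sketch with made-up point weights (the real BFI weights differ and have changed over time):

```python
# Hypothetical BFI-style point weights by (publication type, outlet level);
# the actual list's weights are assumptions here, not the official values.
POINTS = {
    ("article", 1): 1.0,
    ("article", 2): 3.0,
    ("monograph", 1): 5.0,
    ("monograph", 2): 8.0,
}

def funding_shares(outputs: dict[str, list[tuple[str, int]]],
                   pool: float) -> dict[str, float]:
    """Split a funding pool in proportion to each university's points."""
    totals = {
        uni: sum(POINTS[pub] for pub in pubs)
        for uni, pubs in outputs.items()
    }
    grand_total = sum(totals.values())
    return {uni: pool * pts / grand_total for uni, pts in totals.items()}

# Two level-2 articles outscore five level-1 articles under these weights,
# which is precisely the incentive to target higher-ranked outlets.
print(funding_shares(
    {"Uni A": [("article", 2), ("article", 2)],
     "Uni B": [("article", 1)] * 5},
    pool=1_000_000.0,
))
```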
Bibliometrics in Academic Recruitment: A Screening Tool Rather Than a Game Changer
The paper concludes that metrics were applied chiefly as a screening tool to decrease the number of eligible candidates, not as a replacement for peer review.
Interpreting Bibliometric Data
The article discusses how the interpretation of 'performance' drawn from accurate but summary bibliometrics can change when the same dataset is iteratively deconstructed and visualized.
The Altmetric Top 100 - 2020
What research caught the public imagination in 2020? Check out Altmetric's annual list of the papers that received the most attention.
Google Scholar, Web of Science, and Scopus: A Systematic Comparison of Citations in 252 Subject Categories
Despite Becoming Increasingly Institutionalised, There Remains a Lack of Discourse About Research Metrics Among Much of Academia
The active use of metrics in everyday research activities suggests academics have accepted them as standards of evaluation, that they are “thinking with indicators”. Yet when asked, many academics profess concern about the limitations of evaluative metrics and the extent of their use.
Boycott the Journal Rankings
Journal rankings are a rigged game. The blacklisting of history of economic thought journals is neither a fluke nor a conspiracy - it exposes how citation rankings really work.
Science, Research and Innovation Performance of the EU
"Science, research and innovation performance of the EU, 2018" (SRIP) analyses Europe’s performance dynamics in science, research and innovation and its drivers, in a global context.
Quantity Does Matter as Citation Impact Increases with Productivity
High-Impact and Transformative Science Metrics: Definition, Exemplification, and Comparison
This paper presents a novel set of text- and citation-based metrics that can be used to identify high-impact and transformative works. The 11 metrics can be grouped into seven types: Radical-Generative, Radical-Destructive, Risky, Multidisciplinary, Wide Impact, Growing Impact, and Impact (overall).
Elsevier Becomes Newest Customer of Unpaywall Data Feed
Elsevier has become the newest customer of Impactstory's Unpaywall Data Feed, which provides a weekly feed of changes in Unpaywall, its open database of 20 million open access articles.
There Is an Absence of Scientific Authority over Research Assessment as a Professional Practice, Leaving a Gap That Has Been Filled by Database Providers
To what extent does the academic research field of evaluative citation analysis confer legitimacy to research assessment as a professional practice?