EUA Publishes Roadmap on Research Assessment
Supporting the EUA membership with the development of research assessment approaches that focus on research quality, potential and future impact, and that take into account Open Science practices.
A study asking whether Twitter allows scientists to promote their findings primarily to other scientists ("inreach"), or whether it can help them reach broader, non-scientific audiences ("outreach"). The results should encourage scientists to invest in building a social media presence for scientific outreach.
In the 1990s, the Internet offered a horizon from which to imagine what society could become, promising autonomy and self-organization next to redistribution of wealth and collectivized means of production. While the former was in line with the dominant ideology of freedom, the latter ran contrary to the expanding enclosures in capitalist globalization.
Although the Journal Impact Factor (JIF) is widely acknowledged to be a poor indicator of the quality of individual papers, it is used routinely to evaluate research and researchers. Here, we present a simple method for generating the citation distributions that underlie JIFs. Application of this straightforward protocol reveals the full extent of the skew of these distributions and the variation in citations received by published papers that is characteristic of all scientific journals. Although there are differences among journals across the spectrum of JIFs, the citation distributions overlap extensively, demonstrating that the citation performance of individual papers cannot be inferred from the JIF. We propose that this methodology be adopted by all journals as a move to greater transparency, one that should help to refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment.
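The core of the argument above is that a JIF is essentially a mean over a highly skewed citation distribution, so it says little about any individual paper. A minimal sketch of that point, using invented citation counts (all numbers below are made up for illustration, not drawn from the study):

```python
from collections import Counter
from statistics import mean, median

# Invented per-paper citation counts for one journal's citable items
# over a two-year window. The skew is typical: many lowly cited
# papers, a few highly cited ones.
citations = [0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 7, 9, 15, 48]

jif_like = mean(citations)   # a JIF is essentially this mean
med = median(citations)      # the median better reflects a typical paper

# The full citation distribution the authors propose journals publish:
# number of citations -> number of papers receiving that many.
distribution = Counter(citations)

print(f"mean (JIF-like): {jif_like:.2f}, median: {med}")
print(sorted(distribution.items()))
```

With these numbers the mean is roughly 6.7 while the median is 3: one highly cited outlier more than doubles the journal-level indicator without telling us anything about the other papers, which is exactly why publishing the full distribution is more informative than the single JIF figure.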
Using version control and continuous integration to create a modern data management system.
Provided that adequate security safeguards were in place, most participants were willing to share their data for a wide range of uses.
Persistent identifiers (PIDs) provide unique keys for people, places, and things, which supports the research process by facilitating search, discovery, recognition, and collaboration. This article reviews the main PIDs used in research (DOIs, ORCIDs, ...), as well as demonstrating how they are being used, and how, in combination, they can increase trust in research and the research infrastructure.
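In practice, a PID works as a unique key that can be turned into a resolvable URL via a standard public resolver (doi.org for DOIs, orcid.org for ORCID iDs). A minimal sketch of that mechanism; the DOI below is a made-up example, while the ORCID is the scheme's well-known demonstration iD:

```python
# Map a PID scheme to its standard public resolver and build the
# resolvable URL for a given identifier.
def pid_url(scheme: str, identifier: str) -> str:
    resolvers = {
        "doi": "https://doi.org/",      # DOI proxy resolver
        "orcid": "https://orcid.org/",  # ORCID iD landing pages
    }
    return resolvers[scheme] + identifier

print(pid_url("doi", "10.1234/example.5678"))   # made-up DOI for illustration
print(pid_url("orcid", "0000-0002-1825-0097"))  # ORCID's demo iD
```

Because every agent in the research ecosystem can resolve the same key to the same record, PIDs make the links between people, outputs, and organizations machine-checkable, which is the basis of the trust argument in the article.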
European Commission data and case studies covering access to scientific publications. Bibliometric data, as well as data on the policies of journals and funders, are available.
A growing gap exists between an academic sector with little capacity for collective action and increasing demand for routine performance assessment by research organizations and funding agencies. This gap has been filled by database providers. By selecting and distributing research metrics, these commercial providers have gained a powerful role in defining de-facto standards of research excellence without being challenged by expert authority.
A "significant number" of fraud cases involving research funds and academia have been uncovered in recent years, including professional exchanges which never actually took place, or projects that never came to fruition.
As a major funder of this report, NSF emphasizes its commitment to a more inclusive STEM culture and climate - one free of harassment.
Study says editors of major political science journals demonstrate no systematic bias against female authors. Yet women authors remain underrepresented in the field. Why?
Science Europe recognises the proposed budget increase for the period 2021-2027 but remains disappointed that this increase does not live up to the ambitious scenarios that the European Commission defended in front of the Heads of State and Government in March 2018.
Detailed recommendations and specific actions for different stakeholders for making FAIR data a reality.
Current bibliometric incentives discourage innovative studies and encourage scientists to shift their research to subjects already actively investigated.
A randomized experiment of NIH R01 grant reviews finds no evidence that White male PIs receive evaluations that are any better than those of PIs from the other social categories.
Shorter deadlines, email reminders, and cash incentives can speed up the peer review process and minimize unintended effects, a recent study suggests. Can it work for other disciplines?
A discussion of how trust in expertise is placed or refused, highlighting the affective dimension of epistemic trust, and discussing the danger of a 'context collapse' in digital communication.
LERU's paper discussing the eight pillars of Open Science identified by the European Commission: the future of scholarly publishing, FAIR data, the European Open Science Cloud, education and skills, rewards and incentives, next-generation metrics, research integrity, and citizen science.
A study identifies papers that stand the test of time. Fewer than two out of every 10,000 scientific papers remain influential in their field decades after publication, finds an analysis of five million articles published between 1980 and 1990.
The role of faculty hiring networks in shaping the spread of ideas in computer science, and the importance of where in the network an idea originates: research from prestigious institutions spreads more quickly and completely than work of similar quality originating from less prestigious institutions.
When citation-based indicators are applied at the institutional or departmental level, rather than at the level of individual papers, surprisingly large correlations with peer review judgments can be observed.
In a controlled experiment with two disjoint program committees, the ACM International Conference on Web Search and Data Mining (WSDM'17) found that reviewers with author information were 1.76x more likely to recommend acceptance of papers from famous authors, and 1.67x more likely to recommend acceptance of papers from top institutions.
Case report looking at two approaches taken by the Central Library of Forschungszentrum Jülich in 2017.
Scientists are more efficient at producing high-quality research when they have more academic freedom, according to a recent study of 18 economically advanced countries. Researchers in the Netherlands are the most efficient of all. The existence of a national evaluation system that is not tied to funding was also associated with efficiency.
Preprint showing that ethnic diversity consistently leads to higher scientific impact.