Doctors and Postdocs in Political Science in Switzerland. A Study Conducted by the Swiss Political Science Association.
This report presents the results of a survey conducted in spring 2019 among all people who received a PhD in political science from a Swiss university during the last eleven years (2008 to 2018), as well as among postdocs working at a Swiss university in June 2019. The survey thus sheds light on the experiences and career paths both of current postdocs and of doctors in political science who have left academia. Moreover, it compares the postdoc results with those of a similar study carried out in 2012.
Citations Systematically Misrepresent the Quality and Impact of Research Articles: Survey and Experimental Evidence from Thousands of Citers
Citations are ubiquitous in evaluating research, but how exactly they relate to what they are thought to measure is unclear. This article investigates the relationships between citations, quality, and impact using a survey with an embedded experiment.
Many of the words used by scientists when reviewing manuscripts, job candidates and grant applications - words such as incremental, novelty, mechanism, descriptive and impact - have lost their meaning.
Historical Comparison of Gender Inequality in Scientific Careers Across Countries and Disciplines
A study suggests that gender differences in productivity and impact are explained by differences in publishing career lengths and dropout rates. This inequality in academic publishing has important consequences for institutions and policy makers.
No Raw Data, No Science: Another Possible Source of the Reproducibility Crisis
Inappropriate practices of science have been suggested as causes of irreproducibility. This editorial proposes that a lack of raw data or data fabrication is another possible cause of irreproducibility.
This paper presents a simple model of the lifecycle of scientific ideas that points to changes in scientists' incentives as the cause of scientific stagnation. It explores ways to broaden how scientific productivity is measured and rewarded: academic search engines such as Google Scholar could measure which contributions explore newer ideas, and university administrators and funding agencies could use these new metrics in research evaluation.
Comparing Meta-analyses and Preregistered Multiple-laboratory Replication Projects
Kvarven, Strømland and Johannesson compare meta-analyses to multiple-laboratory replication projects and find that meta-analyses overestimate effect sizes by a factor of almost three. Commonly used methods of adjusting for publication bias do not substantively improve results.
What Are Fake Interdisciplinary Collaborations and Why Do They Occur?
Scientists influenced by funding priorities promoted by regional, national and transnational funding bodies, as well as by the academic mania for 'interdisciplinarity', feel compelled to devise an ostensibly interdisciplinary research topic and to organize their research collaboratively.
The Acceptability of Using a Lottery to Allocate Research Funding: a Survey of Applicants
The Health Research Council of New Zealand is the first major government funding agency to use a lottery to allocate research funding, for its Explorer Grant scheme. A recent survey examines how well applicants accept this measure.
Are Altmetrics Able to Measure Societal Impact in a Similar Way to Peer Review?
Altmetrics have become an increasingly ubiquitous part of scholarly communication, although the value they indicate is contested. A recent study examined the relationship of peer review, altmetrics, and bibliometric analyses with societal and academic impact. Drawing on evidence from REF2014 submissions, it argues altmetrics may provide evidence for wider non-academic debates, but correlate poorly with peer review assessments of societal impact.
Great strides have been made to encourage researchers to archive the data created by research and to provide the systems needed to store them. It is also recognised that data are meaningless unless their provenance is preserved through appropriate metadata. Alongside this is a pressing need to ensure the quality and archiving of the software that generates data, through simulation, experimental control or data collection, and of the software that analyses, modifies and draws value from raw data.
Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based changes are necessary to ensure that the data yield an accurate picture. One particularly problematic area is the handling of self-citations.
Atlas of Open Science and Research in Finland 2019 Published
This evaluation of Finnish research organisations, research-funding organisations, academic and cultural institutes abroad, and learned societies and academies examines the key indicators chosen to assess performance on openness. These indicators provide insights into the competences and capacity of the research system to support progress towards openness. Barriers and development needs are discussed, along with suggestions for improvement.