Rewarding reviewers - sense or sensibility?
Respondents value recognition initiatives, such as receiving feedback from the journal, over monetary rewards and payment in kind.
Authors tend to attribute manuscript acceptance to their own ability to write quality papers and simultaneously to blame rejections on negative bias in peer review, displaying a self-serving attributional bias.
This is a proposal for a system for evaluation of the quality of scientific papers by open review of the papers through a platform inspired by StackExchange.
Participating in open and signed post-publication peer review may not be so bad for your career after all.
Starting in January 2016, Nature Communications will publish peer reviews alongside the papers they evaluate.
We have little or no evidence that peer review 'works,' but we have lots of evidence of its downside.
Scientific publishers are forging links with an organization that wants scientists to scribble comments over online research papers.
Peer review is neither reliable, fair, nor a valid basis for predicting 'impact': as quality control, peer review is not fit for purpose.
Interview with Dr. Michael Lauer on peer review of NIH grant applications and how it can be improved.
Breuning et al. include some tips for avoiding reviewer fatigue.
White paper showing that the vast majority of authors believe that blind peer review helps to minimize discrimination.
Building on the momentum of Peer Review Week 2015, we are excited to announce a partnership with ORCID to extend the credit you get from your Publons verified reviews.
BMC editors show that the quality of peer review is slightly higher in BMC Infectious Diseases, which operates open peer review, than in BMC Microbiology, which operates single-blind peer review.
To me, volunteering your time means forgoing payment for your time. But how is this affected when someone else is cashing in on your time instead?
An initiative to incentivise open research practices through peer review.
Peer review is often claimed to be the guarantor of the trustworthiness of scientific papers, but it is a troubled process. Preprints offer a way out.
Have you recently written a paper, but you're not sure to which journal you should submit it? Or maybe you want to find relevant articles to cite in your paper? Or are you an editor, and do you need to find reviewers for a particular paper? Jane can help!
The involvement of online discussion sites in the identification of errors, anomalies and worse in the published literature continues to demonstrate the usefulness of post-publication review. It also highlights the ambiguous power of anonymity.
PEERE is a project funded by the European Union to explore issues around journal and grant peer review, running from 2014 to 2018.
Simplified processes save time and money that could be reallocated to actual research. Funding agencies should consider streamlining their application processes.
Five reviewers per application represents a practical optimum which avoids large random effects evident when fewer reviewers are used.
"Classical peer review" has been subject to intense criticism for slowing down the publication process, bias against specific categories of paper and author, unreliability, inability to detect errors and fraud, unethical practices, and the lack of recognition for unpaid reviewers. This paper surveys innovative forms of peer review that attempt to address these issues.
"Retrospective analyses of the correlation between percentile scores from peer review and bibliometric indices of the publications resulting from funded grant applications are not valid tests of the predictive validity of peer review at the NIH."
A process at the heart of science is based on faith rather than evidence, says Richard Smith, former editor of the BMJ and chief executive of the BMJ Publishing Group from 1991 to 2004.
Movement to publicly record peer-reviewing activity gains momentum.
eLife has partnered with Publons to help reviewers receive recognition for their work.
This editorial describes the problems with the process of preparing and publishing research findings, and with judging their veracity and significance, and then explains how we at Faculty of 1000 are starting to tackle the ‘deadly sins’ of science publishing.
How do reviewer recommendations influence editor decisions? And are Chinese authors treated fairly?