Go forth and replicate!
To make replication studies more useful, researchers must make more of them, funders must encourage them and journals must publish them.
The replication crisis in science is largely attributable to a mismatch in our expectations of how often findings should replicate and how difficult it is to actually discover true findings in certain fields.
There is no perfect metric. There is no number or score that fully encapsulates the value, impact, or importance of a piece of research. While this statement might appear obvious, research evaluation and measurement are a fact of life for the scientific research community.
In an editorial in the 26 August issue of the journal Science, Jeremy Berg, the journal's 20th editor-in-chief, examines the importance of funding science steadily, with predictable budget cycles that allow science-funding agencies to do the long-term planning that research projects typically require.
A LinkedIn co-founder, a Nobel laureate and more than 10 university presidents are among the high-profile speakers at Times Higher Education's flagship event.
The new director of the federal office that guards against misconduct in U.S.-funded biomedical research is aiming to shake things up—but is also encountering rough waters. Kathryn Partin, who took the helm of the Office of Research Integrity (ORI) in December 2015, has launched a top-to-bottom review of the office, which has been criticized for moving too slowly and meting out sanctions that lack teeth.
A time traveler from 1915 arriving in 1965 would have been astonished by the scientific theories and engineering technologies invented during that half century. One can only speculate, but it seems likely that few of the major advances that emerged during those 50 years were even remotely foreseeable in 1915.
Cold Spring Harbor Laboratory's free, not-for-profit preprint service bioRxiv has received generous additional financial support.
Science has become a lot bigger and faster. Join us now to make it better at the congress on 26/27 January 2017 in Berne.
Peer review is a thankless task, but journals have been experimenting with accolades and cash incentives for scientists who serve as peer reviewers.
Metrics derived from Twitter and other social media are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown.
Research creates its own problems. Articles may be withdrawn because of irregularities, results can be impossible to reproduce, methods are often non-standardised, and publications may not be accessible. The search is now on for solutions.
Many people see privately funded research as a threat to academic independence, but this is an incomplete view. Experts with close connections to politics and business are a logical consequence of a knowledge-based society. It is time for a fundamental debate on responsible research partnerships.
A few hours ago, 50 months after Elsevier submitted a patent application for an "Online peer review system and method," the patent was awarded to the company.
A contribution to the Open Innovation, Open Science, Open to the World agenda, 2016.
One of the pioneers in developing fluorescent proteins for biological studies has died at the age of 64.
In France, the final text of a new law on Open Access was adopted on June 29, 2016.
The giant journal company said it was merely protecting its own proprietary system. But a wave of critics on social media said they were suspicious of its motives.
In this Perspective, Thomas C. Südhof describes some of the current challenges to the peer review system that have endangered public acceptance of science and discusses possible avenues to addressing these challenges.
How do we ensure the effective role of science in public policy-making? This well-worn, long-standing question reflects the fact that the answer is not simple. Later this month in Brussels, scientists and policy-makers will convene at the International Network for Government Science Advice (INGSA) Forum to consider the most promising ways forward.
Peer review is widely viewed as an essential step for ensuring scientific quality of a work and is a cornerstone of scholarly publishing. In this work we investigate the feasibility of a tool capable of generating fake reviews for a given scientific paper automatically.
This week, the first of 1500 researchers and support staff begin moving into the largest biomedical research building in Europe, the £650 million Francis Crick Institute in London.
A dataset that is the result of content mining 167,318 published psychology articles for statistical test results.
The web was built specifically to share research papers amongst scientists. Despite this being the first goal of the modern web, most research is still published behind a paywall. We have recently highlighted famous math papers that reside behind a paywall.