Launching a new venture with a debate on peer review

The big day has arrived: this evening about 60 people will gather in King's Cross, London, to launch my new company, Cofactor. Hopefully lots more will follow along online using the hashtag #PeerRevFactors, because this will not just be a launch, it will also be an evening of short talks and discussion about peer review. The theme is ‘What difference will changes in peer review make to authors and journals?’ and we have four great speakers:

I will also give a brief introduction to Cofactor and to the theme, and after the talks the audience of science, publishing and communications people will join in to discuss what they’ve heard. The talks will start at about 19:00 BST. I will post a summary of the event here afterwards, including a Storify of the tweets.

What is Cofactor?

Regular readers of this blog will know that I know quite a lot about journals. For a while I have been looking for the best way to use this knowledge to help researchers. The solution is a company offering editorial help, consultancy and workshops to researchers. It consists of me and a growing team of freelance editors and editorial consultants covering a wide range of scientific fields.

Having these expert editors to call on means that Cofactor can check and improve many more research papers than I could on my own. At the same time, clients still benefit from my expertise on every paper, as I check all the editing done by my freelancers. My time will also (hopefully!) be freed up to offer more specialised consultancy and to give workshops to groups of researchers.

I also hope to be able to get involved in more projects around scientific publishing, open science and so on. The most popular posts on this blog by far have been the surveys of journals with respect to their speed (of review and publication), impact metrics and charges (for open access and other things), so I will be doing updated surveys on these and other features of journals before long. One project that is already under way is an innovative Journal Selector tool, which will help researchers to choose a journal based on these kinds of factors.

Cofactor is offering several kinds of help with scientific papers: substantive editing, a quick check called the Cofactor Summary and an abstract check. We can also help researchers choose a journal, negotiate the peer review process or decide on a publishing strategy. And our workshops can help junior or more experienced researchers to understand the big changes in scientific publishing and how these affect them.

Do get in touch for help with publishing your papers, to book a workshop, or to talk about working for Cofactor.

What difference will changes in peer review make?

So, tonight’s theme is new forms of peer review and what difference they are making already and will make in the future.

What kinds of peer review are we talking about?

  • Open peer review
  • Post-publication peer review
  • Peer review that is independent of journals
  • Crowdsourced peer review
  • Innovative review processes involving discussion between reviewers and authors

Another relatively new kind of peer review is that practised by journals such as PLOS ONE and PeerJ (‘megajournals’), in which reviewers are asked to comment only on whether the science is sound and not whether the conclusions are interesting or significant.

My take is that anyone who writes scientific papers should start rethinking how they do this in the light of these changes. If your paper is reviewed in the open, everyone will be able to see the comments of the reviewers, and often the original submitted version too. So you can’t rely on reviewers or journal staff to quietly correct any errors. Unless you ensure errors are corrected before submission, they will be publicly visible when the paper is published.

If you think you can escape this public scrutiny by avoiding journals that have open review, think again. Services such as PubMed Commons and PubPeer are gaining in popularity, and papers that are seen to have major problems are being discussed at length in these and other forums.

So your best defence against criticism of your paper online is to ensure that it has no major errors when you first submit it to a journal. And the best way to do that is to get it checked by a professional editor before submission, someone with experience of editing journal papers before they are submitted and who knows the kinds of errors to look out for. And guess what: Cofactor has editors like this ready and waiting to check your paper!

Let’s get talking

So please do join the discussion today using the hashtag #PeerRevFactors, or in the comments here, and tell us what you think the effect of these new kinds of peer review will be. Have you commented on someone else’s paper or written a published review (I have)? Have you experienced open review or had comments on your paper after publication, and how did you feel about that? Have you changed the way you prepare your papers?

May highlights in scientific publishing

News gleaned from Twitter in May: debates about replication and data sharing, articles about peer review and more.

Replication

The debate about replication in science has been fired up by a special issue of the journal Social Psychology consisting entirely of replications (explained here by editor Chris Chambers, @Chrisdc77). One author of a study that was chosen for a replication attempt wrote about her difficulties with the experience. A lot of discussion later, I particularly liked Rolf Zwaan’s attempt to summarise both sides of the debate. He contrasts the view of ‘replicators’ that original research is a public good with that of ‘replication critics’ who seem to view it as a work of art.

A related debate concerns what happens when questions are raised about a paper and how the authors should react. Palaeontologist @JohnHutchinson posted a long and thoughtful consideration of this based on his experience with a 2011 paper on the growth rates of Tyrannosaurus, which led to a correction. He says that going over all the data again takes a huge amount of time and energy, but the process is what science is meant to be about. (via @Protohedgehog)

The attempts to replicate the STAP stem cell experiments (as covered here in March) seem to be coming to a head, and open access and open peer review have helped to resolve the issue. F1000 Research published a non-replication by @ProfessorKenLee that contained the full dataset, and the paper was then made available for open review. A couple of weeks later it had two positive peer reviews, which means that it is now indexed in PubMed. All authors of the original STAP study have now agreed to retract it.

Data sharing

The polar bear genome was published in Cell after the dataset was released by @Gigascience. This is a step forward for open data, as Cell Press have previously said they might treat the release of a dataset with a DOI as prior publication, which could preclude publication of a paper based on that data. (via @GrantDenkinson)

Dorothy Bishop (@deevybee) posted about her first experience of sharing data, describing it as exciting but scary. She discovered some errors in the process, and says “The best way to flush out … errors is to make the data public.”

In PLOS, Theo Bloom and Jennifer Lin summarised how the publisher’s new data sharing policy has gone down with authors. The short answer is ‘very well’, but there are still concerns, which the post lists and responds to.

In the meantime, the European Medicines Agency (EMA) has announced (see p8 of the linked pdf) that clinical trial data will be made available, but researchers and other interested parties will only be allowed to view the data on screen. Unbelievably, they will not be allowed to download it, print it, or do anything else but look at it. The German Institute for Quality and Efficiency in Health Care (@iqwig) published some reactions from researchers to this decision, which are well worth reading. (via @trished)

Peer review

I was alerted by editor Carlotta Shearson (@CShearson) to an editorial in the Journal of Physical Chemistry Letters entitled ‘Overcoming the Myths of the Review Process and Getting Your Paper Ready for Publication’. The process it describes is similar to what I’ve seen in many selective journals, so it will be useful to authors in many fields beyond physical chemistry. It also includes a table of the ‘Top Ten Unproductive Author Responses’ to reviewer comments.

Another journal editorial of interest was published in Administrative Science Quarterly, entitled ‘Why Do We Still Have Journals?’ This focuses more on social science and concludes that, for now, journals are still indispensable. (via @SciPubLab)

Miscellaneous news

A survey of Canadian journal authors was discussed by Phil Davis (@ScholarlyChickn) in the Scholarly Kitchen. Peer review, journal reputation and fast publication were the top three factors authors cited when deciding where to submit manuscripts, ahead of open access, article-level metrics and mobile access. (via @MikeTaylor)

Following the Freedom of Information requests by Tim Gowers on Elsevier subscription pricing covered last month, Australian mathematician Scott Morrison has found out a bit about pricing and contracts for Australian universities. This may lead to FOI requests there. In the meantime, Gowers has posted updates on four more UK universities. (via @yvonnenobis)

And I will be announcing my own news very soon (though you might already have heard about my new company on Twitter). Watch out for the next post!