Submission to first decision time

Having written previously about journal acceptance to publication times, I thought it was high time I looked at the other important interval that affects publication speed: submission to first decision time. As I explained in the previous post, the time from submission to publication in a peer-reviewed journal can be split into three phases: the two discussed in that post and this one, plus the time needed for the authors to revise, which the journal can’t control.

A survey of submission to first decision times

I have trawled through the instructions-to-authors pages of the journals in the MRC frequently used journal list, which I have used in several previous posts as a handy list of relatively high-impact and well-known biomedical journals. I’ve used the list as downloaded in 2012, so new journals may have been added to it since. I’ve omitted the review journals, which leaves 96.

From these pages I have tried to find any indication of the actual or intended speed to first decision for each journal. For many journals, no information was provided on the journal website about average or promised submission to first decision times. For example, no Nature Publishing Group, Lancet, Springer or Oxford University Press journals in this data set provide any information.

However, of these 96 journals 37 did provide usable information. I have put this information in a spreadsheet on my website.

Of these, 20 promised a first decision within 28 or 30 days of submission, and 12 others promised 20–25 days. Of the rest, two are particularly fast: Circulation Research (13 days in 2012) and Cellular Microbiology (14 days); and one is particularly slow: Molecular and Cellular Biology (4 to 6 weeks, though they may just be more cautious in their promises than other journals). JAMA and Genetics are also relatively slow, at 34 and 35 days respectively. (Note that the links here are to the page that states the time, which is generally the information for authors.)
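The raw numbers are all in the spreadsheet, so the tally above is easy to reproduce. Here is a minimal Python sketch; the file name and column names (journal, promised_days) are my own assumptions for illustration, not the spreadsheet’s actual layout, and ranges like ‘4 to 6 weeks’ would need normalising to a number of days first.

```python
# Tally how many journals promise each first-decision time.
# File name and column names are assumptions for illustration only.
import csv
from collections import Counter

promises = Counter()
with open("decision_times.csv", newline="") as f:
    for row in csv.DictReader(f):
        days = row["promised_days"].strip()
        if days:  # skip journals that gave no usable information
            promises[int(days)] += 1

for days, count in sorted(promises.items()):
    print(f"{days} days: {count} journal(s)")
```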

A few journals promise a particularly fast decision for selected (‘expedited’) papers, but I have only considered the speed promised for all papers here.

I conclude from this analysis that, for relatively high-impact biomedical journals, a first decision within a month of submission is the norm. Anything faster than 3 weeks is fast, and anything slower than 5 weeks is slow.
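Put as code, that rule of thumb looks something like this (the band boundaries are simply a direct transcription of the sentence above, not anything official):

```python
def first_decision_band(days):
    """Classify a promised submission-to-first-decision time, using the
    rule of thumb above for relatively high-impact biomedical journals."""
    if days < 21:    # faster than 3 weeks
        return "fast"
    if days > 35:    # slower than 5 weeks
        return "slow"
    return "normal"  # roughly within a month of submission

assert first_decision_band(13) == "fast"    # e.g. Circulation Research
assert first_decision_band(30) == "normal"
assert first_decision_band(42) == "slow"    # e.g. the top end of 4-6 weeks
```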

Newer journals

But what about the newer journals? PeerJ has recently been boasting on its blog about authors who are happy with their fast decision times. The decision times given in that post are 17, 18 and 19 days. These are not necessarily typical of all PeerJ authors, though, and are likely to be biased towards the shorter times: authors whose decisions took longer won’t have tweeted about it, and PeerJ won’t have included them in the post.
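A toy simulation shows how strong this selection effect can be. Everything in the sketch is invented (the distribution, the ‘tweet threshold’); it is not PeerJ data, just an illustration of the bias:

```python
# Toy illustration of selection bias in publicised decision times.
# All parameters are invented; this is not PeerJ data.
import random

random.seed(1)
# Decision times drawn from a skewed distribution with a ~30-day median.
times = [random.lognormvariate(3.4, 0.4) for _ in range(10_000)]
# Suppose only authors with fast decisions (< 20 days) tweet about them.
tweeted = [t for t in times if t < 20]

print(f"mean of all decisions:     {sum(times) / len(times):.1f} days")
print(f"mean of tweeted decisions: {sum(tweeted) / len(tweeted):.1f} days")
```

The publicised mean comes out far below the true mean, purely because of who chose to tweet.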

PLOS ONE gives no current information on its website about decision times. However, in a comment on a PLOS ONE blog post in 2009, the then publisher Pete Binfield stated that “of the 1,520 papers which received a first decision in the second quarter of 2009 (April – June), the mean time from QC completion to first decision was 33.4 days, the median was 30 days and the SD was 18.” He didn’t say how long it took from submission to ‘QC completion’, which is presumably an initial check; I expect this would be only a few days.

Kent Anderson of the Scholarly Kitchen asked last year “Is PLOS ONE Slowing Down?”. That post only looked at the time between the submission and acceptance dates that are displayed on all published papers, and it included no data on decision dates, so it tells us nothing about decision times. In a series of comments below the post, David Solomon of Michigan State University gives more data, which show that the submission to acceptance time went up only slightly between early 2010 and September 2011.

The undoubted star in terms of decision time is Biology Open. It posts the previous month’s average decision time on its front page, and the figure currently given, for February 2013, is 8 days. They say they aim to give a first decision within 10 days, and their tweets seem to bear this out: in June 2012 they tweeted that the average decision time in May 2012 had been 6 days, and likewise that the time for April 2012 had been 9 days.

Other megajournals span a similar range to ordinary journals. Open Biology reports an average of 24 days, Cell Reports aims for 21 days, and G3 and Scientific Reports aim for 30 days. Springer Plus, the BMC series, the Frontiers journals, BMJ Open and FEBS Open Bio provided no information, though all boast of being fast.

What affects review speed?

If newer journals are faster, why might that be? One possible reason is that as the number of submitted papers goes up, the number of editors doesn’t always go up quickly enough, so the editors get overworked, whereas when a journal is new each editor may have fewer papers to handle.

It is important to remember that the speed of review is mainly down to the reviewers, as Andy Farke pointed out in a recent PLOS blog post. Editors can affect it by setting deadlines and chasing late reviewers, but they have only limited control over when reviewers send their reports.

But given this limitation, there could be reasons for variations in the average speed of review between journals. Reviewers might be excited by the prospect of reviewing for newer journals, and so be more likely to be fast. The same could equally be true of the highest-impact journals, of course, and of open access journals if the reviewer is an open access fan. Enthusiastic reviewers not only send their reports in more quickly once they have agreed; enthusiasm also makes it easier to get someone to agree to review in the first place. As Bob O’Hara pointed out in a comment on Andy Farke’s post, “If lots of people decline, you’re not going to have a short review time”.

A logical conclusion from this might be that the best way for a journal to speed up its time to first decision would be to cultivate enthusiasm for the journal among its pool of potential reviewers. Building a community around the journal, using social media, conferences, mascots or even free gifts, might help. PeerJ seem to be aiming to build such a community with their membership scheme, not to mention their active Twitter presence and their monkey mascot. Biology Open’s speed might be related to its sponsorship of meetings and its aim to “reduce reviewer fatigue in the community”.

Another less positive possible reason for shorter review times could be that reviewers are not being careful enough. This hypothesis was tested and refuted by the editors of Acta Neuropathologica in a 2008 editorial. (Incidentally, this journal had an average time from submission to first decision of around 17 days between 2005 and 2007, which is pretty fast.) The editorial says “Because in this journal all reviews are rated from 0 (worst) to 100 (best), we plotted speed versus quality. As reflected in Fig. 1, there is no indication that review time is related to the quality of a review.”
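That check is easy to run for any journal that rates its reviews. Here is a sketch on invented data (the ranges, sample size and resulting correlation here are all made up; the real analysis is the scatter plot in the editorial’s Fig. 1):

```python
# Sketch of the speed-versus-quality check on invented data.
# Each review gets a turnaround time in days and an editor rating 0-100.
# Requires Python 3.10+ for statistics.correlation.
import random
import statistics

random.seed(2)
reviews = [(random.uniform(3, 40), random.uniform(0, 100)) for _ in range(200)]

times = [t for t, _ in reviews]
ratings = [r for _, r in reviews]

r = statistics.correlation(times, ratings)  # Pearson correlation
print(f"correlation between review time and quality: {r:.2f}")
# A value near zero, as in the editorial, would mean no sign that
# faster reviews are worse reviews.
```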

Your experience

I would love to find (or even do) some research comparing actual submission to first decision times across journals. Unfortunately that would mean getting the data from each publisher, and it might be difficult to persuade them to release it. (And I don’t have time to do this, alas.) Does anyone know of any research on this?

And have you experienced particularly fast or slow peer review at a particular journal? Are you a journal editor who can tell us about the actual submission to first decision times in your journal? Or do you have other theories for why some journals are quicker than others in this respect?