Content of review 1, reviewed on July 24, 2017

Thanks to all the authors of this article for a very informative review of peer review and an exploration of directions it might take in the future.

There may be too much emphasis on reviewer identity as opposed to the openness of the reviews themselves; in section 2.1, where the three criteria for OPR are listed, consider reversing criteria 1 and 2. The availability of the reviews themselves seems more important than reviewer identity, both for verifying that peer review took place and for advancing knowledge. Peer review’s small to nonexistent role in promotion and tenure is only briefly mentioned (section 2.2.1); you could consider how or why this might change in section 4.3 (incentives) or section 4.4 (challenges).

There is one effort that came to mind that may merit a brief mention in your article: Peer Review Evaluation (PRE) (http://www.pre-val.org/), a service of the AAAS. About a year ago the journal Diabetes was using this service; see an example at https://doi.org/10.2337/db16-0236. If you go to that article and click on the green PRE seal, a pop-up appears that details the peer review method, the number of rounds of review, and the number of reviewers and editors involved. It’s not clear to me whether PRE is still an active service; current articles from this journal don’t seem to carry the seal. In any case, this provides some degree of transparency, though it falls short of open peer review.

Although I don’t know whether it would be considered an “annotation service”, in Table 2 (far bottom right) you might also add Publons as an example of decoupled post-publication review; I have used it several times for this purpose. As an added benefit, these reviews are picked up by Altmetric.com on a separate “Peer Reviews” tab of their detail display (they also track PubPeer; I’m not sure whether Plum Analytics does something similar). See https://www.altmetric.com/details/8959879 for an example. This might be a form of decoupled peer review aggregation worth mentioning in section 2.5.4 (at the end of the first paragraph).

In section 3.1.1 on Reddit, you might briefly describe what “flair” is.

In section 3.1.2 you suggest ORCID as an example of a social network for academia, but I think it is better described as an academic profile, as you call it in section 4.3 (although it wasn’t intended to be that either). I think connecting ORCID and peer review is a good idea, and as you mention, Publons is already doing this.

Also, I have a couple of suggestions that may be better directed to the journal than to the authors. First, this article and others like it use many non-DOI links to refer to web pages. To ensure the longevity of the evidence base in its articles, journals might begin using (or requiring authors to use) web archiving tools such as the Wayback Machine, WebCite, or Perma.cc to prevent link rot. For example, see: Jones SM, Van de Sompel H, Shankar H, Klein M, Tobin R, Grover C (2016) Scholarly Context Adrift: Three out of Four URI References Lead to Changed Content. PLoS ONE 11(12): e0167475. https://doi.org/10.1371/journal.pone.0167475. Second, the DOI format in the references should follow CrossRef recommendations: eliminate the “dx” and use HTTPS, so that http://dx.doi.org becomes https://doi.org.
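To make these two suggestions concrete, here is a minimal sketch, in Python, of how a production workflow might normalize legacy DOI links and request Wayback Machine snapshots for non-DOI URLs. The function names are my own illustration, and the redirect behaviour of the web.archive.org/save/ endpoint is assumed from its public documentation; a real workflow would need error handling and rate limiting.

    # A minimal sketch (my own illustration, not an official journal tool): it
    # normalizes legacy dx.doi.org links to the https://doi.org/ form and asks the
    # Wayback Machine's "Save Page Now" endpoint to capture a non-DOI URL.
    import re
    from typing import Optional

    import requests

    DX_DOI = re.compile(r"https?://dx\.doi\.org/")

    def normalize_doi_link(url: str) -> str:
        # Rewrite http(s)://dx.doi.org/... to https://doi.org/... per CrossRef display guidelines.
        return DX_DOI.sub("https://doi.org/", url)

    def archive_url(url: str) -> Optional[str]:
        # Request a snapshot; the endpoint redirects to the archived copy, so the
        # final response URL should point at the snapshot (assumed behaviour).
        resp = requests.get("https://web.archive.org/save/" + url, timeout=60)
        return resp.url if resp.ok else None

    if __name__ == "__main__":
        print(normalize_doi_link("http://dx.doi.org/10.1371/journal.pone.0167475"))
        # archive_url("http://www.pre-val.org/")  # uncomment to create a live snapshot

Run on the PLoS ONE reference above, normalize_doi_link returns https://doi.org/10.1371/journal.pone.0167475, which is the recommended form.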


    © 2017 the Reviewer (CC BY 4.0).

References

    Tennant J. P., Dugan J. M., Graziotin D., Jacques D. C., Waldner F., Mietchen D., Elkhatib Y., Collister L. B., Pikas C. K., Crick T., Masuzzo P., et al. 2017. A multi-disciplinary perspective on emergent and future innovations in peer review. F1000Research.