version of peer review – crowd review – that could well reduce reviewer fatigue. Put simply, a ‘crowd’ of between 30 and 100 researchers is recruited following recommendations from editors. The crowd receives an email invitation and a link to the manuscript, and reviews can be discussed among all researchers involved. The researchers typically have around 72 hours to respond, and the results of the discussion can either replace or complement traditional peer review. Importantly, the process can take as little as a week, while traditional peer review can take months.

List first trialled the system on Synlett manuscripts in early 2017, with the papers quickly receiving comments that editors considered informative. Fast forward to today and 147 Synlett papers have been crowd reviewed, authors are responding well and an enthusiastic crowd of reviewers has been maintained.


‘I am more interested in quality than transparency, [so] our intention has been to improve the quality of peer review and ultimately published papers, and indeed, the speed and accuracy of crowd review are better than in traditional peer review,’ says List. ‘The response to crowd review has been faster, some details get more attention and the comments have been more to the point,’ he adds. ‘Reviewers critically discuss details, provide substantive comments, and some give an overall recommendation, with the editorial office making the ultimate decision.’

Given the success of crowd review on Synlett, its publisher Thieme is extending the process to other chemistry journals, as well as medical research journals. What’s more, several other chemistry and interdisciplinary journals are also evaluating crowd review and related approaches, while a major German funding organisation is implementing crowd review to evaluate its research proposals.


But while crowd review has run smoothly on Synlett, with its strong research community, does the process actually have mass appeal, and can it be successfully extended to broader journals with a more varied research community? According to List, yes. ‘Probably it would be more appropriate to create sub-crowds here, to more accurately evaluate manuscripts from different disciplines,’ he says. ‘But I certainly think so.’


Publons’ Preston is eagerly watching the development of crowd review, and says: ‘If it works at scale, it will be awesome.’ But in the meantime, he highlights how new tools can also help editors search for and find peer reviewers beyond their typical reviewer networks more quickly, thereby reducing peer reviewer fatigue and increasing all-important diversity. One key example is Publons Reviewer Connect, designed to help scholarly journal editors find, screen and connect with the subject-matter experts needed to peer review manuscript submissions.

‘This was only really made possible by combining Web of Science data with Publons Reviewer data, and allows you to properly screen a reviewer and contact them directly,’ he says.




‘It is really exciting and with it, we have [access to] researchers from around the world... And very soon users will be able to select researchers from specific regions.’

Preston also reckons that Publons Academy, which provides online tools to train researchers in the core competencies of peer review, could tap into more researchers from emerging economies at earlier stages of their careers.


And, crucially, he also hopes that Publons Academy will, at some point, be integrated with Reviewer Connect, so emerging-economy researchers can more easily show their expertise to editors in established geographical regions.

‘[The Clarivate acquisition] has allowed us to expand our ambitions and we now have a reach that we just didn’t have before,’ says Preston. ‘For example, we can now go to China and get more researchers involved with us... so this and continued development really helps.’




Quality counts

But as organisations continue to push for more diversity and reduce reviewer fatigue, measuring the quality of peer review remains a priority. Right now, the length of a review is typically used to assess quality, but few would dispute that the scholarly community needs a more robust measure. Preston agrees. In recent years, Publons has focused on measuring the quality of peer review, with, for example, editors being asked to rate reviews on the basis of helpfulness and timeliness. ‘We continue to encourage editors and, potentially, authors to evaluate the quality of reviews, but this is easier said than done,’ he says. ‘It has been difficult to get engagement, but we will continue to experiment with this.’

Indeed, as Marchetti highlights, SAGE editors have the facility to rate reviews on ScholarOne. But while these ratings can then be used by other editors to source reviewers, she believes the subjectivity of this process has hindered take-up. ‘I do think that transparent review can help to improve review quality here, though,’ she points out. ‘And we also use plagiarism detection software, which helps with the quality of peer review.’


Preston also highlights how his organisation is now looking to augment the measurement of quality with tools such as sentiment analysis, which uses natural language processing to extract positive, neutral or negative comments from reviews.

‘We already have the partnerships in place and will be doing research on Publons’ reviews to see if we can pull out interesting results, in terms of sentiment analysis, and directly measure review quality,’ he says.
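To make the idea concrete, here is a minimal sketch of the kind of analysis Preston describes, using the open-source VADER model from the NLTK library to label review comments as positive, neutral or negative. The example comments and the ±0.05 cut-offs are illustrative assumptions only; Publons has not published the details of its own pipeline.

```python
# Illustrative sketch only: NLTK's off-the-shelf VADER sentiment model
# labelling hypothetical reviewer comments as positive, neutral or
# negative. Not Publons' actual tooling, which is not public.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
analyzer = SentimentIntensityAnalyzer()

def label_comment(comment: str) -> str:
    """Map VADER's compound score to a coarse sentiment label."""
    compound = analyzer.polarity_scores(comment)["compound"]
    if compound >= 0.05:        # conventional VADER thresholds
        return "positive"
    if compound <= -0.05:
        return "negative"
    return "neutral"

# Hypothetical reviewer comments, for illustration
for comment in [
    "The methodology is sound and the conclusions are well supported.",
    "Figure 3 is unreadable and the statistical analysis is flawed.",
    "Please state the sample size used in section 2.",
]:
    print(f"{label_comment(comment):8s} {comment}")
```

A real system would need to go further than this sketch, for instance handling hedged academic language, but it shows how free-text reviews can be reduced to a coarse, comparable quality signal.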


‘We’ve already done a ton of development on review transparency and in the next 12 months will be really focusing on making this all work at scale, in a way that benefits everyone.’





