Feature


Clearly, the latest move from China will have a profound effect on the nation's academics. The new requirements could well lead to a decrease in international publications, with some universities falling in global higher education rankings. But the latest move will also help the government to realise its desire to have the nation develop its own academic standards while stepping away from the over-use of single-point metrics. And importantly for many, it aligns well with global movements, such as the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto, that aim to move away from single-point metrics to broader measures of research performance. Indeed, for Martin Szomszor, director of the Institute for Scientific Information and head of research analytics at Clarivate, the latest move from China represents success. 'This is a clear move away from using single-point metrics to evaluate institutions and people,' he says. 'We've been engaged with various bodies in China over the last couple of years and have watched them evolve their thinking very rapidly towards something



“Clearly, the latest move from China will have a profound effect on the nation’s academics”


that is more in line with European and North American research evaluation.' Early last year, Szomszor and colleagues from ISI released the report 'Profiles, not metrics', which highlighted the critical detail that is lost when data on researchers and institutions are distilled into a simplified single-point metric or league table. The report set out alternatives to academia's well-used Journal Impact Factor, h-index and average citation. For example, it illustrated how an impact profile, which shows the real spread of citations, could be used to demonstrate an institution's performance instead of an isolated Average Citation Impact. '[The report] has become a really useful tool, particularly around the customer-


facing part of the business,' says Szomszor. 'In the last few years, the search for other types of metrics and indicators has been growing steadily... and what is happening in China now is very positive.' Daniel Hook, chief executive of Digital Science, has also been eyeing China's move away from a single-point-metrics-focused evaluation system with interest. 'We are seeing unsettled times for metrics in China,' he says. 'The government has effectively [asked] each institution to locally define the metrics that are important to it, and that it would like to work on, and so create a new norm for China from the ground up.' Like many, Hook is not a fan of single-point metrics and rankings. His company invested in the non-traditional bibliometric company Altmetric as early as 2012, and introduced its Dimensions database in 2017. The research database links many types of data, including Altmetric data, awarded grants, patents and, more recently, datasets, with a view to moving research evaluation practices beyond basic indicators. Digital Science also joined DORA in 2018. 'I have given public talks where I've said


June/July 2020 Research Information
