ANALYSIS AND NEWS

WHAT DOES IMPACT REALLY MEAN?


Digital Science recently hosted a ‘Research Impact Spotlight’ event, the second in a series following on from its Open Data Spotlight in March, writes Jonathan Adams


The aim of the Digital Science Spotlight series is to ‘shine a light’ on key topics in scholarly communication, exploring a variety of perspectives on the issues and providing an opportunity for discussion among the research community. June’s Spotlight highlighted a seemingly simple question about an increasingly debated topic: what is ‘impact’? Greater expectations of research, increased competition for funding, the UK’s 2014 Research Excellence Framework, more non-traditional outputs, new data tools and social media metrics all put ‘impact’ at the heart of scholarly communication and require rigorous examination.


Daniel Hook, our managing director-designate, welcomed the audience before handing over to me to introduce the panel and ‘set the scene’ regarding the current impact landscape, including the 2014 REF impact case studies database and website developed by Digital Science.


Our first speaker, Ben Goldacre – doctor, academic, campaigner, writer – spoke (at great speed!) about his work to ensure policy decisions are informed by research findings. Too often this still doesn’t happen. Ben used the AllTrials campaign as an example of stakeholders, including funders and publishers, coming together to create real policy impact where decades of academic papers on issues around clinical trials had failed to drive improvement or even convert established sceptics. Impact assessment is still skewed too heavily towards conventional research outputs and not enough towards other routes by which people deliver real benefit. How can impact be optimised? Ben drew on his own experiences to provide a hit-list of ways that funders, employers and individuals could incentivise, facilitate and deliver research engagement and outreach. Employers can better recognise the efforts of researchers, allocate funding for them, provide relevant training, and offer supportive and flexible employment structures. And, to maximise their own impact, researchers need to be practical, thinking in terms of sustainability, effectiveness and a portfolio-style career.


Liz Allen, head of evaluation at the Wellcome Trust, gave a funder’s perspective. Funders need to evaluate research impact to make more informed funding decisions. She raised the concern that the discourse around impact had focussed unduly on new tools and technologies. To get the most out of these tools, we need a better understanding of whether they measure what counts or only count what can be measured.


There are opportunities to be smarter at maximising the impact of funding, around openness, discoverability, interconnectivity and interoperability. Liz echoed Ben’s suggestion that we look beyond academic journals for a true picture. For example, a major flaw in article-based analysis is uninformative weighting by author order. The Wellcome Trust, along with Digital Science and others, is developing a new taxonomy of contributor roles – Project CRediT – to create a trackable system that describes the real contributions to a research report.

The final speaker was Euan Adie, founder of Altmetric. For Euan, impact means research that makes a difference: a social, economic or cultural benefit, an influence on practice, or a change in the way people think about a problem. He noted the interplay between quality, attention and impact. Researchers should care about demonstrating impact: when research is funded by taxpayers, not caring could be seen as selfish. Citations are only part of the picture and relate only to scholarly attention. Citation analysis cannot bridge the ‘Evaluation Gap’ between methods and objectives, because outputs such as policy documents can be just as impactful yet lack citation data. Altmetrics can help because they capture broader, non-academic attention and can track policy documents and activity on social media. However, ‘altmetrics’ is a misleading name: they are not alternative but complementary, and not measurements of impact but indicators of attention.


The evening offered a range of perspectives and insights, and contrasted nicely with recent ‘establishment’ discussion of formal impact methods. We welcome feedback from those who attended, and we look forward to the next event in our Spotlight series. Follow us on Twitter, Facebook, LinkedIn or Google+ to get updates on all our events.


Jonathan Adams joined Digital Science as chief scientist in October 2013


The panel espoused a range of perspectives



