Research Information: FOCUS ON NORTH AMERICA




and mandate researcher behaviour. In this way, it has introduced new and different inhomogeneities in the global marketplace. Increasingly, funders and governments are asking researchers to make their work more transparent and are demanding greater accountability. The Research Excellence Framework in the UK is perhaps the best-known example, but it's not the only one. For example, in the Netherlands there is the Standard Evaluation Protocol (SEP), and Australia has the Excellence in Research for Australia (ERA) framework. These frameworks affect the incentive structures for academics and, by extension, their needs. Meanwhile, research funded through the European Framework Programme known as Horizon 2020 is subject to the Responsible Research and Innovation (RRI) criteria, which take account of the societal and environmental effects of research, challenging researchers to rethink how they work to maximise its positive effects on society. In some respects, US funders are lagging behind their European and Australian counterparts on this issue, although it's important to recognise that US funding agencies are undergoing their own change.

In a 2013 article in Science and Engineering Ethics, Michael Davis and Kelly Laas discuss the differences between the European concept of RRI and the broader impacts (BI) criteria applied by the National Institutes of Health (NIH) and the National Science Foundation (NSF) in the US. BI represents a much less fundamental shift in evaluation criteria and focuses on more peripheral aspects of research, such as the inclusion of disadvantaged groups, and public education and outreach. The difference between the two approaches may seem esoteric, but it may eventually have implications for the kinds of tools and forms of communication that researchers will need. So far, the shifts in evaluation criteria are at a relatively early stage. While we're seeing distinct requirements between regions for institution-level tools like Current Research Information Systems (CRISs), newer researcher-level technologies like altmetrics and data repositories remain unaffected, with essentially the same requirements globally: greater transparency and a need to measure the broader impact of work. My personal hope is that we begin to see convergence






between US, European, Australian and other governmental funding policies towards agreed best practices for evaluation and assessment.

Data sharing is another field where US policy makers might appear to be slightly behind European, and particularly British, funders, but again the truth is slightly more nuanced. In January 2015, the Figshare team, here at Digital Science, conducted a survey of the open data mandates listed in the Jisc and RLUK-funded Sherpa Juliet site. They found that, at the time, there were 34 funders that required data archiving, with 16 encouraging it. The numbers will undoubtedly be higher now, because there has been a steady stream of new mandates. The UK was leading the way, with 44 per cent of the data archiving mandates listed in Sherpa Juliet. Notably, EPSRC's long-awaited open research data policy promised to '...investigate non-compliance' and '...impose appropriate sanctions'. That is to say, funders are beginning to give the mandates teeth.


Having said that, the US is catching up fast.


Partly, this is due to the White House's 2013 OSTP memo, which called for expanding public access to research and research data, but it is also due to the long-standing tradition of scholarly communication innovation in American research institutions. Going back as far as the 1960s, librarians at the Stanford Linear Accelerator Center (SLAC) produced and distributed lists of preprints in high-energy physics. This paper equivalent of what we now call green open access was in many respects the forerunner of arXiv, and it shaped the way that physicists communicate their research today. Library innovation in the US doesn't stop there. Digital Commons is a product of Bepress, from Berkeley; DSpace was the result of a collaboration between MIT and HP Labs; and Hydra is a multi-centre project involving many institutions, almost all of which are in the US. Meanwhile, the Library Publishing Coalition, founded in 2014, aims to help libraries conduct publishing operations in a self-sustaining and community-oriented way.


While European institutions and academic libraries are as active as any in driving forward the conversation around modernising scholarly communication and supporting open science, US institutions are more heavily involved in spinning out companies and foundations that compete in the publishing space.


The scholarly communication landscape is changing rapidly. It's probably fair to say that the US and European markets have more similarities than differences, particularly if you compare them with emerging markets like China and India. There are some differences, however, in the way that researchers, our ultimate customers, are incentivised. These differences currently affect institutional offerings, but may in future affect how we develop products for researchers. There are also differences in the way that the market responds when it perceives itself to be under pressure. It seems like a terrible cliché to refer to the entrepreneurial spirit of American culture, but to me the track record of US institutions in this area is striking. In other words, in the American market, a university is more likely to be both a customer and a competitor, particularly if it feels that it can effect change by doing so.


Phill Jones is head of publisher outreach at Digital Science. A referenced version of this article will appear at www.researchinformation.info


JUNE/JULY 2016 Research Information 11

