David Stuart.

Why is it important to consider web metrics in terms of impact?

DS – Impact is important, whether we are talking about the return on investment for a librarian making use of a new web service, or about helping to show the impact of the wider organisation and its outputs. There is always a new website or service for the librarian to explore and make use of, but unless we consider how we are going to measure its impact, we can waste an awful lot of time. For example, a lot of people in the library community were hunting around for an X/Twitter alternative a couple of years ago, but without measurement it is impossible to know whether you’ve made the right move, how much time and effort to invest in the new service, or when to give it up.

If you are using web metrics to demonstrate the impact of an organisation’s outputs, it can have important consequences for understanding whether the company is heading in the right direction or whether an individual should get a promotion. Is that white paper really driving interest in the company’s products, or is the sales increase mere coincidence? In the UK, web metrics can have an even more direct financial impact if, for example, they are used to help demonstrate the impact of research in the REF and that research achieves a 3* or 4* rating.

What should we be thinking about when we start to consider web metrics for library services? How do you decide what’s important and where to start, to ensure we are capturing the right metrics for each service?

DS – It is important to identify metrics that align with the purpose of the service, and also to identify meaningful comparators. If you have produced a video to help users with a new piece of software, then it would be foolish to compare the number of views it receives with the number of views received by the latest Taylor Swift video. The number of views may not even be the most important metric; after all, an ill-thought-out video might require watching multiple times. A more meaningful metric may be the falling number of email enquiries about how to use a service.

There’s lots that can be counted, but it’s important you don’t try to count everything at once. The key is not just to reach for the nearest or most obvious metric, but to reflect on what the service is trying to achieve, and to find metrics that align with that. Useful metrics need to be built up gradually.

One of the reasons that discernment may be lacking is that the tools have made it so easy to count and measure things without always requiring the user to give much thought to what is being counted, or whether something else should be counted instead. This is not only because data aggregators make so much data available that can be spliced in multiple different ways, but also because programming libraries and packages now make it simple to engage with a host of content in just a few lines of code. Finally, one of the biggest changes over the last 12 years has been the growth of artificial intelligence. As with many areas of life, the full implications of AI are yet to be felt. What is clear, however, is that it will have a significant impact on what we count and how we count it. How meaningful is it to count mentions on social media sites when an increasing proportion of that content is generated by AI? To what extent can we trust the classifications of AI when we can’t see inside the black box? To answer these questions we need to return to the fundamentals of what we are counting and why, rather than just grabbing the numbers that are nearest to hand.

Once we have decided on the metrics that will be useful, how do we go about measuring them – what does Web Metrics tell us about the tools we should/could be using?

DS – Web Metrics for Library and Information Professionals introduces a wide range of tools and methodologies that could be useful to librarians in their work, but it is rarely the case that they should be using one metric or method rather than another. Every library is different, and librarians will need to decide how web metrics can contribute to their work in their own situation. It is more a combined guidebook and toolbox than a manual.

The aim is to measure impact, so what can be done once we have that information – e.g. how do we strengthen services, how do we advocate for services, and should we be changing or cutting things that are not delivering impact?

DS – Metrics have an important role in both improving services and advocating for them. With web metrics you gain a more objective sense of how well something is doing and whether changes are necessary. Managers are also more likely to listen to arguments built on numbers than to vague impressions about how well a service is doing. With limited budgets, increasingly robust cases need to be made for extra resources. It’s essential that any decisions and arguments are built on the right web metrics, however, and that these metrics haven’t been pursued at the expense of the service itself. Reaching for simple met-

28 INFORMATION PROFESSIONAL
April-May 2024