us whether we like it or not. Shouldn’t we have the right to decide whether we want to revert to “ordinary, vanilla”? Shouldn’t we be able to decide whether the thing on the other end of a help line is human or artificial?
“As an information scientist, I was taught to look at multiple sources and to verify those sources. It seems to me that while AI search engines and chatbots may have at their robotic fingertips vastly more data – far wider, and retrieved far faster, than I could ever hope to manage – there is (in my limited experience) little or no information provided about sources. I know that there is weighting built into their algorithms – a sort of sequential word probability at the lowest level – but I do not know whether that weighting extends to analysing sources. Nor do I know (if it is in fact built in) what that weighting is based on. LLMs are not search engines, nor do they present facts. They present sentences built on the probability that the words work together, and the public think they are getting information!” By using an AI chatbot or LLM to do your research, you are crossing your fingers and ceding responsibility for information and data integrity to an algorithmic stranger. What will become of facts?
“For now, while I am still able to choose, I shall use search engines (and I know these all have some ‘intelligence’ built in) that allow me to assess the degree to which I can trust the answer.” Within this context, UKeiG is delighted to launch a new half-day online CPD AI course, Generative AI and Retrieval Augmented Generation for
librarians, information and knowledge professionals.
Generative AI chatbots are changing search expectations. Since the launch in 2022 of OpenAI’s Large Language Model (LLM) chatbot ChatGPT, user information behaviour has begun to change, with significant implications for the library, information and knowledge profession. Generative AI has made finding and using information easier, but its reliability is questionable, and there are many ethical issues around its training and its wider social and environmental impact. Retrieval Augmented Generation (RAG) offers to partly address these issues by integrating AI technology with a responsibly curated collection of relevant content, increasing the reliability of its outputs.

Our course will introduce you to the potential of RAG, as well as to definitions of AI literacy that are central to promoting responsible use. It is ideal for any library, information and knowledge professional interested in developing an understanding of Generative AI as it applies to their work, and it complements UKeiG’s course Artificial intelligence for librarians, information and knowledge professionals. You’ll gain an understanding of the nature and impact of Generative AI, AI literacy and Large Language Models (LLMs), and learn how to enhance and augment an LLM by creating a customised knowledge base and training it on a scholarly dataset.
The course covers a brief history of Generative AI, how LLMs work, generative AI and information literacy, and building your own RAG system.
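For the more technically inclined reader, the retrieve-then-generate idea behind RAG can be illustrated in a few lines. The sketch below is purely illustrative and is not drawn from the course materials: it uses a toy bag-of-words similarity in place of the neural embeddings a real system would use, and it stops at assembling a grounded prompt rather than calling an actual LLM. All function names and the sample corpus are invented for this example.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; real RAG systems use neural embeddings."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Rank the curated collection by similarity to the query, keep top k."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    """The 'augmentation' step: ground the LLM prompt in retrieved sources."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using only the sources below, and cite them.\n"
        f"Sources:\n{context}\n"
        f"Question: {query}"
    )

# A hypothetical curated collection.
corpus = [
    "UKeiG runs CPD courses for information professionals.",
    "Retrieval Augmented Generation grounds LLM answers in curated content.",
    "Cataloguing standards include MARC and RDA.",
]
prompt = build_prompt("How does Retrieval Augmented Generation help LLMs?", corpus)
# In a real pipeline, `prompt` would now be sent to an LLM for generation.
```

The key point for information professionals is the first step: the quality of the answer is bounded by the quality of the curated collection the retriever searches, which is exactly where collection-management skills come in.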
INFORMATION PROFESSIONAL DIGITAL 21
It will offer a lively and interactive environment in which participants can share their views and experiences and shape the direction of the discussion. It will also include a deep dive into the technology that is understandable to the non-coding professional and actionable for those with more technical skills. Looking forward to seeing you there. Dates and times will follow shortly. If you’re a CILIP member but haven’t yet joined UKeiG, you can update your preferences on the CILIP website. IP
Links
www.cilip.org.uk/ukeig
https://curatedlines.online/
Email Chris at: lisqual@cix.co.uk