Gordon Johnstone.
decision making of Microsoft or OpenAI? We’ve worked with both companies. All lovely people. But are they then going back to Sam Altman or Bill Gates and saying, “Hey guys, we should really be thinking about the ethics behind what we’re doing?” As much as I would love for that to be the case, that doesn’t seem likely. This is obviously getting very speculative, but we could be entering a point where the nation state will be eroded. The biggest decisions will be reserved for the corporations that control access to data, or access to resources.”

Currently the Scottish AI strategy advocates for any conversations around AI to be viewed through a lens of ethics and human rights. “There is a strong push in Scotland to make AI used in the public sector more transparent. We have the AI register, which is another government programme that logs all the AI tools being used in the Scottish public sector, as well as the ones that are in development. It details what they are and what they’re used for. A good evolution of that would be a rating system, a red/amber/green approach to how transparent and how safe these tools are to use.

“The public sector is getting particular attention from the Scottish Government in terms of AI use. And Scotland is taking a kind of human-rights-first approach to AI through the ethical, trustworthy and inclusive strategy. England is a bit more innovation focussed, regulation lite, to promote growth. It’s as if it wanted to go very, very fast, very quickly, to keep up with the likes of the US and China. And the Scottish Government was taking a slightly slower approach, trying to put in guardrails and frameworks that people could use.”

But he says comments from the Scottish Government suggest that this competitive approach is winning over the Scottish decision-makers.
“It could be argued the Scottish Government is trying to align itself a little bit more with the UK Government. Richard Lochhead, the minister responsible for AI, recently said at one of our events that we should stop worrying so much about AI and start to embrace its possibilities and its potential. That’s maybe an indication of what’s going to come post March 2026.”
Anthropomorphisation
Alongside these macro political gains made by AI, it also has traction at a person-to-person level. “The anthropomorphising of AI has been fascinating to watch – fuelled entirely by the people selling AI tools. If you ask ChatGPT to write you an academic essay it might include references to books and authors that don’t exist. They call that hallucination – a deeply human experience. In effect the marketing has turned AI into a seemingly intelligent being that has made an understandable mistake. But it’s not that at all. It’s simply a flawed machine.

“I think that we’re going to see a lot more of this anthropomorphisation of language around AI as well-known tools become more integrated.”
“Training is also an inadequate word,” Gordon says. “There will be a constant thirst for information and data to make sure that AI models are as up-to-date, cutting edge and knowledgeable as possible. You can’t do that just by training your data once and letting it go, especially in a world where we’re creating such an unfathomable amount of data every single day. It’s a problem that will intensify rather than dissipate. That’s how capitalism works. People don’t stop earning money when they have enough. It’s the same