Copyright
Pay from AI is only fair
If your work is used in big tech, you should be paid. Katharine Quarmby on a scheme to ensure this
Andrew Wiard's article on taking an ethical approach to generative artificial intelligence (AI) in journalism (August/September) raises important issues. As he says, we journalists need to question our part in any uses of our work in AI. In particular, he points to 'inauthentic photorealistic pictures' and urges photographers to avoid licensing their images for generative AI uses. He also discusses SCOOP, a venture between our union and three collecting societies – the Authors' Licensing and Collecting Society (ALCS), DACS and PICSEL – and raises concerns about secondary uses through it that may include AI.

A lot has happened with AI and journalism since I first became aware of its uses, when the London School of Economics launched JournalismAI in 2019. That project aimed to inform news organisations about the opportunities offered by AI and to foster debate about the editorial, ethical and financial implications of using it in journalism. That and other initiatives, such as the work done by the Reuters Institute at the University of Oxford, have helped me and other journalists pick our way through a rapidly changing and bewildering landscape. As Wiard says, AI's use is an ethical issue that we cannot duck – we need to examine every new initiative closely to see if we feel it is ethically acceptable.

Yet, as Press Gazette reported in May, while AI is being used in news outlets, its impact is not clear. Two recent studies painted different pictures. One, 'Large language models, small labor market effects' by researchers from the University of Copenhagen and the University of Chicago, found that, while reporters are using AI tools, this has not driven down earnings or taken jobs. However, a survey by Pressat of 2,000 journalists worldwide found that well over half (57 per cent) of respondents believed that AI could mean job losses.

I and other journalists raised the ethical issues around AI use by news outlets at a recent roundtable discussion organised by The New Statesman with ALCS. Entitled 'How do we create a sustainable future for freelance journalism?', the discussion addressed how AI is affecting creatives, particularly freelancers. We highlighted how freelance incomes could be driven down further as news organisations use our content for free to train AI models. So how do we make this fairer?
As well as having been a freelance journalist for decades, I became a non-executive director of ALCS last year. Simply put, ALCS collects and then distributes money to creatives for secondary uses of their copyrighted work. As secondary uses evolve, it is necessary to respond: working with the Copyright Licensing Agency and other collective management organisations, ALCS is developing various licensing models to ensure creators' AI rights are recognised, protected and remunerated. ALCS has also been working closely with the NUJ.
SCOOP was launched in December 2024 alongside a survey of over 13,500 ALCS members, which showed that writers overwhelmingly support some form of remuneration, along with choice and transparency, when their works are used to train AI systems. This was followed by a survey of NUJ members on various AI-related issues, such as what we want SCOOP to do and whether that should include licensing permitted uses of our work for AI.

I think we are more powerful if we act collectively – and that could include licensing permitted uses that we deem ethical as journalists. A voluntary approach that gives us the option of being paid for secondary uses – including AI if we want to – is most likely to succeed with the power of the collective. It means we would be paid for uses that are not currently covered. Richard Combes, deputy CEO of ALCS, explains: "The focus for SCOOP will be ensuring that freelancers collectively have the option to secure payment for works that have already been used by AI companies, either through unlawful online scraping or under licences entered into by publishers."

We are still in the foothills of AI in terms of ethics, and Wiard's caution is important because there are so many things to consider. But AI is another tool for reproducing our work, so it makes sense to use established ways to license secondary uses so that all of us can be paid fairly for 'copying', wherever it happens.
Katharine Quarmby is a non-executive director of ALCS and specialises in environmental investigations.