Copyright
A battle is won, but not the war
Plans to give AI firms free rein on creative works have been halted, but we still need safeguards, says Andrew Wiard
Journalists and other creators have scored a major victory over Big Tech.
In its Report on Copyright and Artificial Intelligence, issued in March, the government announced that it would not be pursuing its text and data mining exception to copyright law. This would have declared open season on using our work for artificial intelligence (AI) training unless we somehow ‘opted out’.
The proposal was an attack on our rights as copyright creators to refuse permission for, or demand payment for, the use of our work. Not only was this wrong in principle but also, for most of us, opting out would have been impossible and unworkable. Faced with spirited defiance from the creators’ champion, film-maker Baroness Kidron, in the Lords, together with the Creators’ Rights Alliance (which includes the NUJ), Paul McCartney, Elton John and the world of arts and entertainment in general, the government caved in. Just to illustrate the danger of what
might happen without organised resistance: the very next day, the European Parliament proposed compulsory licensing of copyright material for AI use – not even an option to opt out. That is press-ganging creators into supplying Big Tech and Silicon Valley whether we like it or not. But that won’t happen over here as long as Baroness Kidron is still leading the charge. So, the battle is won for now – but not the war. They’ve told us what they won’t do, but not what they will. They have only said they will not ‘take forward’ their preferred text and data
mining exception, not that they won’t consider alternatives – or even return to it.

Crucially, we cannot possibly defend our rights without transparency. That’s the word used to describe whatever procedure can be agreed for making AI companies declare what they have scraped off the internet or otherwise taken for AI training. We want transparency; they don’t. Achieving it requires judgment, decision and legislation. Instead, nothing – the government has just kicked that can down the road.

Without transparency, licensing will be extremely difficult, if not impossible, whether directly or through collective licensing. Collecting schemes run by ALCS and DACS would be the simplest solution for NUJ members who choose to opt in. I myself will not be licensing my pictures for generative AI to create fake news.
Speaking of which, the government is taking this seriously, through the use of ‘labelling’ for ‘input transparency’ – identifying work generated by AI – along the lines of the EU’s Artificial Intelligence Act, which should come into force later this year. Here again, though, it does not propose any specific action.

The alternative (or rather complementary) approach is to label authenticity, guaranteeing provenance. Photographers have already been planning this through the Coalition for Content Provenance and Authenticity (C2PA), with a clickable CR (Content Credentials) icon/pin in photographs revealing their origins. In January, the Society of Authors launched the latest authentication scheme, using a Human Authored logo ‘to help identify works written by humans in a market increasingly flooded by AI-generated books’. We don’t need to wait for legislation to deal with all this (although it could help). We can simply do it ourselves.

Legislation will be essential, however, for the most radical idea to come out of this report – controlling ‘digital replicas’ of someone’s voice or face. Problems range from all too credible impersonations of public figures making misleading, outrageous or downright dangerous statements on social media to the exploitation of actors’ likenesses for commercial gain. Last December, actors’ union Equity held an indicative strike ballot, producing an overwhelming majority for refusing digital scanning on set. Members’ work, voices and likenesses are being used without their explicit consent. Quite simply, that should be outlawed. But there’s nothing simple about that.

Time is running out to safeguard
the intellectual property rights of creators, and to hold Big Tech to account. We are facing theft for AI on an industrial scale. The government should not, in the name of innovation, be attempting to ‘balance’ the rights of AI thieves and creators. It’s time for the government to get off the fence. It’s time for it to act – now.