Artificial intelligence (AI) programmes, like OpenAI’s now-ubiquitous ChatGPT, appear to be one of 2023’s hottest chat topics for businesses and individuals alike. Alongside whether AI will edge us all out of the labour market, one of the biggest questions is whether the UK’s current legal and regulatory environment is equipped to handle AI-related issues. In this blog we’ll consider how AI challenges the adequacy of UK copyright law.
What is AI?
‘Artificial intelligence’ is a broad term that covers various types of computer programmes designed to perform in ways that mimic human cognition and intelligence. These range vastly in complexity. The AIs currently in vogue are generally ‘machine learning’ models, which use algorithms to learn from many examples of a specific type of content (eg written information or images). This learning process is referred to as ‘training’ an AI. These trained AIs can then make predictions and extrapolations based on their learning to, for example, generate new content in response to a user’s prompt.
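To make the idea of ‘training’ concrete, here is a deliberately toy sketch in Python. It is not how the complex machine-learning models discussed in this blog actually work; it is a minimal stand-in (a simple word-chain model, entirely our own illustration) showing the same two-step pattern: learn patterns from example content, then generate new content from a prompt.

```python
import random
from collections import defaultdict

# Toy illustration only: a tiny word-chain model standing in for the far
# more complex machine-learning models discussed above.

def train(examples):
    """'Training': learn which word tends to follow which in the examples."""
    model = defaultdict(list)
    for text in examples:
        words = text.split()
        for current, following in zip(words, words[1:]):
            model[current].append(following)
    return model

def generate(model, prompt, length=5):
    """Generate new content by extrapolating from the learned patterns."""
    word, output = prompt, [prompt]
    for _ in range(length):
        if word not in model:
            break  # nothing learned about this word, so stop
        word = random.choice(model[word])
        output.append(word)
    return " ".join(output)

# The 'training data' here is our own made-up text, but for a real model it
# might be millions of works, which is where the copyright questions begin.
examples = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]
model = train(examples)
print(generate(model, "the"))
```

Even this toy shows why training data matters legally: everything the model can produce is derived from the examples it was shown.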
What is copyright law?
A copyright is a type of intellectual property right (IPR), ie an intangible asset, that arises automatically when an original work is created (eg a work of visual art, literature, or programming). If somebody owns a copyright in something they have created, they can enforce this right via the courts if somebody else infringes it (eg by copying the artwork subject to the copyright). Copyrights are largely governed by the Copyright, Designs and Patents Act 1988 (CDPA).
Certain criteria must be met for somebody’s use of a work to constitute infringement of another person’s copyright in that work. Copyright infringement generally occurs when:
- the work in question is capable of having copyrights exist in it (eg if it is an original intellectual creation that is expressed in an identifiable manner)
- the relevant copyright in the work has not expired (copyrights last for certain, defined time periods)
- the work has been copied (ie the new work was not created independently and there is a causal link between the original copyrighted work and the new work), and
- the new work is objectively similar to the copyrighted work, whether this means it is exactly or substantially the same
Note that there are certain defences and exceptions to copyright infringement (eg when a work is copied for the purposes of criticism or non-commercial research).
Copyright infringement can also occur when someone deals with an infringing work (eg by selling it). This is referred to as secondary infringement.
Why is AI problematic for copyright law?
AI is by nature innovative and, in its current forms, it challenges the established concepts of copyright law.
Using copyrighted works for training vs in generated content
A recent legal dispute that illustrates copyright law’s AI-related challenges is the allegation by Getty Images, a stock image sales platform, that an AI company, by using Getty Images’ images to train its image-generating AI, has infringed copyrights existing in the images that Getty Images owns or represents on its platform.
At first glance, this situation highlights the importance of businesses ensuring they are legally allowed to use the data on which they train AIs. For example, a business may obtain licences to use works, such as those held in databases like Getty Images’, for the purposes of training an AI. Issues may then arise if the AI generates new works, informed by the training data, that are similar enough to potentially infringe copyrights in that data: for example, if the licence agreement permits the use of copyrighted works to train the AI but not the creation of new works by the AI based on the training data.
Does an AI create truly original works?
Some of the questions inherent in AI-related copyright disputes will be similar to those posed by traditional copyright disputes. For example, at what point does a work cross the line from being inspired by or in the style of another creator to being a copy of that first creator’s work? As AI is literally trained to create works based on others’ works, you’d think that this question could be particularly relevant to alleged AI copyright infringements. However, it could be argued that human creators are essentially trained on others’ works in the same way as AI, but on a smaller scale, via education.
There may also be arguments that, as an AI is (depending on who you ask) probably not yet actually sentient, it is not capable of true originality and, therefore, its output is extremely vulnerable to copyright claims.
Can an AI be held responsible for infringing a copyright?
AIs are not currently considered legal persons and so can’t be litigated against. It’s likely that those responsible for an AI’s output will be liable for any copyright infringements the AI commits, but it’s been recognised that identifying who is liable in such situations is not always straightforward. For example, responses to a UK Intellectual Property Office (IPO) consultation considered that clarity would be beneficial and that liability should perhaps be assigned to whoever derives benefit from an infringement.
Users of AI should also be aware of the risk of secondary copyright infringement. For example, if an AI user commercialises the output they’ve obtained from an AI, knowing that that output may infringe another person’s copyright, they may be liable for copyright infringement. AI companies generally do not offer their customers protection if they end up in such situations.
Can an AI own a copyright?
Another issue that arises is who exactly is capable of generating and owning a copyrighted work. Current UK law does offer some guidance. For example, section 9(3) of the CDPA provides that the first owner of a copyright in a computer-generated work is the person who made the arrangements for the work’s creation. This may be the person or company who created and trained an AI that made a work, but identifying distinct parties responsible for a creation made by a complex model may prove difficult in practice. The degree of input of an AI user who gives an AI a specific instruction should also be considered.
It’s also unclear whether an AI is capable of creations that are original enough to be protected by copyright. Arguments could perhaps be made that an AI user has exercised creativity in their instruction of an AI and that the AI is simply a tool that the human user has used to create an original work, in which the human user owns a copyright.
Is copyright law adapting to deal with AI-related disputes?
The UK Government has been planning and consulting on the future of AI regulation in recent years. For example, there have been proposals to expand the text and data mining exception to copyright so that it also applies to uses of works for commercial purposes. This could allow businesses to use copyrighted works available on the internet to train AIs without needing licences to the works. This suggestion has understandably received criticism, in particular from stakeholders within the creative industries who would usually receive compensation for the use of their creative works. Consider also that the AIs that would be trained on these creatives’ works could then become the creatives’ competitors, offering customers a more efficient method of producing creative works and edging human creators out of the market.
Other grey areas in copyright law’s treatment of AI also remain unresolved. It’s likely that future legal cases, including the Getty Images case mentioned above, will give the courts a chance to clarify them. For example, cases may reconsider what exactly constitutes originality and, if AI continues to be considered a tool, how it’s determined who is responsible for arranging the creation of a new work.
AI’s interaction with other types of IP is also under the spotlight. For example, a recent UK Supreme Court case (awaiting judgment) considered whether an AI can be named as the inventor on a patent application.
With AI capabilities and commerciality developing at pace, it will be interesting to see how these questions resolve and how the law keeps up (or fails to do so).
Do I need to protect my IP against AI?
If you’re a business working with AI, either training AIs or using their output, it’s important to be aware of the IP implications of your activities. There are steps you can take to help manage these uncertain situations.
For example, if you’re on either side of an IP licence agreement for AI training-related purposes, you can use your agreement to explicitly deal with issues like whether IP is being licensed for use in training the AI only or whether the IP can also be used as recognisable components of the AI’s output (eg in a way that may otherwise infringe copyrights).
If you have questions about IP and AI, or you would like help protecting your creations or legally using someone else’s IP, you can Ask a lawyer for assistance.