UK (University of Reading) Expert comment: ‘Performers not protected from AI imitation’

‘We absolutely need better rights as to what happens to your voice or your face or your body when it’s recorded in digital media.’

These are the thoughts of Dr Mathilde Pavis, of the University of Reading’s School of Law, in response to a legal battle between performer Greg Marston and IBM, after Marston discovered that an AI-manipulated version of his voice was being used online without his knowledge.

Dr Pavis has teamed up with the union Equity to create a guide that will help screen performers retain rights to their likeness. Commenting on the emerging controversy, Dr Pavis said: “The UK legal framework is not well designed to protect performers from unauthorised imitations of their work using AI technology. The best thing performers, and the content creators working with them, can do is to educate themselves on their rights and use contracts smartly.

“Today, the UK legal framework is not well designed to protect performers because it is made up of a complex patchwork of different laws. This impacts performers who want to work with AI technologies and those who want to stay away from them altogether. Current laws are difficult to navigate for performers, technology developers and media production companies. As it stands, it is a lose-lose situation.

“Performers are amongst the creators most vulnerable to unauthorised digital imitations because their work not only contains their intellectual property (their performances) but also captures their personal data, such as their face, their voice and their body.

“In this regard, performers’ interests should be the government’s top priority when it comes to reforming the law in light of new AI technologies. All citizens stand to gain from the protection the law extends to performers because it will create important industry standards on what technology developers can and cannot do with people’s voices and likenesses.

“Protecting your legal and economic interests by using a contract adapted to AI work is not about being an ‘innovation sceptic’ or ‘anti-AI’ – quite the contrary. Performers will need to put in place the same type of measures (such as contractual clauses) to protect their interests whether they want to collaborate with AI companies to develop new products or avoid being involved in projects that feature AI technology.

“The Equity toolkit will work well to set new industry standards with AI companies keen to ‘do the right thing’ and self-govern in this way. Law reforms and new regulations will still be needed to make sure performers, and other content creators, have the means to protect themselves in cases where their work is misused by technology users or market players who do not share Equity’s vision for AI.”