
AI is Hollywood’s ‘Napster 2001 moment,’ entertainment lawyers warn

Annie Murphy stars in an AI-inspired Season 6 episode of “Black Mirror.” (Nick Wall / Netflix)

Artificial intelligence is a sticking point in the months-long Writers Guild of America strike. It’s also among actors’ concerns cited in ongoing negotiations with Hollywood studios.

Always-prescient “Black Mirror” tackles these fears in the first episode of its new season.

The Season 6 premiere follows tech executive Joan (Annie Murphy), who is horrified to discover that streaming platform Streamberry — a thinly veiled parody of Netflix — has released a prestige drama called “Joan Is Awful,” which not only parallels her own life, but uses AI to generate content by surveilling her through her phone. After a sympathetic lawyer (Lolly Adefope) explains that Joan unknowingly consented to this by accepting the platform’s terms and conditions, she resorts to increasingly unhinged behavior in an attempt to shut it down.



While the episode’s premise that AI can create layered realities leans toward science fiction, SAG-AFTRA warns that Joan’s predicament is closer to reality than we might think.

“Imagine waking up to find you are the face of a new advertising campaign — and it’s a product you don’t want to be associated with,” an August 2022 statement from the union reads. “As technology has evolved, artificial intelligence-powered software has made it possible to create realistic audiovisual, video and audio content known as ‘deepfakes.’ It makes the above scenario not only possible, but a real threat to those who sign broadly written non-union contracts that allow for unfettered use of a performer’s image or voice.”

AI’s ability to generate video lags significantly behind its audio and text capabilities, said Ryan Schmidt, a partner at Bowen Law Group, so it will likely be a while before AI can produce anything as complex and realistic as “Joan Is Awful.”


“But there’s every reason to think it can get there,” Schmidt said, and the law is already running behind.


Lawsuits related to AI are often settled quickly — leaving the courts unable to establish legal precedent.

“That sucks, because we don’t get an answer from the court as to how we move forward with this,” said entertainment lawyer Wynton Yates of the Yates Law Firm. “It works out for the people who are the plaintiffs in those lawsuits because they are compensated … but as a whole, we don’t get answers.”


Left without case law to draw from, lawyers generally apply two classic legal concepts when tackling AI and the rights to a performer’s voice and image: the right of publicity and copyright.

The right of publicity prevents the commercial use of an individual’s name, likeness or other recognizable aspects of their persona without their consent. The concept increasingly dominates the dialogue around generative AI, as many studios now require performers to grant digital simulation or digital creation rights in their contracts.

Such exploitation is especially prevalent in reality television, said entertainment lawyer Leigh Brecheen, partner at Brecheen, Feldman, Breimer, Silver & Thompson.

“They literally try to own who you are,” Brecheen said.


Reality TV contracts typically require cast members to consent to more sweeping rights waivers, Brecheen explained. Often, the production companies become partial owners of any business someone creates as part of the show.

“In the past, I have had huge fights over those sorts of provisions,” Brecheen said. “I don’t think they were created with AI in mind, but they certainly would apply to AI.”

Under the National Labor Relations Act, companies are required to bargain with SAG-AFTRA before attempting to acquire these rights in performers’ contracts, the guild said in a March 23 statement about simulated performances.


In many cases, Schmidt said, preexisting contracts are being “reinterpreted and stretched probably beyond everybody’s original intention” because of their overly broad nature.

“That’s really the basis for the things that we’re seeing with both WGA and SAG-AFTRA,” Schmidt said. “We’re at this Napster 2001 moment where the music industry was, where we can either create really clear, fair standards, or we can kind of let it go wild. … It’s one thing if somebody wants to [use AI] to make a silly TikTok video, but it’s another thing if a film studio wants to do it and displace thousands of crew members’ jobs.”

From 1999 to 2002, file-sharing service Napster faced off against a slew of record labels and big-name artists like Metallica, who accused Napster of illegally distributing copyrighted material. Snowballing legal costs and mass resignations led Napster to file for bankruptcy in June 2002.

Now, with AI in the equation, the concept of ownership has become convoluted.

AI filters recently popularized on TikTok, for example, pose a danger to an individual’s right to their own likeness, Yates said. When someone feeds their likeness into an AI-powered image generator, that input and the resulting image enter the public domain, free for anyone to use for any purpose, commercial or otherwise; the new image is not protected by copyright law.

Even in cases where copyright protects artists or other creators whose property is used to train programs such as DALL-E and ChatGPT, tech developers say it isn’t feasible to trace their training materials back to their rightful owners.

“That is what these creators and developers are leaning on: not being transparent,” Yates said. “It would be a massive headache to try to figure out everything that was used and how it was used and where it came from and whether or not it fits within copyright infringement. And they’re just very lazily leaning on that as an excuse.”


With Big Tech shirking responsibility for reining in irresponsible and illegal uses of AI, Yates said, violations are only becoming more prevalent, and it’s inevitable that many will go to trial. He’s just not certain when that might happen.

Nor is he certain about how such cases will play out.

“I could say one thing because I feel like that’s what would make sense, and tomorrow, it could come out and go completely left, and I won’t even be shocked,” he said. “I just hope we can figure it out.”

As does SAG-AFTRA.

The guild has called for the new labor contract to include terms regulating when AI can be used, how performers will be compensated for its use and how studios will protect against misuse. The current contract expires at 11:59 p.m. Wednesday.

Hoping to avoid a strike, members of the Alliance of Motion Picture and Television Producers discussed potentially bringing in a federal mediator to help move the two sides toward a compromise.

Brecheen said she is optimistic that the guild will be able to sort out financial issues such as compensation for use of image and likeness, and residual rights. However, she also said the guild may have to be more modest about what limitations it can place on AI.

“The attempt to sort of step in front of the train and say, ‘Well, you have to use an individual actor in every crowd,’” Brecheen said. “That ship has sailed … [Saying] ‘you can only use it in certain ways’ or ‘you use it to augment a human performance’ [and] the human still has to be compensated — I think that is the way to go.”


SAG-AFTRA’s chief negotiator, Duncan Crabtree-Ireland, has promoted this “human-centered” approach to AI, saying the industry should not ban AI entirely but use it to guild members’ benefit.

If regulated properly, AI could revolutionize the entertainment industry. Actors could increase profit opportunities by licensing their likeness, productions could employ more special effects at a lower cost, and studios could automate pre-production and post-production tasks.

If left unchecked, though, it could put thousands out of a job — and leave the industry void of the human touch.

“Arts and entertainment is such a uniquely human thing,” Yates said. “Art brings us joy. Art brings us sadness. Art brings us an escape.”

“Why are we being so quick to give it away to artificial intelligence?”
