

Paul, Weiss Waking Up With AI

Deepfakes After Death

In this week’s episode of “Waking Up With AI,” Anna Gressel considers audio deepfakes and the complex legal and regulatory questions that arise when a deepfake uses the likeness of the deceased.


Episode Transcript

Anna Gressel: Hey, everyone. Good morning, and welcome to another episode of “Waking Up With AI,” a Paul, Weiss podcast. I'm Anna Gressel, and I don't have Katherine with me today, but we have a lot of interesting things to talk about, starting with a few big policy developments. I don't think we'll cover them in detail, but I wanted to mention we're sitting here at the end of January 2025 and it's a real moment from the AI policy perspective.

The Biden administration's executive order is officially out; it's been rescinded. And now we have a Trump administration executive order focused on US innovation and competition. And under that executive order, the leadership of the new administration has 180 days to essentially decide what to keep and what to get rid of from the last administration. So we're going to be in a period of flux for a while. And this tells us that it's possible we'll see some major policy developments even before that 180-day period elapses. But it essentially sets the clock running. So for folks doing their 2025 AI calendar planning — and I know that's a lot of our audience — set your reminders that this summer will likely see some really interesting developments on the US side of the pond in addition, of course, to some major milestones with the EU AI Act. 2025 is definitely off to an interesting start.

But putting that aside for the moment, we're turning back today to an important legal issue: likeness rights related to people who are deceased. And this isn't like bringing people back from the dead, although I know we could have done a whole Halloween episode on that. But today we're going to talk a little bit about audio deepfakes, which are really not new technologies. I mean, these could be any kind of deepfakes, but let's just take audio as an example for today. And I know Katherine wrote about some particularly thorny scenarios on audio deepfakes all the way back in 2021 for her column in the New York Law Journal. And specifically, she was focused on scenarios where someone's voice was made to say something they didn't say after they had passed away.

So that was before Katherine and I worked together, but we were friends and going to conferences at the time. And I remember that Roadrunner, that 2021 documentary about the late Anthony Bourdain's life, had just come out and the director revealed that a few of the clips of Bourdain speaking were AI-generated. The clips of Bourdain speaking happened to be vocalizations of text he'd already written. That's not quite as egregious as making up new content, but it is still fake, and the audience didn't know ahead of time that there were any AI-generated clips. More recently, a Mexican beer brand came out and started offering customers the chance to create deepfake short videos of their deceased loved ones in honor of Mexico's Day of the Dead. So that’s been prompting all kinds of questions about whether there should be safeguards to prevent bad actors from inappropriately using those services to impersonate and defraud people.

Stories like these raise so many interesting questions from a legal and regulatory perspective. Historically, US states have had varying degrees of protections for what they call publicity rights: basically the right to use someone's voice, image or likeness, usually in a commercial context. A lot of those laws typically come up when you need a celebrity's permission, for instance, before you show them endorsing a product. And the exact nature of those protections isn't the same across states. This is like a real 50-state patchwork. But the right of publicity doctrine came into being way before any of these deepfakes hit the scene. And it's worth noting that some states have specific right of publicity laws covering deceased persons, which can differ in really material ways from their general right of publicity laws. We're also starting to see a number of states like Tennessee, through its ELVIS Act, creating specific digital replica laws that try to make clear that AI recreations of someone's voice or likeness are still covered under the right of publicity protections I just mentioned.