Way Too Interested
Episode 3

DEEPFAKES with Rex Sorgatz and Henry Ajder

The words, "Podcast: Way Too Interested" over a collage of symbols representing hobbies and interests, including a professional wrestler, a video game controller, and a surfing woman. Below the collage, a yellow face peeks up at the viewer.
Writer and web designer Rex Sorgatz (The Encyclopedia of Misinformation) joins Gavin Purcell to talk about his obsession with deepfakes, why he's less worried about them now than he used to be, and why he believes a dead actor will win an Oscar someday. Then, deepfakes and synthetic media expert Henry Ajder joins to explain whether we should be worried about deepfakes — and how much — and to talk about "shallow fakes," revenge porn, and the perverse way the existence of deepfakes has affected our perception of real media.

Please follow or subscribe to Way Too Interested in your podcast app of choice! And if you liked this episode, then tell a friend to check it out.

Also ...

Follow Gavin on Twitter @gavinpurcell

Read Rex's book, The Encyclopedia of Misinformation: A Compendium of Imitations, Spoofs, Delusions, Simulations, Counterfeits, Impostors, Illusions, Confabulations, Skullduggery, Frauds, Pseudoscience, Propaganda, Hoaxes, Flimflam, Pranks, Hornswoggle, Conspiracies, and Miscellaneous Fakery

Learn more about Henry's work and research

Transcript - Part 1
GAVIN PURCELL: Today on Way Too Interested, I talk to my friend Rex Sorgatz about his fascination with deepfakes. Come join us, won't you?

[theme song]

GAVIN: That was the Gregory Brothers, and this is Way Too Interested, the new podcast where I, Gavin Purcell, talk to people that I'm interested in about things they can't stop thinking about. Then, in the second half of the show, my guest and I talk to an expert in that particular subject, and we do a deep dive.

It's a show about curiosity, creativity, discovery, and, more importantly, pursuing those tiny little things that end up being way, way, way more interesting than you ever expected. My name, as I said, is Gavin Purcell, and I've been interested in way too many things for way too long. I hope this is my opportunity to learn about a lot more stuff that I don't know about, and I hope it is for you as well.

Each week at the top of the show, I'm going to talk to my guest a little bit about their process for discovering new things, not just about the particular topic at hand. Everybody has a different way of learning and opening themselves to new ideas; that is really interesting to me as well, and I want to explore it with what we're doing. I'm a big believer in the idea that pursuing and following our interests makes us better people, no matter what, and I hope that this show encourages you to do that. And I hope that you're able to learn a little bit about the subjects we're talking about.

Today, I'm very excited because my guest is an old friend of mine, Rex Sorgatz. Rex has been around the internet for quite a while, he and I met each other a long time ago, and here are three interesting facts about Rex...

Number one, I discovered Rex and met him way back in the early 2000s through his excellent, excellent, very-early-to-the-game blog Fimoculous. He was very good at blogging and still is when he does it. Not that long ago he also had a pretty nice newsletter called Rex; I hope he gets back to it.

Number two, Rex wrote the very excellent book The Encyclopedia of Misinformation, which is one of those books you can open to any page and learn something new. It's also deeplinked throughout, so reading it is a little bit like choosing your own adventure; it's as close to a web-page experience as you can get in a book. I encourage you to go buy it. You'll hear us talk a little bit about it in the podcast, but go check it out.

And then number three: at one point in our lives, prior to the ubiquity of YouTube and the corporation saying everything was okay, both Rex and I attempted to get the entirety of the Saturday Night Live catalog online. We got a ways down the road; I pivoted off to work on the Late Night show and Rex stayed and worked on it a little bit further, but unfortunately, it never came to fruition. However, the design and the setup that Rex had made for that site were excellent. You'll never see it, but he did a fantastic job, because that is his other job: he's an excellent web designer and developer.

Okay, you can find Rex online at @fimoculous, his website Fimoculous, which still is up, and for sure, go check out his book, The Encyclopedia of Misinformation. But for now, here is my interview with Rex Sorgatz and his choice of topic, deepfakes.

Rex, welcome to Way Too Interested. This is the very first legit recording that I have done for this show, and honestly, probably the first professional audio recording, well, professional might be too far, but the closest to a professional audio recording that I've ever done. So welcome, and thanks for being here. I appreciate you joining.

REX SORGATZ: I am super stoked, this is exciting, to be the first of anything.

GAVIN: Oh, great. Well, one of the biggest reasons I started this show came out of the place where I discovered you, which was the mid-2000s blogosphere. One of the things I loved about back then was how you could kind of get lost on the internet, for the first time, in a weird way.

Obviously, the internet had been around for a while and I'd been doing some sort of internet stuff for a very long time, but Fimoculous was a great place to learn and discover new things. I was just tweeting with Anil Dash today, or Jason Kottke, or all these different people from back then, but you kind of got going in that scene, and it seems like a special thing. What do you attribute the specialness of that time to?

REX: Yeah, I'd say early 2000s is about the right timeframe you're talking about and blogs were just becoming a phenomenon, the word had only been invented in like '98, I think. And I was just a weird, curious kid in the midwest who got fascinated by internet culture and dove in via opening a blog very early. It had a weird name and I could have had like any domain at the time. Instead, I had to pick this weird word that people still can't pronounce right and don’t understand what it means.

GAVIN: Did I get it right, is it Fimoculous?

REX: Yeah I think so, it’s a version of a coronavirus, in a way. It's a microorganism that consumes its own excrement for sustenance, and I thought, "That's a great metaphor for the internet!" And yeah, I've always just been generally interested in emerging internet cultures.

GAVIN: Great. And you wrote a book, which I also want to talk a tiny bit about before we get into the topic we're going to talk about today. The book is called The Encyclopedia of Misinformation, which is great; everybody should go grab it. It's still over there at Amazon and it's a really fun read. Do you know Uncle John's, the Uncle John's books at all, Rex? Have we ever talked about this, Uncle John's Bathroom Reader?

REX: I do know those, yes. I like when people compare my book to Bathroom Readers. [laughs]

GAVIN: I will say this is the thing: Uncle John's Bathroom Reader is probably one of my favorite books of all time, and I think that's just how my brain works. The greatest thing about it is you open it up to any page and it's got like three different sizes of reads for how much time you're going to spend in the bathroom, but each page has some interesting thing on it. One of the things I love about books like yours is that it's kind of that thing too, right? You even said it in the intro: you can open it to whatever page you want and learn something different. But this book, particularly, is about misinformation, which leads into our topic a little bit. Why did you want to write about misinformation?

REX: Well, I'd love to say that I was ahead of my time and saw the emerging political moment, but I actually signed the book deal before the 2016 election. Right after I signed it, the world turned into chaos, and our media environment suddenly became suspect and polluted.

It was really hard to write because the topic just became huge. I don't think I've ever told this story: it was originally called The Encyclopedia of Fakery, and that was what it was sold as. We changed it to misinformation because the subject suddenly became more serious. But if you pick up the book, you can tell that there are elements in it that are still jokey and fun, even though the subject matter suddenly became serious.

It's interesting, the thing that I want to talk about is something that did not make the book, because it emerged just weeks too late. I actually asked the publisher if we could update it with just one additional entry. The book is literally an encyclopedia, I should say: short entries about little interesting things, and in many ways it's very bloggy. So I tried to get the subject matter that we're going to talk about into the book, but it just barely missed it.

GAVIN: All right, so let's get into that. I'm going to do this with every one of my guests, and it's going to be super annoying, but I'm going to ask you to say it in this exact way. I want you to say, "I [state your name] am way too interested in [blank]." This is going to be something I'm going to try with everybody and we'll see how it goes, but I'm definitely forcing you to do it.

REX: This is exciting. I, Rex Sorgatz, am way too interested in deepfakes.

GAVIN: Deepfakes, all right great. I am very interested in deepfakes as well. Clearly they have had a moment and the moment continues based on a lot of the misinformation stuff we've talked about. Before we get too far into this, though, because I have a feeling that whoever's going to be listening to this podcast probably knows what they are, but just in case, can you give like a general definition of what a deepfake is?

REX: Yeah, sure. It's really simple, and too bad this is a podcast, because it's one of those things you show someone and they go, "oh yeah, that thing." It's really just face swapping. You've probably seen those videos, like the famous Tom Cruise one going around at the moment, where it really looks like him. And you have some vague idea that there's some fancy technology that's come around in the past year to allow the creation of these. It's increasingly being used by Hollywood, but it's also interesting that there are apps out there that you can use, and it has really run through culture pretty wildly in the past 18 months or so.

GAVIN: Yeah, those apps are really interesting because they've made it so much more widespread. It's become a much easier thing for people to do. In the beginning, when I first saw them, it felt a little bit like, oh, this is Hollywood graphics effects, or not even Hollywood, but people who spend a lot of time in After Effects were able to do it. And now it just feels like it's pervading a lot more. Do you remember ... Because to me, there are two different things: there's the term deepfake, and there's what you would define as a deepfake. I think they're intertwined in some ways, but what was the first time you heard about a deepfake, or what deepfakes were?

REX: Yeah, I'm a creature of the weird parts of the internet and spend vastly too much time on Reddit, and because of that, I discover subreddits really early that are weird and interesting. One of them was called /r/deepfakes, and it's just such an interesting word unto itself; it sounds mysterious.

I remember going to it and there were videos in it. And if I remember right, the person who was producing the content for it had actually coined the term, and I could be wrong about that. That might be a good question for our expert. There were videos and I'm pretty sure the first one I saw was porn because it was Reddit and that's the internet.

I think the canonical first example is a video of the actress Daisy Ridley, her face lifted and put onto some porn scene. The first time you see it, it's instantly controversial and weird. The thing that's really remarkable about these is how realistic they look; whoever was making those first ones, you immediately knew, had mastered some technique, and the world instantly felt different because of it.

GAVIN: What's interesting to me is how we've been seeing this in movies and TV kind of forever, right? In some form or another. I think about Forrest Gump or one of those old movies where they digitally place somebody into something, or we now have more recent experiences of seeing much more significant versions of this, like in The Irishman or different places like that. But what about this made it different? Do you think it had to do with the fact that it was done by an anonymous person, and that it felt like it could be done by somebody rather than a giant army of people?

REX: Yeah, it was instantly like the democratization of the whole thing. Because the hidden promise of the whole thing was that seemingly anyone could now make these, and that it wasn't just CGI factories in Hollywood.

Because there were examples of Hollywood doing this; probably the most famous one was Rogue One, where they de-aged Carrie Fisher. They also brought back the actor Peter Cushing, who had died, and I think it was like 39 years later, he was playing himself again.

It's kind of a weird cultural moment for everyone, to experience this person who's dead now back on the screen, and that happened right before deepfakes started to emerge. So I think in a way you could say Hollywood was priming us for this moment where faces became suspect and anyone could be anyone. The interesting thing was that suddenly we were in a moment where anyone could make these, and it felt freaky.

GAVIN: Yup. You know what I remember as my kind of "oh shit" moment in the world of deepfakes? I had the same experience, I think I saw it on Reddit and I remember it becoming a pretty big deal. It was the video that Buzzfeed made, I don't know if you remember this, where they had Barack Obama doing a speech, but what you heard was not Barack Obama's voice. He was using different words and they made the mouth work perfectly. It looked exactly like Obama, but when you listened a little bit closer, it was Jordan Peele, and you could hear Jordan Peele's slight impression of him.

And that was my moment of, wow, this is interesting, because obviously taking somebody's identity and putting them into something as disturbing as porn is horrifying. But then watching the leader of the free world at the time, the president of the United States, kind of have it done to them, that was when it felt slightly more dangerous to me, because of what they'd be able to do with it. What was your experience watching that Obama one?

REX: I think it was Buzzfeed who did it and they were pretty early. I believe that was like the first major story about it. Although I think Nick Bilton at Vanity Fair wrote an early story about it too, but the interesting thing is that the media jumped on it. It's a great media story, right? Like, everything is suspect and the end is nigh.

I think most normal people saw it for the first time and also had the same reaction, which was, oh my God, this is going to lead to nuclear war. There was almost an instant kind of reaction to it that came up with doomsday scenarios. And with the administration in office that we had, one could imagine somebody releasing a deepfake of the leader of North Korea saying that he was going to bomb America, and suddenly no one realizes that it’s a deepfake and we retaliate. You could come up with scenarios where it just seemed like the worst possible thing that could happen.

At the same time, while that's going on, this subreddit that I'm obsessed with is going crazy putting Post Malone into The Office as Michael Scott. Everyone was doing all these goofy things with it. I think that's one of the most interesting things about it: the doomsday scenario never panned out. There were a lot of warnings before the 2020 election that the same kind of thing was going to happen, that somebody was going to release a deepfake of Biden and no one would know it wasn't really him. There was all this scare about it, and none of it really came to fruition. There were a couple of things that were a little controversial, but it turned out not to be all the bad stuff we thought was coming. And so I think that's one of the things I'm really interested in with this.

GAVIN: And one of the other things I think about with deepfakes is that I sometimes have a hard time discerning what is a deepfake versus what is photo or video manipulation. Maybe, as somebody who's pretty interested in this, can you describe the "deep" side of it? What does the "deep" part of deepfake refer to?

REX: Yeah I mean, the technology emerged because of an advance in artificial intelligence technology. I’m something of a technologist, so I stay abreast of these emerging things, but yeah, there's a term called deep learning, which has been in the literature for a long time. But it was borrowed by this person who started the movement, who came up with this clever term. And deepfakes is really clever, it resonates in some way, it sort of sounds like the deep stuff, it’s just a clever word. To be honest, I've never played around with it, I've never made one. I know there are apps out there that make them. I had this vague idea of like generative networks that produce them, I have some abstract idea of how they're made, but to be honest, I don't completely know.
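
[Note: for listeners who want to see what a "generative network" looks like under the hood, here is a minimal sketch of a GAN, the generative adversarial network Gavin brings up next. It trains on a toy 2D distribution rather than faces, purely to show the generator-versus-discriminator loop; nothing here is specific to any particular deepfake tool.]

```python
# A minimal GAN sketch in PyTorch, trained on a toy 2D distribution instead of
# faces. It only illustrates the generator-vs-discriminator loop; it is not the
# pipeline behind any particular deepfake app.
import math
import torch
import torch.nn as nn

def real_batch(n=128):
    # "Real" data: noisy points on a unit circle.
    theta = torch.rand(n, 1) * 2 * math.pi
    points = torch.cat([theta.cos(), theta.sin()], dim=1)
    return points + 0.05 * torch.randn_like(points)

G = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))   # noise -> fake point
D = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))   # point -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Teach the discriminator to separate real points from generated ones.
    real, fake = real_batch(), G(torch.randn(128, 8)).detach()
    d_loss = bce(D(real), torch.ones(128, 1)) + bce(D(fake), torch.zeros(128, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Teach the generator to fool the discriminator.
    g_loss = bce(D(G(torch.randn(128, 8))), torch.ones(128, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

[Scaled up to convolutional networks and large image datasets, the same adversarial idea is behind things like the StyleGAN synthetic faces that come up later in the conversation.]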

GAVIN: Yeah, I've played around with GANs as well. GAN stands for generative adversarial network, which people use to build models of stuff. I don't know if I told you this, but when the NFT thing first happened, I was screwing around like, oh, what kind of NFT can I make? And I plugged about 50 pictures of Batman into a GAN and called it BatGAN. It was a photo one, but it was, whatever, it was fine. Nobody bought it, of course, because nobody wants something interesting.

One of the things about these deepfakes that's interesting to me is, we had a lot of things with Photoshop where somebody would put somebody's face on somebody else, or even videos. One of the things that these deepfakes do really well, and that I think adds a significant believability factor, is audio, right? One of the things that I get kind of shocked by is that they've taken not only the video and perfectly matched it to the face, but they've taken the audio, pulled in all these pieces of the person's voice, and they're able to use those pieces of voice and pronunciation to make new words.

There's a guy on YouTube called 30 Hertz, actually he goes by 30 and 40 Hertz, I don't know why he goes by both names. Just a couple of weeks ago, as of this recording, he released a track where he wrote new lyrics for "My Name Is" by Eminem about things that are happening now, and used an AI to deepfake Eminem's voice onto it. I'm going to play you guys one of the verses.

[clip from Eminem deepfake]

So okay, the interesting thing about that to me is that the AI has done a really good job of grabbing his voice, but the other thing they did there is they clearly captured his lyrical voice. Whether or not you believe that Eminem's early-2000s lyrics are good, a lot of the time they're misogynistic and have a bunch of other problems, but they captured how Eminem sounds, both in his actual audio and in the sense of what he would actually say. And I think this is where this kind of crossover gets really fascinating and weird, because that's where identity gets mixed into what you can do with technology, right?

REX: Yeah, there are a handful of startups out there producing this, and it was part of an Adobe audio editing suite where you could do this. But I know roughly how it works: you upload like 30 minutes of someone speaking or singing, and then through the magic of AI, it somehow develops a library, in a sense, of that person. Then you can just type in anything and it will say it.

It's really freaky when you first see it; it's another one of those things where it's like, "Whoa!" And I agree with you, if the thing that emerged was simply a new kind of video that was face swapped, it'd be one thing, but the audio came at the same time. All of a sudden, not only did they look like the real person, they instantly sounded just like them, and it was unbelievable.

You could make them say anything, and you've maybe seen the demos with a side-by-side camera where you move your head back and forth, and the character, say Donald Trump, moves their head back and forth too, and it looks real. You talk and it's your voice coming out, but in his tone, and you can do it in real time now. It's amazing and freaky.
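
[Note: to make the workflow Rex describes concrete, here is a hedged sketch: fit a model on a short corpus of one speaker, then synthesize arbitrary text in that voice. The names VoiceCloner, fit, and synthesize are hypothetical placeholders, not any real product's API, and a working system involves far more than this.]

```python
# Hypothetical sketch of the voice-cloning workflow described above.
# VoiceCloner and its methods are placeholders, not a real library.
from pathlib import Path

class VoiceCloner:
    """Stand-in for a speaker-adaptive text-to-speech model."""
    def fit(self, wav_files):
        # Train or fine-tune on ~30 minutes of clean speech from one speaker.
        ...
    def synthesize(self, text):
        # Return audio of the cloned voice saying arbitrary text.
        ...

cloner = VoiceCloner()
cloner.fit(sorted(Path("speaker_samples").glob("*.wav")))   # placeholder folder of recordings
audio = cloner.synthesize("Words the speaker never actually said.")
```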

GAVIN: A lot of people might see this kind of thing and just go, wow, that's really crazy and freaky, and then kind of move on to their day. What about this particular technology has gotten you obsessed? Why exactly can't you stop thinking about deepfakes?

REX: I guess the first answer is I'm a weirdo. I'm interested in all attempts at subterfuge and deception, like I wrote an entire book about misinformation because of this. I'm interested in people's attempts to subvert reality. I think that that's probably the biggest thing.

The second thing though is that as I've kind of alluded to here, I think it's so interesting that the first time I encountered these, it was like whoa, this seems terrible. Then, two months later, I was like ah, this is no big deal, and I think that a lot of people are like that. And I guess what interests me is that it seemed like a microcosm of how society handles new technology, but I'm not sure if that's true. I'm generally interested in other people's reaction to them.

I'm that person, like, if you talk to me at a party, I'm going to bring up like this subreddit with these deepfakes on it and ask have you seen this newest one of them putting Tupac into SNL. I'm that kind of person who goes crazy over it.

So I'm interested in it as a social phenomenon, primarily, but I'm also interested in the tech. Like, something happened at some moment where everything leveled up. You could kind of do this before, but all of a sudden it was like, whoa, it got really good. I occasionally run into people who will say things like, oh, you can still tell they're fake.

And then I'll like dig around YouTube and I'll find like the Emma Stone one and I'll go, you cannot tell this, this looks real, and then I convince them pretty quickly. So yeah, I do have a strange obsession with it, I still spend a lot of time on that subreddit looking for new ones and it still fascinates me.

GAVIN: One of the things I think, and I looked at this earlier too, is the idea of identity with these too, right? Because I was thinking last night about that show Years and Years, do you watch Years and Years on HBO?

REX: I loved it, yeah.

GAVIN: It's such a fascinating storyline to me because I have daughters. Oh by the way, I've only seen the first three episodes because I stopped watching it because it was too depressing to me overall; I need to go back and finish it. Maybe something happens in the storyline that I don't know, so don't tell me about it.

But the teenage daughter decides that she wants to be essentially post-human, she wants to live a potentially digital-only life. And you start to think about things like deepfakes and how, in real time, like you're saying, you can really activate and become something else. I think what's going to be interesting, and I know you're interested in this too, is the idea of famous IP and different people being able to roleplay as those people in the future. Like, what is that going to mean?

And this technology, which we kind of see as scary right now, how much of it is going to be part of our culture in the future is really interesting to me. Because I believe whether you're role-playing as Emma Stone or as some tiger creature with crazy antennae, it's a similar sort of technology that's going to allow you to be somebody besides yourself, and doing it in real time is also an interesting thing, too.

REX: I think the Hollywood part of it is ultimately where I get really, really interested. I think it's fascinating that the technology emerged at the same time Hollywood shifted towards, depending on your view of the current state of Hollywood, producing these infinite reiterations of franchises and reboots and cinematic universes, basically IP ... things that were developed 20, 40, 60 years ago, redone and reformatted over and over again, the Marvel Cinematic Universe being the quintessential case. But everything, like Game of Thrones, is going to have seven babies soon and there are going to be all these spin-offs, and that really represents the future of entertainment.

How I think this ties to deepfakes is that each of those characters is IP, and they're valuable in their representation at a moment in time. So the scenario I always present is, I'm pretty sure that at some point an actor or actress who is dead, who has been recreated through deepfakes, will win an Oscar. If that sounds hyperbolic, what I would suggest is to look at the list of all the things that have been deepfaked already; there's a ton of them out there. Hollywood is already doing this.

People forget that The Irishman, that award-winning movie with De Niro and Pacino and Joe Pesci, has deepfakes all over it. They're de-aged a ton in that movie, and I think that's going to become way more common. And the scenario I always come up with is, I think there will be a point where, in one of the Game of Thrones spin-offs, they'll want to bring back Daenerys Targaryen as young as she was. It could be like 40 years from now.

GAVIN: Yeah, I was going to say is this like the 2070 Game of Thrones reboot, number four?

REX: It will be, it's way down the line, and people love that young character right? And the dragon, that first moment, everything was great. And Emilia Clarke will be too old, maybe dead, and they'll want to bring her back, they’ll want to bring back that character, and they will have the IP to do it.

There's a whole bunch of legal questions about using her face that I'm not wise enough to answer. I've talked to some entertainment lawyers about this; there are contracts being written right now that hand over your face along with your likeness. So everything is getting really interesting, and there's going to be a point further down the road where we just keep making movies with people who are dead.

I'm convinced that in 30 years, they're going to make another Tom Hanks movie when he's not around, it just seems to make sense, people will miss Tom Hanks. They can create him perfectly. He's already been in movies where he's jumped around in time and space, and so it makes sense already. And so yeah, I think actors are going to become superfluous to the whole thing. We'll just scan their faces in and we'll just go with it. We'll make the movie, we just need you to sign this little contract and you barely have to even be there.

You ever seen that video on YouTube of Gwyneth Paltrow? She's on some cooking show with Jon Favreau, and Jon Favreau mentions, "Hey, you remember that time we were making that Spider-Man movie?" And Gwyneth Paltrow goes, "What? I've never been in a Spider-Man movie," and Jon Favreau goes, "Yeah, you were." She could not remember that she was in a Spider-Man movie.

[clip from interview]

And of course she was, and it's laughable, and you kind of make fun of Gwyneth Paltrow for it, but at the same time, that's how movies get made now. They show up on set for three days, three days that she apparently forgot.

GAVIN: Probably while she was shooting one of the other Avengers movies at the same time, so she just walked over in the same character.

REX: Yeah, they don't need her around that much. It's like, you got everything scanned in, it doesn't look like a movie. It's like a weird green screen that you're on, you just say a few lines and you walk off, and you don't even know. And so I think that's the future of acting for a lot of these people, it's going to be their likeness reproduced in franchises, and that’s it.

GAVIN: That's crazy. So, we're going to bring on our expert as per the format of our show, his name is Henry Ajder, and he's an expert on deepfakes, he's from the UK, and this was somebody that you suggested. What do you want to know from Henry? Where do you want to go deeper on this, that you feel like you want some information filled in for you?

REX: Well, the first one is the actual wherewithal to make these, which I really don't have. I haven't made one, I know that there are libraries, I know there are even apps out there that make them. So I'd like to know more about the process: how hard it is, and how good they are once you make them with one of these apps. Are they as convincing as the ones that I see on Reddit? That's the first thing.

The second thing is, I'm curious about the general flow of how society came to accept these, and whether it mimics the thing that I described, which I call WTF to meh: it went from "oh God, this is terrible" to "ah, this is just Photoshop" overnight. It just felt so fast. I think that's interesting, and I'm curious if maybe he went through it feeling the same way, or if he's still freaked out about it and thinks these things could bring an end to the world.

And then finally, like I'm interested in like this Hollywood entertainment question, and it is the thing that I bring up with people at parties all the time. I'm curious if someone who has more expertise in the area, if they have thoughts about what all this means for the future of entertainment.

GAVIN: Great. I'm actually also interested in something I saw doing some research here, which is the term shallow fakes. Have you heard about shallow fakes?

REX: I have.

GAVIN: It's a new variation on this, and I kind of want to hear what his definition of it is so that we kind of see the difference between a shallow fake and a deepfake. And kind of get a sense of what it means to have each of those things happen.

REX: Yeah, that sounds great.

GAVIN: Great, we'll be right back with Henry and Rex.

All right, we will be right back with our expert interview, but before we do, I wanted to take a quick second. Normally this is where the ads in the show would go. I don't have ads because I'm just starting, and I may never, who knows? It's the beginning of a podcast, but I want to take this time to recommend some books.

There's a series of books on my shelf that I've purchased, not just books that I've read multiple times, though I have read all of these, but books that I find inspiring, and books I hope you might find inspiring as well. Sometimes these books are really hard to find in our lives, and I love hearing which books other people find inspiring. So if you have some, please tweet them at me, because I love reading these kinds of books and finding my own versions of them, too.

Today's book is slightly unusual for me, it's a biography, and I really don't read a lot of biographies. I read a fair amount of nonfiction, but not a lot of biographies, but this is a biography of a figure that I've been a fan of, and I kind of creatively idolized for a long time. It's Jim Henson. Jim Henson who was the creator of the Muppets and Sesame Street and a bunch of other incredibly amazing things.

This is a book by Brian Jay Jones and it's basically his life story. Sometimes these books can feel kind of heavy and drag you down, and be like, well, that's a significant amount of reading for one person. But this book never felt that way. It also gave me some really good insights into how to live a creative life, and to know that it's a give and take, that there's lots of stuff you're going to do in your life, and to keep working on it along the way.

So I can't recommend it more highly; it's called Jim Henson: The Biography, by Brian Jay Jones. He's also on Twitter, a fun follow, so go do that. And now, let's get back to the show. Henry Ajder, our deepfake expert, is about to join us with Rex Sorgatz.
Transcript - Part 2
GAVIN: All right, welcome back to Way Too Interested. We are joined now by our other guest, and just to remind everybody, this show is about somebody who's super interested in something, and then we bring on an expert to help fill in information. Today, we're joined by Henry Ajder. Henry, why don't you introduce yourself and tell us a little bit about your background?

HENRY AJDER: Sure. I guess I can say that I'm an expert in deepfakes and synthetic media.

REX: Perfect!

[laughter]

GAVIN: Are you sure you're real, do we know that you're real?

HENRY: Yeah, that's a question I ask myself most days at the moment, with the whole COVID situation. But yeah, I'm an expert in deepfakes and synthetic media. I've been researching the topic since it first emerged in late 2017, and have done quite extensive research on the space, primarily through my role as head of research at the world's first deepfake detection company.

GAVIN: Great. And before I let Rex go, what is the first deepfake you remember seeing? And the other question that came up earlier is about the term itself, where the word deepfake comes from. So, what is the first deepfake you remember seeing, and is it different from what we had seen before?

HENRY: It's quite unpleasant, to be honest, but the first deepfake I remember seeing was on the original subreddit, when I was doing research on this space. The first deepfakes, where the term was officially coined, came from Reddit, where there was an existing community of people who used traditional Photoshop tools to superimpose celebrities' faces onto pornographic images.

And then one day this new subreddit emerged, called /r/deepfakes, with a user called /u/deepfakes, who let everyone know that he'd created, using open-source software libraries, a new tool for synthetically swapping faces, not just in images but also in videos. So the first deepfakes that I saw were, rather shockingly, on these subreddits, where a lot of female celebrities were non-consensually being swapped into pornographic footage.

But I think as you guys probably have discussed, the term was originally coined exclusively to refer to those pornographic uses. But since then, in the three years or so since that first use case emerged, deepfakes now refers to a much broader range of uses and applications of synthetic media.

GAVIN: All right Rex, go for it.

REX: Your experience mimics mine. I was an early consumer of that subreddit, and I don't know if there's a canonical firstness to the whole thing, but the one I remember is the Daisy Ridley one, and it was truly disturbing, from both the "well, somebody definitely did not consent to this" aspect and the "whoa, that's realistic, how did they do that?" aspect.

And so I guess my first question is really, is that right? Did everyone else experience it the same way, like it was just this moment where whoa, this is much more accurate and better... Was it simply just like the technology leveled up, and that's what happened, what was that moment exactly? Did I experience it the same way as everyone did?

HENRY: Yeah, that's a really good question. I think it applies to that moment for you and me as, I guess, researchers on misinformation and malicious uses of artificial intelligence, which is how I first found out about it; we probably found it in a similar way. As you mentioned, the technology behind synthetic media and deepfakes has been developing since about 2015, maybe even a bit earlier, and as happens with academic and industry research, it trickles down.

So what is first contained within academic computer science labs or industry labs starts to become open source, starts to become something that people can replicate, you know, amateur hobbyists and people like that, especially through these open-source libraries of code that Google and other platforms offer. People can start cobbling together pieces of software in a way that they previously couldn't, and that's how deepfakes, as we first saw them, emerged.

REX: One of the interesting things about that subreddit early on is that it wasn't just a library of examples, it was also a how-to forum. There were GitHub libraries out there you could download, and there was a lot of Q&A-type stuff. That subreddit eventually got banned, understandably, but it re-emerged as, what is it called now, safe-for-work deepfakes?

HENRY: Sort of. On Reddit, deepfake creators recreated the subreddit in a safe-for-work capacity, but the original /r/deepfakes subreddit went underground. That content just started emerging on non-Reddit forums, new dedicated hubs essentially, as we see with more conventional pornography. So a lot of the community moved into the darker spaces online.

REX: Ah, well you know a darker part of the internet that I don't, and that's good to know. There's always worse.

HENRY: There’s always worse, exactly. It can always get darker.

REX: So I guess that's my next question. I have some technical wherewithal, I write code and stuff, but I've never played around with this. I have this vague idea that there are these apps out there that I've never installed, but I see the outcome of them. I see these things being produced all the time, and I'm wondering how hard it is. How much time would I have to put in to insert Emma Stone into Lord of the Rings? You know what I mean? Like these outrageous things you see. Would it take me a few hours in one of these apps, DeepFaceLab, I think, is one of them? Would it take me a small amount of time, or would I have to spend hundreds of hours to do this?

HENRY: Yeah. So in terms of creating deepfakes, that's one of the areas where we've seen the most dramatic change over the last three years. Because as you mentioned, the first tool on Reddit was accompanied by a really complex guide on how to use it, and you had to have some background understanding of code to use it properly. Typically you had to piece parts of it together yourself.

Whereas if you fast forward to today, you're absolutely right, there's DeepFaceLab, which is the most popular open-source creation tool, and it's now used by professional deepfake creators and hobbyists alike. People who are now working with VFX studios have really mastered the art of using that tool. And to create a really high-quality deepfake, like some of the ones you've probably seen on YouTube, does take a lot of time, not just in terms of expertise, but in training the model right.

So for example, the really famous Tom Cruise deepfakes that recently went viral on TikTok, the guy who created those is a guy called Chris Ume, he trained that model for I think two months to get a face swap of that quality. And that was before he then did manual post-production work on the final outputs. So high-quality deepfakes still require a lot of expertise and a lot of time.
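
[Note: for the technically curious, the classic face-swap recipe behind the Reddit-era tools and DeepFaceLab-style software is an autoencoder with one shared encoder and a separate decoder per identity. The sketch below is a minimal, illustrative PyTorch version of that idea; the layer sizes, 64x64 resolution, and random stand-in data are assumptions rather than DeepFaceLab's actual configuration, and real training runs for weeks on thousands of aligned face crops, as Henry says.]

```python
# Minimal shared-encoder / two-decoder autoencoder, the core idea behind
# classic face-swap deepfakes. Sizes and data here are illustrative only.
import torch
import torch.nn as nn

encoder = nn.Sequential(                                  # shared: learns a generic "face code"
    nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
    nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
    nn.Flatten(), nn.Linear(64 * 16 * 16, 256),
)

def make_decoder():                                       # one per identity (A = source, B = target)
    return nn.Sequential(
        nn.Linear(256, 64 * 16 * 16), nn.Unflatten(1, (64, 16, 16)),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
    )

decoder_a, decoder_b = make_decoder(), make_decoder()
params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
opt = torch.optim.Adam(params, lr=5e-5)
loss_fn = nn.L1Loss()

faces_a = torch.rand(8, 3, 64, 64)                        # stand-ins for aligned face crops of person A
faces_b = torch.rand(8, 3, 64, 64)                        # ... and of person B

for step in range(10):                                    # real runs take weeks, not 10 steps
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad(); loss.backward(); opt.step()

# The swap itself: push a frame of person A through person B's decoder,
# reconstructing B's face with A's pose and expression.
swapped = decoder_b(encoder(faces_a))
```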

GAVIN: Can I ask a tech question, Henry? With the Tom Cruise one particularly, he famously used a Tom Cruise impersonator and then put Cruise's face over him. From a technical perspective, are we still pretty much in face-only territory, or is there a possibility that deepfakes can go further? Because if I think of Tom Cruise, there are a ton of movies out there that have him doing his thing. Can they do that yet, or are we still a ways away from full-body deepfakes?

HENRY: So full-body synthetic media is already here, in a very crude form. That's one other thing with deepfakes: certain forms of deepfakes and synthetic media are more advanced and accessible than others. With body transfer, as it's generally called, there are apps, I think one is called Jiggy, where you get a picture or a video of, say, Bruno Mars dancing, then upload a picture of yourself, and it animates your photo dancing like Bruno Mars. Which for someone like me is very useful, because I can't dance to save my life.

GAVIN: Are you telling me that JibJab was the first deepfake, is that where this is going? JibJab gets the deepfake moniker.

HENRY: The Christmas cards my mom used to make with us as elves ... Yeah, so body transfer and synthetic body movement, again, it's coming. It's just at a much more nascent stage than, say, face swapping, which is by far the most predominant form. And face swapping in a cruder form is accessible now on smartphones with a single image.

So face swapping in particular is super accessible now, but high-quality deepfakes of any kind are still very much the preserve of VFX experts or people who have spent years learning to use these open-source tools very well.

REX: I think my next question is, I'm looking for people who maybe had a similar path of understanding this as me. I think a lot of people, as I understand, the first time they see this, they are kind of blown away too. I was talking to Gavin about it, and I described the process as "WTF to meh," meaning just in a matter of weeks or months, I went from "oh my God, this is terrible for society" to kind of going, "ah, this isn’t really that bad."

Within a very short amount of time I was thinking, this isn't really that different from Photoshop. And if you know the history of creative media, you know that there were lots of freak-outs about Photoshop at the time, and about what it meant for verity and truth. I guess I'm wondering what your view of it is: are you still in WTF, or are you more in meh, like me? Are you worried? Because there was a lot of scary stuff proposed at first, and now I find myself not worried.

HENRY: So that's a really interesting framework to look at it from. From my perspective, having been an expert on this for a while and talking to the media about it, at the moment, once every couple of months a really good deepfake will come out which will scare everyone. They will have that WTF moment, and people who perhaps haven't heard of them before will see that first Tom Cruise deepfake and be like, "Oh my God, this is so good, it's so easy, and this is going to be the new age of fake news."

I think the media really capitalizes on that quite visceral, very personal experience of being fooled by deepfakes, the kind of thing a lot of people quite quickly panic about. And in my research, that's one thing I've tried to do, you know, to delve into what is actually happening now and dispel the hypotheticals of "well, this might be coming." Actually seeing: what are we dealing with right now? What is the real problem? And on that framework, we haven't seen a huge amount of change over the last three years in terms of threat vectors or different kinds of attacks.

The vast majority is still non-consensual pornography, which is awful and which we need to do more to combat. It's a problem that thousands of women are being harmed by. But the fears of deepfakes enhancing disinformation campaigns, or of deepfakes undermining evidential proceedings in courtrooms, things like this have yet to materialize in a meaningful sense.

Having said that, I still think that we do need to be worried about it, I think it’s something we need to be very vigilant about, because I think the Photoshop analogy is a good one, right? Photoshop scared a lot of people, rightly as you said, but now kind of like it's just ubiquitous. It's just a part of our media diet in a way that I think most people don't realize. Like, how many people don't know that the images they see are photoshopped in subtle ways? How much has filtering and that kind of culture impacted us?

And as with Photoshop, even really crude video edits, like slowing down audio or speeding up video, have contributed to disinformation that's gone viral in the US and around the world. It's caused deaths in India, it's caused massive problems in places like Brazil, and France has had some really big problems with it as well. So I think there's a middle ground we need to take, which is that we need to be aware of the threats that are emerging. We need to be aware of how we're perhaps not quite as infallible as we like to think a lot of the time, but that doesn't mean we should be losing our heads and panicking. It means understanding where we're going and what we need to do before it's a problem, to make sure that it doesn't become one.

REX: That's interesting, because so much of the stuff I consume in this space is like, "here's a video where every person on the screen is Dr. Phil," or we're going to take that SNL sketch with Trebek and Sean Connery but face swap in their actual real faces and make it absurd. It's all this joke stuff. And I've heard this stat that somehow the vast majority of deepfakes that are created are revenge porn, effectively.

HENRY: That was my research, yeah, 96%.

REX: Yeah, and I know that that's out there, that it's young dudes doing something bad to their ex-girlfriends, gathering up their iPhone libraries of pictures and making these videos of them. But that's not what trickles over into the media world, and it's not what ends up on Reddit. It's probably in those dark corners of the internet that you're on, that I'm not a part of.

In the last election, there was this example where a Nancy Pelosi video appeared on Facebook. It was not an actual deepfake, but it sort of seemed like one; it was just an effect applied to the video. They slowed down the recording, which slurred her speech and made her sound drunk. It had a deepfake-like effect. And I think there's a lot of stuff out there like that. Gavin, didn't you have a question related to some of these?

GAVIN: Well, one of the things I wanted to ask you Henry was that I read in a couple of the articles that you had been quoted in about this idea called a "shallow fake." And I really wanted to know what the definition of shallow fake is. Is it just another term for manipulated media that removes the deep learning part of it?

HENRY: Yeah, that's a great question, and a really important part of the puzzle that I think people need to be aware of: deepfakes fit under the broader heading of manipulated media, but there are other kinds of manipulated media that came before them.

You know, Photoshop being one of them, but also shallow fakes. Shallow fakes more or less refers to crudely edited pieces of audiovisual media that use techniques like the ones I mentioned a minute ago, right? Like slowing down the audio to make Nancy Pelosi sound drunk in that video, which was then shared by the president at the time.

Or, in the case of CNN's White House correspondent, Jim Acosta, speeding up a video to make it look like he violently wrenched his microphone away from a White House intern, which the White House then used to justify revoking his press pass. Those are classic examples of shallow fakes.

Then you've also got just out-of-context media. There's that famous photo of a shark swimming next to a car in flooded Florida, which gets shared every time there's a hurricane or flooding. That's Photoshopped; it's a really crude Photoshop of that shark. But again, it works when there are disasters and people are panicking, or when people aren't thinking critically. You have videos of bombings or videos of disasters which are taken from completely different countries and times.

So shallow fakes really is a broad term to describe those cruder forms of media manipulation that don't, as you said, use any deep learning or sophisticated artificial intelligence. They are very crude, but they work really well and require much less effort right now. Which is why, again, some of the fears that deepfakes are here right now and causing really big problems are overblown. But it's also why we can see that, if deepfakes become as sophisticated as it looks like they will, they'll be an incredibly powerful tool for these same kinds of bad actors to misuse on a scale we haven't seen before. So yeah, that's the rough framework.
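
[Note: as a sense of how little technology a shallow fake needs, a slowdown like the Pelosi video Henry describes can be approximated with a single ffmpeg call, wrapped here in Python. The filenames are placeholders and ffmpeg is assumed to be installed; the point is simply that no machine learning is involved.]

```python
# Crude "shallow fake" slowdown: play a clip at ~75% speed.
# Requires ffmpeg on the system path; filenames are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "speech.mp4",
    "-filter:v", "setpts=PTS/0.75",   # stretch video timestamps -> slower playback
    "-filter:a", "atempo=0.75",       # slow the audio without shifting its pitch
    "slowed_speech.mp4",
], check=True)
```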

GAVIN: When I hear deepfakes now, I think of what Rex was saying before, about how deepfakes can be anything from behind-the-scenes work on an entertainment property, to revenge porn, or, at a scarier level, a world leader saying something they never said. It sometimes feels like there's a little bit of a bogeyman factor going on with the term deepfake. How do you feel about the fact that the world at large now thinks of deepfake as a very negative term? I would assume that that's the case.

HENRY: Yeah again, that's a really good question. I think broadly speaking, my approach to deepfakes and synthetic media, which is a more neutral term to describe the broader uses of the technology... My attitude is trying to inject nuance into this conversation. So there are really awful uses of synthetic media or deepfakes, which we need to be worried about, we need to pay attention to. But also, there are lots of uses of synthetic media which promise to revolutionize our world in the commercial space and potentially even socially beneficial ways.

And this technology is not going away, it's becoming increasingly adopted by big players and is maturing. We need to prepare for a future where synthetic media is playing a really big role. Part of that may well be, as you alluded to, thinking about what we understand deepfake to mean. So some people will think deepfake kind of refers to AI-generated synthetic media that is created with the intent to deceive for malicious purposes.

So as you said, it has got a negative connotation primarily because it evolved exclusively from its use in non-consensual pornography right? That's how the story first broke, that's how people first came to know of it. But then again, there are also people who are like yeah, I'm a deepfake artist, this is what I do, and they've kind of embraced it. And they're trying to use the term in a more neutral sense.

The trend I see, again, talking to the people who are looking to commercialize this technology for the entertainment sector, or looking to use it in advertising and things like that, is that they tend towards synthetic media precisely because it doesn't have that negative connotation attached. But you're absolutely right, deepfakes have become a bogeyman that a lot of people are really worried about without necessarily understanding the full nuance of what the term refers to. And while we do need to be concerned about certain use cases, there is a much broader change in the zeitgeist of media going on here, which needs a much more balanced and broader conversation to fully understand its significance.

REX: On that same theme, I'm curious if you have any theories about why we never had the dystopian moment. Instantly, you can think of the worst possible thing happening, especially around the 2020 election cycle. There was a lot of hype that something terrible was going to spread on Facebook and change the course of the election, that we were going to blame deepfakes for electing the wrong person, and that never happened.

And you've done a good job of realigning my thinking that there is a bunch of bad stuff that still happens, and you should still be aware of that. But at the same time, the worst possible stuff never happened; not even the second-tier stuff, I'd say, really happened. I'm curious if you have theories as to why. Is it that the videos aren't good enough, is it that people aren't as malicious as we thought they were, or is it some other reason?

HENRY: I think the key thing for me here would be that it hasn't happened yet. Like I mean, the timeframe that people are continually saying is the next election, the next event, and people will continue to say that until it happens.

I remember in the midterms, people were saying deepfakes are going to be a big problem here and they weren't. And in general, we haven't seen at least a confirmed case of deepfakes in a political context around the world being used with devastating effect. As you said, kind of making someone appear that they've said something they haven't, or putting them in a place that they weren't. It hasn't materialized yet that I'm aware of.

We have seen hints of it, though. So, for example, before the midterms, a Russian interference campaign used the idea of deepfakes and a fake Marco Rubio quote to say the Democrats were spreading deepfakes of Republican candidates. We've also seen, with the dossier about Hunter Biden that was apparently leaked by an intelligence officer in this election, that the supposed agent who leaked those documents was using a StyleGAN image, which is an entirely synthetically generated image of a person who doesn't exist.

So we're seeing hints of how synthetic media and deepfakes are entering the political sphere. But I think the biggest impact we've seen is how the idea alone is causing a massive destabilization of political discourse around the world. And specific to the US, I would say we are increasingly seeing, on places like 4chan and 8kun, you know, QAnon people reacting to Trump saying he's conceded, or Trump saying the rioters at the Capitol were wrong in what they did, and they'll say, "No, that can't be real, that must be a deepfake."

REX: I think it's really interesting that, in some ways, the thing it's done to society that's worse than anything else is that it's made real documents, real media, seem more suspicious. The larger societal effect is a larger sense of distrust about things that are real, but that people think could have been manipulated.

HENRY: Absolutely. What deepfakes have done, and it's a weird, perverse relationship, is that the more awareness is raised of deepfakes, which is important, right, so people know the technology is out there, the more it poisons the well of what people think is now possible, and that impacts their perception of authentic media. Particularly if there are cognitive biases that predispose them to want to believe a certain piece of authentic media isn't real.

So you have deepfakes not just making fake things look real, but also providing a plausible way to dismiss real things as fake. Just as an example, and this has happened several times around the world, particularly in less economically developed countries: with the current situation in Myanmar at the moment, a confession was released of one of the ministers saying that he colluded on crimes with the president, or the prime minister. And this video was seized on by protestors saying, "This is a deepfake, this isn't real, this is not him actually confessing. This was created by the military to try and incriminate him."

And I did an analysis on this, as did some of my friends and colleagues who were specialists and we found no evidence of manipulation. It's not impossible that it happened, but if it did, it must've been an incredibly good job. But the narrative had spread that it was fake and people had just assumed and believed deeply that this was not real, because he could not have said that, and he must have been faked in order to get that confession.

That same narrative played out in Gabon, in Africa, where there was an attempted coup because people didn't believe the president was alive anymore, and thought the video released of him to confirm his health was a deepfake. That's the biggest problem we're seeing with deepfakes in politics right now. And I think the malicious uses will come, which gets to your other question: why hasn't it happened yet?

I think you touched on it yourself: the realism is not sufficiently good to pass convincingly without a lot of work and expertise. And also, let's say a state actor were to try and do this, they'd likely have to create their own model to do it well. And as soon as that model is released into the wild, people will understand how it works and be able to pick up on the fingerprints it leaves behind on the deepfakes it generates.

GAVIN: So one of the things I watched recently was that Netflix documentary about art fraud, you know what I'm talking about? Obviously, in that situation, figuring out whether something is real is the most important thing in the world. So what are the fingerprints that these tools do leave behind now, so that you can prove something is fake? Is there a specific thing you look for?

HENRY: Yeah, so there are some, and again, it depends on the kind of media you're looking at, right? Obviously certain images, using certain generators, will leave behind different things than, say, a moving face swap on a video. But some key ones you can look for are things like the mapping of the face when the face or the head is moving in different dimensions. Typically, face-swapping models are really bad at handling faces that are looking down or looking up, or turning from side to side. Same with things like occlusion of the facial area: if someone is smoking a cigarette or taking off their glasses or something like that, typically that's where you'll get flickering of the face that was originally underneath. Even though there isn't really a face underneath; it's a composite.

So those are some you can get on videos. Blending of the skin tones can also be quite a good telltale sign, as can visual artifacts in the synthetic images, these StyleGAN images: things that shouldn't be there, especially when the model is trying to recreate things like jewelry, where earrings turn into these weird swirly shapes. Same again with strands of hair, where you get these weird swirly patterns.

So there are some telltale signs, but the problem is, someone may listen to this podcast, hear these telltale signs, and come away saying, "I now know what to look for." And in six months, those things may have been trained out of these models; they may have improved to the point that those artifacts are no longer there. But there isn't necessarily going to be a timely podcast reminding them that those things are suddenly no longer in these images or videos. So I'm really hesitant to say with confidence: if you don't see these, it's not a deepfake.

Because that gives false confidence, and indeed we've seen the danger of false confidence in deepfake detection tools with this Myanmar case I just mentioned: those tools, I'm pretty sure, got it wrong, and they said confidently, "this is a deepfake," which is really dangerous in a political crisis like that. People falsely having confidence that something is real or fake could be the difference between an attempted coup or an election being declared void. So it's a really difficult problem. There are some signs now, but there's no guarantee they'll be there in six months, in a year, or maybe even sooner.
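(For readers who want to see what a crude version of these checks might look like in practice, here is a minimal illustrative sketch, not anything Henry or his colleagues actually use: it assumes Python with OpenCV installed, uses OpenCV's stock Haar-cascade face detector, and flags frames where the sharpness of the detected face region jumps abruptly from one frame to the next, a rough stand-in for the "flickering" Henry describes. The function name, example file name, and threshold are invented for illustration, and, as Henry warns, heuristics like this go stale quickly.)

```python
# Rough, illustrative sketch only; not a real deepfake detector.
import cv2


def flag_face_flicker(video_path, jump_threshold=150.0):
    """Return frame indices where the face region's sharpness changes abruptly."""
    # OpenCV ships this pretrained frontal-face Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(video_path)
    flagged, prev_sharpness, frame_idx = [], None, 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]  # take the first detected face
            face = gray[y:y + h, x:x + w]
            # Variance of the Laplacian is a common sharpness/blur measure.
            sharpness = cv2.Laplacian(face, cv2.CV_64F).var()
            if prev_sharpness is not None and abs(sharpness - prev_sharpness) > jump_threshold:
                flagged.append(frame_idx)  # sudden blur/sharpness jump: worth a closer look
            prev_sharpness = sharpness
        frame_idx += 1

    cap.release()
    return flagged


# Example usage (hypothetical file): print(flag_face_flicker("clip.mp4"))
```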

REX: I'm curious, has anyone taken some of the best examples from Hollywood? The example I cited earlier was The Irishman, with De Niro and Pacino in it, the Netflix movie that won a bunch of awards, had a ton of de-aging in it, and used deepfake-like technology. Has anyone on the research side taken things like that and found out whether they can detect the seams of manipulation in it? You know what I mean? That's an example of the best high-end technology applied to deepfakes, the most deceptive kind, and I'm curious whether researchers are even able to identify something that good.

HENRY: As in, are researchers able to detect, in an automatic sense, the manipulation in The Irishman?

REX: Yeah, can they see the artifacts of manipulation within that?

HENRY: Within the original version that was released on Netflix?

REX: Yes.

HENRY: So The Irishman is an interesting case, because a lot of the VFX community and a lot of the deepfake community thought they did a really bad job.

GAVIN: [laughs] Really?

HENRY: Yeah, there were actually a couple of videos out there of people who tried to recreate scenes from The Irishman using face swaps built from footage of De Niro when he was younger, to do a better job than the synthetic de-aging that those manual, click-by-click VFX artists had done. That's been done with Star Wars as well, with Princess Leia, with Carrie Fisher, who was put in alongside Grand Moff Tarkin and … I can't remember the name of the actor. But it's been done quite a few times by deepfake artists who've said, "Actually, yeah, this isn't that good. I can do this better on a gaming PC in my mom's basement."

REX: Yeah, that's interesting, because Gavin shared a video with me. There's a whole sub-genre on YouTube of graphic artists, people who do this work in Hollywood, taking old movies and making the effects better. What's the one you shared with me, Gavin?

GAVIN: It was The Scorpion King, they redid The Rock's face as he came out. You know what I’m talking about?

HENRY: I know the scene you're talking about, and the original hasn’t aged well.

GAVIN: It's so bad.

[clip from Corridor Crew video]

GAVIN: But these guys did a great job. This channel, basically, they're VFX artists, and they go in and take the original scene, which got panned hugely, and use deepfake technology to change it, and it's incredibly better. And this is four or five guys with a reasonably powerful computer, but it's not ILM; it's a pretty small operation. And they're not working from the original files either, they're literally just layering this on top of the finished master.

REX: It's interesting that there's this whole genre out there of people saying, oh, I can do better than Hollywood, I can improve this.

HENRY: I don't know specifically the channel you're referring to, I imagine it might be Corridor Digital who have done some amazing work with deepfakes. They are VFX specialists, and they kind of made some deepfakes of Keanu Reeves. They did an original one of Tom Cruise, which wasn't as good as the more recent versions, but yeah, absolutely. I know a lot of the people in the community who can make deepfakes professionally.

They started off doing this as kind of a fun hobby on YouTube, and they were like, oh, a hundred thousand views on this video, that's great. They were just doing it for fun. Now these people are being consulted, or they're consulting or providing their work to massive studios. The South Park creators, for example, hired a load of deepfake artists for an entirely new studio and an entirely new deepfake satire show they created; the first video was released on YouTube, and they said it must have been the most expensive YouTube video ever made. These people are artists. And there are also a lot of people who are really good at using this technology who use it for awful purposes, with non-consensual deepfake pornography or image abuse.

But what I meant earlier when I said this technology is maturing is that there's a growing number of respected people who recognize the creative potential of deepfakes and synthetic media, who, as you said, have de-aged people in old films or colorized old films using similar kinds of technologies. We've seen a new film commissioned featuring a synthetic version of James Dean, which is going to be coming out. In a French soap opera, an actress got COVID? No problem, you deepfake her into the scene. Snoop Dogg records an advert for the company Just Eat, which I guess is like Grubhub, the takeaway service you guys have in the States. He records the advert once, but they want to repurpose it for another region of the world? No problem. You just hire a company to synthetically change his lip movements to match a new audio track.

This is truly going to revolutionize all of our content creation and content consumption. And there are some really interesting questions about the ethically ambiguous use cases, right? Like, not just the really explicitly malicious ones like disinformation or non-consensual deepfake pornography, but the use cases where you are synthetically resurrecting someone, and you have to ask questions like, do I need consent to do this? Can I meaningfully consent to my likeness being replicated? In what cases don't you need consent?

REX: You got into my third big question without me even getting to ask it. It was really about the future of this for entertainment. I have this term I like to use in these conversations where I corner people at parties and show them deepfakes, which happens a lot: I call it the "eternal celebrity." It seems really interesting to me that this technology has emerged at the same time as a huge transition within Hollywood, within entertainment, towards the franchising of everything. Everything now is a reboot or a cinematic universe. You went through a good list of the cases that are happening, and I think people overlook how much is actually happening right now.

And that James Dean thing is a really good example of what I think is coming: I think we're going to start taking dead people and recreating them. And I think within 20 years, we're going to have somebody win an Oscar who is dead, whom we've digitally recreated. That's my theory; I'd put the over/under at 20 years. So I guess my question is, do you think that's accurate? And I'm curious if you know anything about some of the legal stuff.

I think this is true: I think Robin Williams put in his will that his likeness cannot be reused by anyone for any kind of future … He saw the deepfake stuff coming and said he didn't want it used. But I'm curious, is that legally binding in 20 years if his kids want to do it? I don't know how you could stop them, so once you're dead, I think it's going to be fair game. I don't know. But even now, there are going to be a ton of actors who get too old to play the person they played 20 years ago, and they're just going to be deepfaked into entertainment, and we're going to keep recycling the same characters over and over again. Does this sound like the future of creative media to you?

HENRY: This is something that I've been thinking about a lot, these gray areas with synthetic media. I think synthetic resurrection, or "techromancy," which I quite like as another way of talking about it, is one of the most interesting areas, because as you've identified, there's legal tension and some really open questions. And fundamentally, again, this is about film stars that we love and know, or think we know anyway, people we have this kind of weird parasocial relationship with.

These are people who have formed part of our childhoods and of films that we love and adore. And there are some really big questions, as you just alluded to, as to whether we want this to happen. Do we want Scarlett Johansson to feature in a film in 50 years? It's not real, so how much of it is actually her? Does she still contribute in some way? Is it her body, but with, like, a de-aged face? Is it a completely new actor? If it's a completely new actor, do they get some credit for that performance? How do we determine who is actually the person? For example, one of the really impressive things about face swapping with deepfakes is that it maps the target's facial expressions perfectly.

So the person acting as the body double, or the face double even, would have to be doing a really good job of the expressions and all the other parts of acting. But at the same time, people like SAG, the Screen Actors Guild, are going to really want to protect performance rights and the principle that if your face is used, that's a performance you're giving, and you deserve full payment for it.

GAVIN: Maybe there's a world where it's going to be Andy Serkis and Scarlett Johansson, do you know what I mean? Because Andy Serkis is famously Gollum. There may be an award that's given to two people: one, the person doing the actual motion capture, the actual acting, and the other, the person whose likeness is used, like a posthumous James Dean Academy Award with, say, Brad Pitt playing that character. And obviously Brad Pitt, on his own, isn't necessarily going to let his face get swapped out for James Dean's. But that seems like a world where you could have a shared award in some ways, too.

I know we have to wrap up pretty soon here because, Henry, I know you have to move on to something else, but hey, thank you so much to both Henry and Rex. Henry, I do want to ask you before you go, and I'm going to try to do this with all of our guests: what is something that you're way too interested in right now? Is there a specific thing that you can't get out of your brain?

HENRY: Yeah, so I'm only 27, but I'm having a late-in-life, for me, discovery of jazz, specifically here in the UK, in South London. There's an exploding jazz scene of really exciting young jazz musicians who are doing some really interesting stuff. I haven't listened to a huge amount of the classics — Miles Davis, Herbie Hancock — but these young guys are doing something that taps into a fascination I didn't know I had with this music. I normally like music that's a bit more emotional for me, whereas jazz technically just blows my mind. So yeah, I'm getting deep into the jazz scene, and I hope I'll work my way back to the classics someday soon.

GAVIN: Do you have a specific artist that we can check out?

HENRY: I would really recommend checking out a guy called Yussef Dayes; I think the album is called Welcome to the Hills. He also did a collaboration last year with an artist called Tom Misch; the album is called What Kinda Music. Both are just exceptional records, real good energy, like when you need a bit of perking up in a lockdown summer, that's what I've been using.

GAVIN: Amazing, that is fantastic. Well thanks again to both of you for being here for the original recording. Hopefully, Rex, we got most of your questions answered and I'm going to go learn a whole shitload more about this now because I can't believe the stuff I didn't know about it. So thanks again everybody, we really appreciate it, and come back for the next episode.

HENRY: My pleasure, thanks guys.

REX: Thanks, see ya!

GAVIN: Well, that's our show for this week, thank you so much for coming and listening. Way Too Interested is produced almost entirely by me, but I did have some good help along the way. Eric Johnson of LightningPod has helped me significantly in getting the show up and running, does the editing on my interviews, and overall is a great human being. You should definitely go and find LightningPod if you're trying to start a podcast or you need some help making one.

Other than that, thank you to the Gregory Brothers for making the theme song, the music you hear right now, for which I specifically asked for a very easy-listening sort of style for the credits. And of course, thank you to my guests this week, Rex Sorgatz and Henry Ajder; we had a great time talking deepfakes, and I will see you next week.

Also go to waytoointerested.com, our website. I want to hear from you if you have a problem with the show, if you think it's interesting, I hope you think it's interesting, or if you have subjects that you're fascinated by. I'm also toying around with the idea of starting a Discord, I would love to start a conversation with you.

I'm on Twitter at twitter.com/gavinpurcell; pretty active there. Would love to talk more with anybody that listened and kind of get some feedback on the show. It's my first time making one of these, so I'm still in the learning stages, but I'm having a good time and I hope to keep it up. Thank you again for listening and please come back again.
