Saturday, November 16, 2024

What Happened When I Cloned My Own Voice


Recently my colleague Charlie Warzel, who covers technology, introduced me to the most sophisticated voice-cloning software available. It had already been used to clone President Joe Biden’s voice to create a fake robocall discouraging people from voting in the New Hampshire primary. I signed up and fed it several hours of me speaking on various podcasts, and waited for the Hanna Rosin clone to be born. The way it works is you type a sentence into a box. For example, Please give me your Social Security number, or Jojo Siwa has such great fashion!, and then your manufactured voice, created from samples of your actual voice, says the sentence back to you. You can make yourself say anything, and shift the intensity of the intonation until it sounds uncannily like you.

Warzel visited the small company that made the software, and what he found was a familiar Silicon Valley story. The people at this company are dreamers, inspired by the Babel fish, a fictional translation device from The Hitchhiker’s Guide to the Galaxy. They imagine a world where people can speak to one another across languages and still sound like themselves. They also may not be able to put the genie back in the bottle if (or when) the technology leads to world-altering chaos, particularly in this coming year, when more than half of the world’s population will undergo an election.

In this episode of Radio Atlantic, Warzel and I discuss how this small company perfected the cloned voice, and what good and bad actors might do with it. Warzel and I spoke at a live show in Seattle, which allowed us to play a few tricks on the audience.

Listen to the conversation here:


The following is a transcript of the episode:

Hanna Rosin: So a few weeks ago, my colleague staff writer Charlie Warzel introduced me to something that’s either amazing or sinister—probably both.

Charlie’s been on the show before. He writes about technology. And most recently, he wrote about AI voice software. And I have to say: It’s uncannily good. I signed up for it—uploaded my voice—and man does it sound like me.

So, of course, what immediately occurred to me was all the different flavors of chaos this could cause in our future.

I’m Hanna Rosin. This is Radio Atlantic. And this past weekend, I was in Seattle, Washington, for the Cascade PBS Ideas Festival. It’s a gathering of journalists and creators, and we discussed topics from homelessness, to the Supreme Court, to the obsession with true crime.

Charlie and I talked about this new voice software. And we tried to see if the AI voices would fool the audience.

For this week’s episode, we bring you a live taping with me and Charlie. Here’s our conversation.

[Applause]

Rosin: So today we’re going to talk about AI. We’re all aware that there’s this thing barreling toward us called AI that’s going to lead to huge changes in our world. You’ve probably heard something, seen something about deepfakes. And then the next big phrase I want to put in the room is election interference.

Today, we’re going to connect the dots between these three big ideas and bring them a little closer to us, because there are two important truths you need to know about this coming year. One is that it is extremely easy—by which I mean ten-dollars-a-month easy—to clone your own voice, and potentially anybody’s voice, well enough to fool your mother. Now, why do I know this? Because I cloned my voice, and I fooled my mother. And I also fooled my partner, and I fooled my son. You can clone your voice so well now that it really, really, really sounds a lot like you or the other person. And the second fact that’s important to know about this year is that about half the world’s population is about to undergo an election.

So these two facts together can lead to some chaos. And that’s something Charlie’s been following for a while. Now, we’ve already had our first taste of AI-voice election chaos. That came in the Democratic primary. Charlie, tell us what happened there.

Charlie Warzel: A bunch of New Hampshire voters—I think it was about 5,000 people—got a phone call, and it would say “robocall” when you pick it up, which is standard if you live in a state doing a primary. And the voice on the other end of the line was this kind of grainy-but-real-sounding voice of Joe Biden urging people not to go out and vote in the primary that was coming up on Tuesday.

Rosin: Let’s, before we keep talking about it, listen to the robocall. Okay? We’re going to play it.

Joe Biden (AI): Republicans have been trying to push nonpartisan and Democratic voters to participate in their primary. What a bunch of malarkey. We know the value of voting Democratic when our votes count. It’s important that you save your vote for the November election. We’ll need your help in electing Democrats up and down the ticket. Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday.

Rosin: I’m feeling like some of you are dubious, like that doesn’t sound like Joe Biden. Clap if you think that doesn’t sound like Joe Biden.

[Small amount of clapping]

Rosin: Well, okay. Somewhere in there. So when you heard that call, did you think, Uh-oh. Here it comes? Like, what was the lesson you took from that call? Or did you think, Oh, this got solved in a moment, and so we don’t have to worry about it?

Warzel: When I saw this, I was actually reporting out a feature for The Atlantic about the company ElevenLabs, whose technology was used to make that phone call. So it was very resonant for me.

You know, when I started writing—I’ve been writing about deepfakes and things like that for quite a while (I mean, in internet time), since 2017. But there’s always been this sense of, you know, What’s the actual level of concern that I should have here? Like, What’s theoretical? With technology and especially with misinformation stuff, we tend to, you know, talk and freak out about the theoretical so much that sometimes we’re not really talking about it and thinking about it, grounding it in plausibility.

So with this, I was actually trying to get a sense of: Is this something that would even have any real sway in the primary? Like, did people believe it? Right? It’s kind of what you just asked the audience, which is: Is this plausible? And I think when you’re sitting here, listening to this with hindsight and, you know, trying to judge, that’s one thing.

Are you really gonna question, like, at this moment in time, if you’re getting that, especially if you aren’t paying close attention to technology—are you really gonna be thinking about that? This software is still working out some of the kinks, but I think the believability has crossed a threshold that’s alarming.

Rosin: So just to give these guys a sense, what can it do now? Like, we heard a robocall. Could it give a State of the Union speech? Could it talk to your wife? What are the things it can do now that it’s made this leap—that it couldn’t do a few months ago, convincingly?

Warzel: Well, the convincing part is the biggest part of it, but the other part of these models is the ability to ingest more characters and throw it out there. So this company, ElevenLabs, has a level you can pay for where you can—if you’re an author, you can throw your entire novel in there, and it can do it in a matter of minutes, essentially, and then you can go through and you can tweak it. It could definitely do a whole State of the Union. Essentially, it’s given anyone who’s got 20 bucks a month the ability to take anything they want to do content-wise and have it come out in their voice.

So a lot of people I know who are independent journalists or authors or people like that are doing all of their blog posts, their email newsletters, as podcasts—but also as YouTube videos, because they hook this technology, the voice AI, into one of the video or image generators, so it generates an image on YouTube every few paragraphs and keeps people hooked in.

So it’s this idea of: I’m not a writer, right? I’m a content human.

Rosin: I’m a multi-platform human. Okay. That sounds—you fill in the adjective.

Warzel: Yeah, it’s intense.

Rosin: Okay, so Charlie went to visit the company that has brought us here. And it’s really interesting to look at them, because they didn’t set out to clone Joe Biden’s voice. They didn’t set out, obviously—nobody sets out to run fake robocalls. So getting behind that fortress and learning, like, Who are these people? What do they want? was an interesting journey.

So it’s called ElevenLabs—and, by the way, The Atlantic, I’ll say, uses ElevenLabs to read out some articles in our magazine, so just so you know that. A disclaimer.

I was really surprised to learn that it was a small company. Like, I would expect that it was Google who crossed this threshold, but not this small company in London. How did that happen?

Warzel: So one of the most interesting things I found when I was there—I was interested in them because they were small and because they’d produced this tech that’s, I think, better than everyone else’s.

There are a few companies: Meta has one that they haven’t released to the public, and OpenAI also has one that they’ve released to certain select users—partly because they aren’t quite sure how to keep it, necessarily, from being abused. But that aside, ElevenLabs is quite good. They’re pretty small.

What I found when I was there talking to them is that they talked about their engineering team. Their engineering team is seven people.

Rosin: Seven?

Warzel: Yeah, so it’s, like, former—this is the engineering research team. It’s this small, little team, and they describe them almost as, like, these brains in a tank that will just—they would say, Hey, you know, what we really want to do is create a dubbing part of our technology, where you can feed it video of a movie in, you know, Chinese, and it’ll just kind of, almost in real time, running it through the technology, dub it out in English or, you know, you name the language.

Rosin: Is that because dubbing is historically tragic?

Warzel: It’s pretty bad. It’s pretty flat in a lot of places. Obviously, if you live in a couple of the big markets, you can get some good voice acting in the dubbing. But in Poland, where these guys are from, it’s all dubbed in a really flat—they’re called lektors. That’s the name for it. But, like, when The Real Housewives was dubbed in Poland, it was one male voice that just spoke like this for all the real housewives.

Rosin: Oh, my God. That’s amazing.

Warzel: So that’s an example of, like, this isn’t good. And so people, you know, watching U.S. cinema or TV in Poland is, like, kind of a grinding, horrible experience. So they wanted to change things like that.

Rosin: For some reason, I’m stuck on this, and I’m imagining RuPaul being dubbed in a really flat, accentless, like, sashay away. You know?

Warzel: Totally. So this is actually one of the things this company originally set out to solve. And they kind of, not lucked into, but found the rest of the voice-cloning stuff in that space. They talk about this research team as these brains in the tank. And they’ll just be like, Well, now the model does this. Now the model laughs like a human being. Like, Last week it didn’t.

And again, when you try to talk to them about what they did, it’s not like pushing a button, right? They’re like, It’s too complicated to really describe. But they’ll just say that it’s this small group of people who are, essentially—the reason the technology is good, or does things that other people’s can’t do, is because they had an idea, an academic idea, that they put into the model, had the numbers crunch, and this came out.

And that, to me, was kind of staggering, because what it showed me was that with artificial intelligence—unlike, you know, something like social networking, where you just have to get a huge mass of people connected, right? It’s network effects. But with this stuff, it really is like Quantum Leap–style computer science. And, you know, obviously, money is good. Obviously, compute is good. But a very small group of people can toss something out into the world that’s incredibly powerful.

And I think that is a real revelation that I had from that.

[Music]

Rosin: We’re going to take a short break. And when we come back, Charlie explains what the founders of ElevenLabs hope their technology will accomplish.

[Music]

Rosin: So these guys, like a lot of founders, didn’t set out to disrupt the election. They probably have a dream. Besides just better dubbing, what’s their dream? When they’re sitting around and you get to enter their brain space, what’s the magical future of many languages that they envision?

Warzel: The full dream is, basically, breaking down the walls of translation completely. Right? So there’s this famous science-fiction book The Hitchhiker’s Guide to the Galaxy, where there’s this thing called the Babel fish that can translate any language seamlessly in real time, so anyone can understand everyone.

That’s what they ultimately want to make. They want to have this—you know, dubbing has a little bit of latency now, but it’s getting faster. That plus all the different, you know, voices. And what they essentially want to do is create a tool, down the line, where you can put an AirPod in your ear, and you can go anywhere, and everyone else has an AirPod in their ear, and you’re talking, and so you can hear everything instantly in whatever language. That’s the end goal.

Rosin: So the beautiful dream, if you just take the purest version of it, is that all peoples of the world will be able to communicate with one another.

Warzel: Yeah. When I started talking to them—because, living in America, I have a different experience than, you know. Most of them are European, or a lot of them—the two founders are European. You know, they said, You grow up, and you have to learn English in school, right?

There’s only some places where you don’t grow up and, they say, you also gotta learn English if you want to go to university anywhere, do whatever, and participate in the world. And they said, If we do this, then you don’t have to do that anymore.

Rosin: Ooh, there goes our hegemony.

Warzel: Imagine the time you’ll save, not having to learn this other language.

Rosin: So they’re thinking about Babel and this beautiful dream, and we’re thinking, like, Oh, my god, who’s gonna scam my grandmother, and who’s gonna mess up my election?

Do they think about that? Did you talk to them about that? Like, how aware are they of the potential chaos coming down?

Warzel: They’re very aware. I mean, I’ve dealt with a lot of tech executives in my career who are kind of—they’re not willing to really entertain the question. Or if they do, it’s kind of glib, or there’s a little bit of resentment, you can tell. They were very—and I think because of their age (the CEO is 29)—very earnest about it. They care a lot. They clearly look at all this and see—they’re not blinded by the opportunity, but the opportunity looms so large that these negative externalities are just problems they will solve, or that they’ll solve.

And so we had this conversation, where I called it “the bad things,” right? And I just kept asking, like: What are you going to do about the jobs this takes away? What are you going to do about all this misinformation stuff? What are you going to do about scams? And they have these ideas, like digitally watermarking all voices and working with all kinds of different companies to build a watermarking coalition, so when you voice-record something on your phone, it has its own metadata that says, like, This came from Charlie’s phone at this time.

Rosin: Uh-huh.

Warzel: You know, like, This is real. Or when you put up the ElevenLabs thing, it says—and people can quickly decode it, right? So there’s all these ideas.

But I can tell you, it was like smashing my head against a brick wall for an hour and a half with this really earnest, smart person who’s like, Yeah. No, no. It’s gonna take a while before we, you know, societally all get used to all these different tools, not just ElevenLabs.

And I was like, And in the meantime? And they would never say it this way, but the vibe is kind of like, Well, you gotta break a lot of eggs to get the, you know, universal-translation omelet situation. But you know, some of those eggs might be, like, the 2024 election. It’s a big egg.

Rosin: Right, right, right. So it’s the familiar story, but more earnest and more self-aware.

Do you guys want to do another test? Okay. You’ve been listening to me talk for a while. Charlie and I both fed our voices into the system. We’re gonna play for you me saying the same thing twice. One of them is me, recorded. I just recorded it—me, the human being, in the flesh right here. And one of them is my AI avatar saying the same thing. There’s only two. I’m saying the same thing. So we’re gonna vote at the end for which one is fake-AI Hanna. Okay, let’s play the two Hannas.

Rosin (Real): Charlie, how far do you think artificial intelligence is from being able to spit out a million warrior robots programmed to destroy humanity?

Rosin (AI): Charlie, how far do you think artificial intelligence is from being able to spit out a million warrior robots programmed to destroy humanity?

Rosin: Okay, who thinks that number one is fake Hanna?

[Audience claps]

Rosin: Who thinks that number two is fake Hanna?

[Audience claps]

Warzel: It’s pretty even.

Rosin: It’s pretty even. I would say two is more robust, and two is correct—that’s the fake one.

Warzel: I’m zero for two.

Rosin: But man, it’s close. Like, Charlie spent time at this place, and he’s gotten both of them wrong so far.

Warzel: We work together!

Rosin: We work together. This is really, really close.

Warzel: You know, the one, like, bulwark right now against this stuff is that I do think people are, generally, pretty dubious now of most things. Like, I do think there is just a general suspicion of stuff that happens online. And I also think that one thing we have seen from some of these is—there’s been a few ransom calls, right? Like you get a—it’s a scam, but it’s your mom’s voice, right? Or something like that.

Those things kind of come down the line pretty quickly. Like, you can pretty quickly realize that your mom isn’t being kidnapped. You can, pretty quickly, unravel that. Basically, I don’t know how effective these things are yet, because of the human element. Right? It seems like we have a little bit more of a defense now than we did, you know, let’s say, in 2016.

And I do think that time is our greatest asset here. With all of this, the problem is, you know, it only takes one, right? It only takes some person, you know, in late October or early November, who puts out something just good enough that it’s the last thing someone sees before they go to the polls, right?

And it’s too hard to debunk, or that person doesn’t see the debunking, right? And so, those are the things that make you nervous. But also, I don’t think yet that we’re dealing with a godlike ability to just totally destroy reality.

It’s kind of somewhere in the middle, which is still, you know, nerve-racking.

Rosin: So the danger scenario is a thin margin, very strategic use of this technology. Like, less-informed voters, a suppress-the-vote effort—somewhere where you could use it in small, strategic ways. That’s a realistic worry.

Warzel: Yeah, like, hyper-targeted in some way.

I mean, it’s funny. I’ve talked to a couple of AI experts and people in this field, and they’re so worried about it. It’s really hard to coax nightmare scenarios out of them. They’re like, No, I’ve got mine. And I’m absolutely not telling a journalist. Like, no way. I don’t want this published. I don’t want anyone to know about it. But I do think—and this could be the fact that they’re too close to something, or it could be that they’re right, and they are really close to it. But there’s so much fear from people who work with these tools. I’m not talking about the ElevenLabs people, necessarily.

Rosin: But AI people.

Warzel: But AI people. I mean, true believers in the sense of, you know, If it doesn’t happen this time around, well, wait ’til you see what it’s going to be in four years.

Rosin: I know. That really worries me, that the people inside are so worried about it. It’s like a they’ve-birthed-a-monster kind of vibe.

Warzel: It’s also good marketing. You can go back and forth on this, right? Like the whole idea of, you know, We’re building the Terminator. We’re building Skynet. It could end humanity. Like, there’s no better marketing than, We’re creating the potential apocalypse. Pay attention.

Rosin: Right. All right. I’m going to tell you my two fears, and you tell me how realistic they are. One is the absolute perfection of scams designed to target older people who are slightly losing their memories—scams that are already pretty good. Like, they’re already pretty good, and you already hear so many stories of people losing a lot of money. That’s one I’m worried about. Like, how easy it is to convincingly call someone in the voice of a grandson, or in the voice of whoever. That one seems like a problem.

Warzel: Yeah, I think it will be, and I don’t think it has to be relegated to people who are so old they’re losing their memories. It’s difficult to discern this stuff. And, I think, what I’ve learned from a lot of time reporting on the internet is that nobody is immune to a scam.

Rosin: Yes.

Warzel: There’s a scam waiting to match with you. And, you know, when you find your counterpart, it’s—

Rosin: It’s like true love.

Warzel: Exactly.

Rosin: Out there is the perfect scam for you. Okay, one more worry, and then we’re going to do our last test.

My real worry is that people will know that things are fake, but it won’t matter, because people are so attached to whatever narrative they have that it won’t matter to them if you prove something is real or fake.

Like, you can imagine that Trump would put out a thing that was fake, and everybody would kind of know it’s fake, but everyone would collude and decide that it’s real, and proceed based on that. Like, real and fake just—it’s not a line people worry about anymore, so it doesn’t matter.

Warzel: I fully think we live in that world right now. I mean, really.

I think a good example is a lot of the stuff, not only the stuff that you see coming out of the Middle East in the way that—I mean, obviously there’s so much literal digital propaganda and misinformation coming from different places, but also just the normal stuff that we see. And this is a little less AI-involved, but I think there are just a lot of people, especially younger people, who just don’t trust the establishment media to do the thing. And they’re like, Oh, I’m gonna watch this, and I don’t really care. And so I think the level of mistrust is so high at the moment that we’re already in that situation.

Rosin: Like, we’re of a generation, and we’re journalists, and so we sit and worry about what’s real and what’s fake, but that’s not actually the line that people are paying attention to out there.

Warzel: Yeah. I think the real thing is, like, getting to a point where you have built enough of a parasocial trust relationship with someone that they’re just gonna believe what you say, and then trying to be responsible about it, about delivering them information, which is crazy.

Rosin: Okay. One final fake-voice trick. This one’s on me since, Charlie, you were wrong both times. Now it’s my turn.

My producers wanted to give me the experience of knowing what it’s like to have your voice saying something that you didn’t say. So they took my account, they had my voice say things, and I haven’t heard it, and I don’t know what it is. So we’re going to listen to that now. It will be a surprise for all of us, including me. So let’s listen to these fake voicemails created by my wonderful producers.

Rosin (AI): Hi! I’m calling to leave a message about after-school pickup for my kids. Just wanted to let their homeroom teacher know that Zeke in the white van is a dear family friend, and he’ll be picking them up today.

Rosin: (Laughs.) Okay.

Rosin (AI): Hi, Mom. I’m calling from jail, and I can’t talk long. I’ve only got one phone call. I really need you to send bail money as soon as you can. I need about $10,000. Cash App, Venmo, or Bitcoin all work.

Rosin: My mom doesn’t have $10,000.

Rosin (AI): Hey, I hope I have the right number. This is a voicemail for the folks running the Cascade PBS Ideas Festival. I’m running late at the moment and wondering if I’m going to make it. Honestly, I feel like I should just skip it. I can’t stand talking to that Charlie-whatever character. Why am I even here? Washington, D.C., is clearly the superior Washington anyway.

[Crowd boos]

Rosin: Oooh. Yeah, okay, okay. Now, I would say I was talking too fast.

Warzel: So one thing I did with my voice is I had it say a whole bunch of far worse things, like, COVID came from a—whatever, you know, just to see what those things would sound like. And they were kind of believable, whatever.

But also, what if you then took audio—so the one from jail, right? What if you took audio—your producers, our producers, are great—and inserted a lot of noise that sounded like it was coming from a crowd, or like the slamming of a cell door or something like that in the background, and faded it in well? That would be enough to ratchet it up, right?

And I think all these things can become extremely believable if you layer the right context on them.

Rosin: Right. You know what, Charlie? Here’s the last thing. You, as someone who’s been really close to this, fluctuate between, Okay, we don’t have to be that alarmed. It’s only got these small uses, and, But also, it’s got these uses, and they’re really scary.

Having been close to this and gone through this experience, is there a word you would use to sum up how you feel now? Because, obviously, it’s uncertain. We don’t actually know—we don’t know how quickly this technology is going to move.

How should we feel about it?

Warzel: I think disorientation is the word, because—so a big reason I wanted to go talk to this company was not just because of what they were doing, but to be kind of closer, to get some proximity to the generative-AI revolution, whatever we’re gonna call it. Right? To see these people doing it. To feel like I could moor my boat to something and just feel like—

Rosin: You have control.

Warzel: Yeah, and I understand what we’re building toward, or that they understand what they’re building toward. And the answer is that you can walk up to these people and stare them in the face and have them answer questions and still kind of feel really at sea about a lot of this stuff, because there are wonderful transformative applications for this. But also, I see, you know, this voice technology with the other generative-AI technologies—basically, a good way to think about them is as plug-ins to each other, right? And people are going to use, you know, voice technology with ChatGPT with some of the video stuff, and it’s going to just make the internet—make media—weirder. Right?

Everything you see is going to be weirder. The provenance of it is going to be weirder. It’s not necessarily always going to be worse, right? But it could be. And it could be better. But everyone seems like they’re rushing toward this destination, and it’s unknown where we’re going.

And I just feel that disorientation is kind of the most honest and truthful way to look at this. And I think when you’re disoriented, it’s best to be really careful of your surroundings, to pay very close attention. And that’s what it feels like right now.

Rosin: We can handle the truth. Thank you for giving us the truth. And thank you, all, for coming today and for listening to this talk, and be prepared to be disoriented.

[Music]

Rosin (AI): Thanks for listening. And thanks to the production staff of the Cascade PBS Ideas Festival. This is the AI version of Hanna Rosin speaking, as made by ElevenLabs.

This episode of Radio Atlantic was produced by Kevin Townsend. He’s typing these words into ElevenLabs right now and can make me say anything. “You may hate me, but it ain’t no lie. Baby, bye, bye, bye. Bye, bye.”

This episode was edited by Claudine Ebeid and engineered by Rob Smierciak. Claudine Ebeid is the executive producer of Atlantic audio, and Andrea Valdez is our managing editor. I’m not Hanna Rosin. Thanks for listening.
