‘Harry Potter’ audiobook narrator Stephen Fry said AI was used to steal his voice, and warned that convincing deepfake videos of celebrities will be next

by Press Room
September 19, 2023
  • Actor Stephen Fry issued a warning about AI cloning his voice at the CogX Festival on Thursday. 
  • Fry said his readings of the “Harry Potter” audiobooks were input into an AI and used to create new audio.
  • Fry said it won’t be long before AI is used to create “deepfake videos” of actors without consent. 


Actor Stephen Fry sounded the alarm on AI at the CogX Festival on Thursday, saying the technology had been used to clone his voice and create audio without his permission, according to reports from outlets including Fortune and Deadline.

At the festival, Fry played a clip of an AI system using a copy of his voice to narrate a historical documentary, and pointed out that AI is a “burning issue” for the actors’ union SAG-AFTRA, of which he is a member.

“I said not one word of that — it was a machine. Yes, it shocked me,” he told the audience about the audio.

“They used my reading of the seven volumes of the Harry Potter books, and from that dataset, an AI of my voice was created, and it made that new narration.” 

He explained that the audio resulted from a “mashup,” created from a “flexible artificial voice, where the words are modulated to fit the meaning of each sentence.” 

He added: “It could therefore have me read anything from a call to storm Parliament to hard porn, all without my knowledge and without my permission. And this, what you just heard, was done without my knowledge. So I heard about this, I sent it to my agents on both sides of the Atlantic, and they went ballistic — they had no idea such a thing was possible.”

Fry said he warned his agent that this was just the beginning.

“It won’t be long until full deepfake videos are just as convincing,” he said.

Hollywood actors and writers have been striking for months to express their anger at a number of issues in the industry, including royalties from streaming platforms.

Another top issue during the strike is how studios plan to use AI. Members of the SAG-AFTRA union say the studios they’re negotiating with proposed paying background actors for a single day on which they would be scanned to create a digital likeness, which could then be used in future content without further compensation or consent.

“Actors now face an existential threat to their livelihoods with the rise of generative AI technology,” Duncan Crabtree-Ireland, the union’s national executive director and chief negotiator, said.

Henry Ajder, an AI expert and presenter who sits on the European advisory council for Meta’s Reality Labs, previously told Insider that AI could be used in the future to create nonconsensual deepfake pornographic images and videos of actors.

For example, a deepfake app called FaceMega ran a sexually suggestive advert in March that superimposed actress Emma Watson’s face onto the body of another woman.

