AI tool clones voice from a 15-second clip as creators admit it's too risky for public use


The creators of ChatGPT have revealed an eerie AI that can clone a human voice using just a 15-second sample, with the company claiming it can create “emotive and realistic” speech.

However, the firm admits the technology is too risky for public release, with dozens of general elections taking place across the globe this year.

OpenAI has already demonstrated that its technology can rapidly generate text, images and more recently videos.

The artificial intelligence company unveiled its new Voice Engine technology on Friday, just over a week after filing a trademark application for the name.

Experts fear voice-cloning tech could also be used to mislead voters, impersonate candidates and undermine elections.


The firm says it is “choosing to preview but not widely release this technology” for the time being.

“We hope this preview of Voice Engine both underscores its potential and also motivates the need to bolster societal resilience against the challenges brought by ever more convincing generative models,” OpenAI wrote in a blog post.

“We recognize that generating speech that resembles people’s voices has serious risks, which are especially top of mind in an election year.”

The company also stressed that early Voice Engine testers have agreed not to impersonate a person without consent and to notify people when the voices they hear are AI-generated.

The company, best known for its chatbot and the image generator DALL-E, took a similar approach in announcing but not widely releasing its video generator Sora.

However, a trademark application filed on March 19 suggests OpenAI aims to move into speech recognition and digital voice assistants. Improving such technology could eventually help OpenAI compete with voice products such as Amazon’s Alexa.
