A Consumer Reports investigation has revealed that most leading AI voice cloning tools have minimal barriers to prevent unauthorized impersonation.
Recent advances in voice cloning technology allow AI to mimic a person's speech patterns from just a few seconds of audio. The capability made headlines last year when deepfake robocalls imitating Joe Biden urged New Hampshire voters to stay home during the Democratic primary. The incident prompted the Federal Communications Commission (FCC) to ban AI-generated robocalls, and the political consultant behind the scheme was fined $6 million.
A review of six major AI voice cloning services found that five had safeguards that could be easily bypassed, allowing users to replicate voices without the speaker's consent. Deepfake audio detection software, meanwhile, often struggles to distinguish real voices from synthetic ones.
AI Voice Cloning Faces Minimal Regulation
Generative AI, which can replicate human voices, writing, and appearances, is an emerging field with limited federal oversight. Most safety measures are self-imposed by companies. President Biden's 2023 executive order set out AI safety guidelines, but President Trump revoked the order after taking office.
Voice cloning technology functions by analyzing a sample of a person’s speech and generating synthetic audio that mimics their voice. Without strong protections, anyone can upload publicly available clips—such as from YouTube or TikTok—and generate realistic impersonations.
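None of the services publish their internal pipelines, but the low barrier the investigation describes can be sketched in a few lines. The snippet below is a hypothetical illustration, not any vendor's real API: the endpoint, field names, and consent flag are invented for demonstration, mirroring the check-a-box pattern the report found.

```python
import requests

# Hypothetical voice-cloning endpoint; no real vendor's API is shown here.
CLONE_URL = "https://api.example-voice-service.com/v1/clone"

def clone_voice(sample_path: str, text: str) -> bytes:
    """Upload a short audio sample and get back synthetic speech.

    The only 'safeguard' modeled here is a self-attested consent flag,
    mirroring the checkbox that four of the six reviewed services use.
    """
    with open(sample_path, "rb") as sample:
        response = requests.post(
            CLONE_URL,
            files={"voice_sample": sample},
            data={
                "text": text,
                # Nothing verifies this claim; the user simply checks a box.
                "i_have_consent": "true",
            },
            timeout=60,
        )
    response.raise_for_status()
    return response.content  # synthesized audio bytes

# A few seconds of publicly available audio would be enough input:
# audio = clone_voice("clip_from_youtube.wav", "Hello, it's really me.")
```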
Four platforms (ElevenLabs, Speechify, PlayHT, and Lovo) require only that users check a box affirming they have permission to clone a voice. Resemble AI adds an extra layer of security by requiring a real-time voice recording, but Consumer Reports bypassed that check by playing a pre-recorded clip.
Only one service, Descript, had an effective safeguard. It requires users to record a specific consent statement, making unauthorized cloning more difficult unless another AI tool is used to replicate the required phrase.
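Descript does not document how its verification works, but the general pattern, transcribing the submitted recording and comparing it to the required statement, can be sketched as follows. This is a hypothetical reconstruction using the open-source openai-whisper package, not Descript's actual code; the consent phrase and matching threshold are invented for illustration.

```python
import difflib
import whisper  # pip install openai-whisper

# Hypothetical consent phrase; the real required statement may differ.
CONSENT_PHRASE = "i consent to having my voice cloned by this service"

def verify_consent(audio_path: str, threshold: float = 0.85) -> bool:
    """Transcribe the recording and fuzzily match it to the consent phrase.

    The weakness the report notes applies here too: a synthetic voice
    reading the phrase aloud would pass this check just as well.
    """
    model = whisper.load_model("base")
    spoken = model.transcribe(audio_path)["text"].lower().strip()
    similarity = difflib.SequenceMatcher(None, spoken, CONSENT_PHRASE).ratio()
    return similarity >= threshold

# print(verify_consent("user_consent_recording.wav"))
```

The fuzzy match (rather than exact string comparison) absorbs minor transcription errors and punctuation, which is why a deliberately spoken but AI-synthesized reading of the phrase would still clear the threshold.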
All six services are accessible to the public through their websites. Only ElevenLabs and Resemble AI charge for voice cloning, with fees of $5 and $1, respectively, while the others offer the service for free.
A Resemble AI spokesperson told NBC News that the company has “implemented robust safeguards” to prevent misuse and deepfake creation.
Ethical Uses and Potential Risks
Voice cloning has legitimate applications, such as assisting individuals with disabilities and generating multilingual audio translations. However, experts warn of significant risks.
Sarah Myers West, co-executive director of the AI Now Institute, highlighted concerns about fraud, scams, and disinformation. “This technology can be exploited to impersonate public figures, institutions, or even loved ones,” she told NBC News.
While research on AI-driven scams remains limited, authorities warn that fraudsters are already incorporating AI into schemes like “grandparent scams,” where criminals call victims pretending to be distressed family members. The Federal Trade Commission (FTC) has cautioned that AI may be amplifying these scams, though such fraud tactics predate the technology.
AI-cloned voices have also been used in unauthorized music production, most notably in a viral 2023 track featuring synthetic vocals that imitated Drake and The Weeknd. Some artists are struggling to keep AI-generated songs from circulating under their names.