Lax verification in voice cloning services raises concerns
April 10, 2025

Key Points
Consumer Reports uncovers major flaws in the security measures of AI voice cloning companies, highlighting risks of misuse and fraud.
Four out of six evaluated services lack strong verification processes, allowing unauthorized voice replication.
"AI voice cloning tools have the potential to supercharge impersonation scams. Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge—but some companies aren't taking them."
Grace Gedye, Policy Analyst, Consumer Reports
An investigation by Consumer Reports has revealed significant shortcomings in the safeguards employed by AI voice cloning companies, raising concerns about potential misuse and fraud.
Digital deception: Voice cloning technology, while offering legitimate applications such as audio editing and personalized narration, poses serious risks when misused. Scammers have already used AI-generated voices to impersonate family members in distress, celebrities, and politicians in order to deceive victims.
Lax verification requirements: Consumer Reports evaluated six prominent voice cloning services and found that four of them lack adequate measures to prevent unauthorized voice replication. ElevenLabs, Speechify, PlayHT, and Lovo only required users to check a box confirming they had the legal right to clone a voice or make a similar self-attestation, without implementing robust verification processes.
In contrast, Descript and Resemble AI have implemented mechanisms aimed at confirming a speaker's consent before proceeding with voice cloning.
Basic steps not taken: "AI voice cloning tools have the potential to supercharge impersonation scams," Consumer Reports Policy Analyst Grace Gedye said in a statement. "Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge—but some companies aren't taking them."
Plea to regulators: The rapid advancement and accessibility of AI voice cloning technology have outpaced the development of regulatory frameworks, leaving a gap in consumer protection. Consumer Reports said it wants companies to raise their standards and is calling on state attorneys general and the federal government to enforce existing consumer protection laws and consider whether new rules are needed.