Consumer Reports finds popular voice cloning tools lack safeguards

By TechCrunch | Created at 2025-03-10 14:49:57 | Updated at 2025-03-10 18:51:17

Several popular voice cloning tools on the market don’t have “meaningful” safeguards to prevent fraud or abuse, according to a new study from Consumer Reports.

Consumer Reports probed voice cloning products from six companies — Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify — for mechanisms that might make it more difficult for malicious users to clone someone’s voice without their permission. The publication found that only two, Descript and Resemble AI, took meaningful steps to combat misuse. The others required only that users check a box confirming they had the legal right to clone a voice, or make a similar self-attestation.

Grace Gedye, policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to “supercharge” impersonation scams if adequate safety measures aren’t put in place.

“Our assessment shows that there are basic steps companies can take to make it harder to clone someone’s voice without their knowledge — but some companies aren’t taking them,” Gedye said in a statement.