
Still, the testers were able to bypass those safeguards, prompting the magazine to call for tighter standards. “We argue that the nascent voice-cloning industry should adopt norms and standards to mitigate the risk of fraud,” Consumer Reports said.
GenAI tools have made voice cloning possible, and the technology is increasingly being used for fraud.
Scammers use the tools as a form of social engineering to cheat victims out of money or to spread misinformation. In many cases, they create realistic audio of a close relative or friend in trouble and deceive victims into sending money or divulging sensitive information.