How Can Policymakers Address AI Voice-Cloning Scams?
Event Summary
There has been a concerning rise in nefarious uses of AI-enabled voice cloning, with bad actors targeting families and small businesses in disturbing extortion scams. The scams themselves are not new: the term "virtual kidnapping scam" has existed for many years to describe how fraudsters trick victims into paying a ransom to free a loved one they believe is being threatened. But AI has made these scams more sophisticated, because voice-cloning tools can be trained on publicly accessible audio recordings of ordinary people to produce imitations that sound remarkably authentic and believable.
Watch now for a panel discussion exploring how threats from AI voice cloning are likely to evolve, what countermeasures exist, and what actionable steps policymakers can take to address both national and international threats.
- Learn more at datainnovation.org