The FBI has issued a warning about a rise in kidnapping scams in which criminals use artificial intelligence to create fake 'proof of life' videos. The alert highlights how online fraud tactics are evolving as AI tools become more accessible to cybercriminals.
Cybercriminals are increasingly using AI to fabricate convincing 'proof of life' videos in kidnapping scams, prompting a public warning from the FBI. According to the advisory, scammers contact a victim's family claiming a loved one has been abducted and demand a ransom payment. To build credibility, they generate AI deepfake videos that mimic the supposedly missing person's appearance and voice, making the threats seem authentic.
The FBI notes that these scams have gained traction because accessible AI tools now allow even non-experts to produce realistic forgeries. Victims are often pressured to act quickly without verifying the claims, leading to financial losses. The agency recommends confirming any such claim through direct, trusted communication channels, such as contacting the supposedly abducted person or others close to them, rather than relying on media provided by the scammers.
This development underscores the growing challenges AI poses for cybersecurity as fraudsters adapt new technologies to deceive their targets. The FBI's alert aims to help the public recognize these tactics and report suspicious activity promptly.