Fake kidnappings using AI
In a bizarre “cyber kidnapping” incident, a missing 17-year-old Chinese exchange student was found alive this week in a tent in Utah’s freezing wilderness. He had been manipulated into isolating himself there by scammers who extracted an $80,000 ransom from his parents by claiming to have kidnapped him.
Riverdale police on the cyber kidnapping phenomenon.
While it’s not yet known whether AI was employed in this incident, it shone a light on a growing trend of fake kidnappings, often targeting Chinese exchange students. The Riverdale police department said scammers often convince victims to isolate themselves by threatening to harm their families, then use fear tactics along with fake photos and audio of the “kidnapped” victims, sometimes staged and sometimes generated with AI, to extort money.
An Arizona woman, Jennifer DeStefano, testified to the U.S. Senate last year that she’d been fooled by deepfake AI technology into thinking her 15-year-old daughter Briana had been kidnapped. Scammers apparently learned the teenager was away on a ski trip and then called up Jennifer with a deepfake AI voice that mimicked Briana sobbing and crying: “Mom these bad men have me, help me, help me.”
A man then threatened to pump “Briana” full of drugs and kill her unless a ransom was paid. Fortunately, before Jennifer could hand over any cash, another parent mentioned having heard of similar AI scams, and Jennifer was able to reach the real Briana and confirm she was safe. The police weren’t interested in her report, dismissing it as a “prank call.”
Sander van der Linden, a professor of psychology at Cambridge University, advises people to avoid posting travel plans online and to say as little as possible to spam callers, who may be trying to capture their voice. Those with a lot of audio or video footage of themselves online might consider taking it down.
Robotics’ ‘ChatGPT moment’?
Brett Adcock, founder of Figure Robot,…