AI deepfake video calls now defeat the 'video verify your date' safety advice

The standard safety advice for online daters has long been 'do a video call before meeting in person to confirm they are who they say they are.' As of 2025-2026, that advice no longer holds: scammers can appear on live video calls wearing completely fabricated faces generated in real time by AI deepfake tools. In a February 2026 Barclays study, 44% of respondents said they could not confidently detect voice cloning, and 41% could not detect fake images or video. Only 46% of people correctly identified AI-generated photos in testing. This matters because the single most trusted verification method ordinary people rely on (seeing someone's face on video) is now compromised, and no consumer-accessible tool exists to detect live deepfakes during a call. The structural reason: deepfake generation technology is widely available and cheap, while deepfake detection requires specialized infrastructure that no dating app or consumer video tool currently integrates.

Evidence

Barclays Feb 2026 study: 44% cannot detect voice clones, 41% cannot detect fake images/video: https://home.barclays/news/press-releases/20260/02/ai-deepfake-concerns-see-gen-z--swiping-left--on-dating-apps/

Washington Times (Mar 2026): AI-generated pictures and voices driving surge in scams: https://www.washingtontimes.com/news/2026/mar/17/ai-generated-pictures-voices-drive-surge-online-dating-scams-cyber/

Norton 2026: scammers now appear on live video calls using fabricated faces: https://us.norton.com/blog/online-scams/ai-online-dating-scams
