Audio deepfakes of heads of state can trigger market crashes before verification is possible

A cloned audio clip of a head of state announcing military action or economic sanctions can propagate through financial news networks and algorithmic trading systems in under 60 seconds, triggering market movements before any human can verify its authenticity. In 2023, a fake image of an explosion near the Pentagon briefly moved markets. Audio deepfakes are harder to catch than fake images because there is no visual artifact to inspect. The vulnerability persists because financial markets are optimized for speed (high-frequency trading operates on microsecond timescales) while deepfake verification takes minutes, and no circuit-breaker mechanism exists for halting trades driven by unverified audio attributed to state leaders.
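The missing circuit breaker described above can be sketched in code. This is a hypothetical illustration, not a real exchange or trading API: every name (`NewsSignal`, `should_trade_on`, the five-minute hold window) is an assumption chosen for the example. The idea is simply that signals sourced from state-leader audio are held until verified or until a cooldown expires, while ordinary news passes through.

```python
import time
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch -- illustrative names only, not a real trading system.
STATE_LEADER_SOURCES = {"head_of_state_audio"}  # source types that trigger the gate
VERIFICATION_HOLD_SECONDS = 300                  # assumed ~5-minute verification window

@dataclass
class NewsSignal:
    source_type: str                # e.g. "head_of_state_audio" or "wire_report"
    verified: bool                  # set True once provenance is confirmed
    received_at: float = field(default_factory=time.time)

def should_trade_on(signal: NewsSignal, now: Optional[float] = None) -> bool:
    """Circuit-breaker gate: block trades driven by unverified state-leader
    audio until verification arrives or the hold window expires."""
    now = time.time() if now is None else now
    if signal.source_type not in STATE_LEADER_SOURCES:
        return True   # ordinary news: no extra gate
    if signal.verified:
        return True   # provenance confirmed: allow
    # Unverified state-leader audio: hold until the cooldown elapses.
    return (now - signal.received_at) >= VERIFICATION_HOLD_SECONDS
```

The design choice worth noting is that the gate keys on source attribution rather than on detecting the fake itself, sidestepping the speed mismatch: detection takes minutes, but checking a source label takes microseconds.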

Evidence

https://www.reuters.com/technology/fake-pentagon-explosion-photo-goes-viral-social-media-2023-05-22/
