Deepfake evidence in military tribunals has no established admissibility standard
As AI-generated media becomes indistinguishable from real footage, military tribunals and courts-martial face a crisis of evidence authenticity. Defense attorneys can claim that any incriminating video or audio is a deepfake (the "liar's dividend"), while prosecutors cannot establish the provenance of battlefield footage captured on uncontrolled devices. The Uniform Code of Military Justice offers no standard for authenticating digital media against deepfake manipulation. The gap persists for three reasons: military evidence-handling procedures were written for physical evidence and unedited film; the technical expertise for forensic media analysis is concentrated in a handful of DoD labs with months-long backlogs; and legal precedent for AI-generated evidence does not yet exist.
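To make the provenance problem concrete, the sketch below shows what device-level attestation could look like in principle: a capture device signs a cryptographic digest of the footage at record time, and a court can later verify that the file is byte-identical to what was recorded. This is not from the source and is not how any fielded system works; the key handling, the HMAC construction, and all names here are illustrative assumptions (real provenance schemes such as C2PA use public-key signatures over signed metadata manifests). The point is that none of this helps with footage from uncontrolled devices, which never held an attestation key in the first place.

```python
import hashlib
import hmac

# Hypothetical sketch only: assumes the capture device holds a secret key
# and signs the SHA-256 digest of each file at record time. Uncontrolled
# devices (the case the article describes) have no such key, so this
# chain of custody cannot be reconstructed after the fact.

def sign_footage(device_key: bytes, footage: bytes) -> str:
    """Produce an authentication tag over the footage's SHA-256 digest."""
    digest = hashlib.sha256(footage).digest()
    return hmac.new(device_key, digest, hashlib.sha256).hexdigest()

def verify_footage(device_key: bytes, footage: bytes, tag: str) -> bool:
    """Check the tag in constant time; False means altered or unprovenanced."""
    expected = sign_footage(device_key, footage)
    return hmac.compare_digest(expected, tag)

key = b"hypothetical-device-key"          # illustrative, not a real scheme
original = b"\x00\x01frame-data"
tag = sign_footage(key, original)

print(verify_footage(key, original, tag))            # True: intact file
print(verify_footage(key, original + b"\xff", tag))  # False: one byte changed
```

Even a single flipped byte fails verification, which is why attestation-at-capture is attractive; but absent a trusted key on the recording device, forensic analysts are left inferring authenticity from the pixels themselves, which is exactly the months-long lab work the article describes.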
Evidence
https://www.armywarcollege.edu/Research/publications.cfm