Deepfake technology, which creates realistic but fabricated audio and video of real people, has the potential to become a severe propaganda and blackmail threat to U.S. troops and others. Earlier this year, a photo was edited to show American soldiers ignoring a dead child they had supposedly killed. While that image was quickly proven to be a fake, deepfake technology could soon produce difficult-to-disprove video, audio, and images that would be highly effective in propaganda and blackmail campaigns. Editing capabilities have improved to the point that they may become available “off the shelf” at levels that “could fool even digital forensic experts.” Foreign intelligence services could easily use such technology to engineer videos of a “compromising nature” for blackmail.
About OODA Analyst
OODA comprises a unique team of international experts capable of providing advanced intelligence and analysis, strategy and planning support, risk and threat management, training, decision support, crisis response, and security services to global corporations and governments.