In a troubling development, wartime conditions appear to be accelerating the adoption of deepfakes.
[Translated from Russian] "The President of the Russian Federation has announced Russia's surrender. Russian soldier, drop your weapon and go home while you're still alive!" pic.twitter.com/5wWC3UlpYr
— Serhii Sternenko ✙ (@sternenko) March 16, 2022
The above-displayed video was created by manipulating a real video of Putin that was originally posted by the Kremlin on Feb. 21. While Putin’s movements in the edited video and the genuine video generally matched up, the edited version manipulated Putin’s mouth to make it appear as if the fabricated audio was truly coming from the Russian president. This is most noticeable during portions of the video where the real Putin is silent and the fake Putin is speaking.
According to Reuters: In the video, Putin appears to say, “We’ve managed to reach peace with Ukraine” and goes on to announce the restoration of independence of Crimea as a republic inside Ukraine. A tweet sharing the video with a caption in Ukrainian reads in translation, “The President of the Russian Federation announced the surrender of Russia. Russian soldier, drop your weapons and go home while you’re alive!”
Here’s a side-by-side comparison of the two videos.
Sam Gregory, an AI and disinformation expert with the international non-profit organization WITNESS, said this was the first time a deepfake had been created and spread as part of a wartime propaganda effort. Gregory told Euronews: “This is the first deepfake that we’ve seen used in an intentional and broadly deceptive way … It’s not an effective deepfake, partly because it’s not an extremely well-made deepfake, but also because Ukraine has done a masterful job pre-bunking and then swiftly rebutting the video.” (1)
The video was circulated on social media a few days after the following deepfake of Ukrainian President Volodymyr Zelenskyy was shared online:
A deepfake of Ukrainian President Volodymyr Zelensky calling on his soldiers to lay down their weapons was reportedly uploaded to a hacked Ukrainian news website today, per @Shayan86 pic.twitter.com/tXLrYECGY4
— Mikael Thalen (@MikaelThalen) March 16, 2022
President Zelenskyy replied immediately:
#Ukraine Hackers published a deep fake of @ZelenskyyUa urging citizens to lay down their arms. He responded immediately:
"If I can offer someone to lay down their arms, it's the Russian military. Go home. Because we're home. We are defending our land, our children & our families." pic.twitter.com/TiICf3Z5Te
— Hanna Liubakova (@HannaLiubakova) March 16, 2022
1/ Earlier today, our teams identified and removed a deepfake video claiming to show President Zelensky issuing a statement he never did. It appeared on a reportedly compromised website and then started showing across the internet.
— Nathaniel Gleicher @[email protected] (@ngleicher) March 16, 2022
According to the BBC: In a Twitter thread, Meta security-policy head Nathaniel Gleicher said it had “quickly reviewed and removed” the deepfake for violating its policy against misleading manipulated media. The Ukrainian Center for Strategic Communications had [also] warned that the Russian government may well use deepfakes to convince Ukrainians to surrender. (2)
“The Ukrainian government had warned in remarks on Facebook just two weeks ago that it believed Russian President Vladimir Putin would deploy deepfake technology as part of his attempts to overthrow Zelensky’s government.
A translation of the Ukrainian government statement is as follows: ‘Imagine seeing Volodymyr Zelenskyy on TV making a surrender statement. You see it, you hear it – so it’s true. But it is not the truth. This is deepfake technology. It will not be a real video, but one created through machine learning algorithms. Videos made with such technologies are almost impossible to distinguish from real ones. Be aware – this is a fake! Its goal is to disorient, sow panic, breed disbelief among citizens, and incite our troops to retreat. Rest assured – Ukraine will not capitulate! All that remains for Russia is to invent a fake victory, shut down the Internet, and cut off all contact with the outside world.’” (3)
Exponential CGI Realism? The strategic question now becomes: how quickly will computer graphics and animation technology improve to the point that digital platform policies and governmental warnings can no longer respond fast enough to mitigate the real-world impact of a deeply realistic deepfake distributed via social media? Think of the leap from the special effects of Star Wars: Episode IV – A New Hope to the eye-popping CGI of the latest Marvel Cinematic Universe release – compressed onto an exponentially faster schedule, with real-world geopolitical risks and impacts. A troubling timeline.
Blockchain and Provenance: The same technology that fuels the NFT craze can be applied to deepfakes. According to the World Economic Forum, blockchain can help combat the threat of deepfakes. Here’s how.
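The core property blockchains lend to provenance is an append-only, tamper-evident record of a media file's fingerprint at publication time. As a rough illustration only – not the WEF's design or any production system, with all class and field names here hypothetical – a toy hash chain in Python shows the idea: each entry commits to a media file's SHA-256 hash and to the previous entry, so altering either the media or the ledger is detectable.

```python
import hashlib
import json


def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


class ProvenanceLedger:
    """Toy append-only hash chain (a stand-in for a real blockchain):
    each entry commits to a media hash and to the previous entry."""

    def __init__(self):
        self.entries = []

    def record(self, media_bytes: bytes, note: str) -> dict:
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        body = {"media_hash": sha256_hex(media_bytes), "note": note, "prev": prev}
        # The entry's own hash covers the media hash, the note, and the link
        # to the previous entry, chaining the whole history together.
        body["entry_hash"] = sha256_hex(json.dumps(body, sort_keys=True).encode())
        self.entries.append(body)
        return body

    def verify_chain(self) -> bool:
        """Recompute every entry hash; any edit to the ledger breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("media_hash", "note", "prev")}
            if e["prev"] != prev or e["entry_hash"] != sha256_hex(
                json.dumps(body, sort_keys=True).encode()
            ):
                return False
            prev = e["entry_hash"]
        return True

    def matches(self, media_bytes: bytes) -> bool:
        """True if these exact bytes were ever recorded (a deepfake won't match)."""
        return any(e["media_hash"] == sha256_hex(media_bytes) for e in self.entries)
```

A verifier who trusts the ledger can check whether a circulating video's bytes match any recorded original; a manipulated copy – even one altered by a single frame – produces a different hash and fails the lookup.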
The Coalition for Content Provenance and Authenticity (C2PA): According to the organization’s website, “the C2PA is a Joint Development Foundation project to collectively build an end-to-end open technical standard to provide publishers, creators, and consumers with opt-in, flexible ways to understand the authenticity and provenance of different types of media. C2PA opposes efforts to make content provenance measures mandatory.” In January, the C2PA announced a partnership with the technology sector – including companies such as Microsoft, Intel, BBC, The New York Times, Twitter, Nikon, Akamai, Fastly, and Adobe – to launch a new standard designed to address the development of deepfakes. As first reported by The Record back in January:
“The first-of-its-kind specification grants content creators a means to develop what the organization calls “tamper-evident” media. It will work by providing platforms a way to define who created and changed information associated with images, videos, documents, and other assets, as well as identify evidence of manipulation. Andy Parsons, Senior Director of the Content Authenticity Initiative at Adobe, told The Record, ‘The C2PA expects its open standard to be broadly adopted across the content ecosystem – by device makers, news organizations, software and platform companies.’
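The C2PA specification itself binds claims to assets using X.509 certificate-based signatures carried in a binary (JUMBF) container; the following is only a simplified stdlib sketch of the "tamper-evident" idea, with an HMAC standing in for real public-key signatures and all names hypothetical. A manifest records who created the asset and what edits were made, bound to the media's hash and signed, so changing either the media bytes or the claimed history after signing is detectable.

```python
import hashlib
import hmac
import json

# Stand-in for the creator's private signing key; C2PA uses
# X.509 certificate-based signatures rather than a shared secret.
SIGNING_KEY = b"creator-secret-key"


def make_manifest(media_bytes: bytes, creator: str, edits: list) -> dict:
    """Bind a claim (creator + edit history) to the media's hash and sign it."""
    claim = {
        "media_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "creator": creator,
        "edits": edits,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return claim


def verify_manifest(media_bytes: bytes, manifest: dict) -> bool:
    """Fail if the media bytes or any claim field were altered after signing."""
    claim = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, manifest["signature"])
        and claim["media_sha256"] == hashlib.sha256(media_bytes).hexdigest()
    )
```

This is the sense in which the standard is "tamper-evident" rather than tamper-proof: a forger can still strip the manifest entirely, but cannot alter the asset or its claimed history while keeping a valid signature.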
In a live Q&A on Wednesday hosted by C2PA, lawmakers stressed the urgency around stopping the spread of deepfakes, which Sen. Robert Portman (R., OH) described as ‘hyper-realistic content depicting events that actually did not occur.’
Intel clarified that the specification alone will not completely rid the internet of deepfakes. Jennifer Foss, Communications Manager for Intel Corporation, told The Record, ‘Provenance information is just one important piece that can be used in conjunction with many other technologies including deep fake detection technologies.’
Sen. Gary Peters (D., MI) said at the Wednesday panel that he considers deepfakes ‘a critical national security issue,’ referring to the manipulation of media content (images, videos, audio files, and documents) through artificial intelligence. The escalation of this tactic poses serious threats to consumers, including the spread of disinformation, weakened trust between users and companies, and a means to target women, the lawmakers said.” (4) The C2PA launched in February of 2021.
To review C2PA case studies, see Case studies — Content Authenticity Initiative.
The C2PA Foundational White Paper: The Content Authenticity Initiative: Setting the Standard for Digital Content Attribution
A Review of Visual Technology in 2021
A new tech standard aims to combat deepfakes – The Record by Recorded Future
(1) Putin Deepfake Imagines Russian President Announcing Surrender | Snopes.com
(2) Deepfake presidents used in Russia-Ukraine war – BBC News
(3) Deepfake Of Zelensky Calling On Troops To Surrender Surfaces Online (dailydot.com)
(4) A new tech standard aims to combat deepfakes – The Record by Recorded Future
It should go without saying that tracking threats is critical to informing your actions. This includes reading our OODA Daily Pulse, which will give you insights into the nature of the threat and risks to business operations.
Now more than ever, organizations need to apply rigorous thought to business risks and opportunities. In doing so, it is useful to understand the concepts embodied in the terms Black Swan and Gray Rhino. See: Potential Future Opportunities, Risks and Mitigation Strategies in the Age of Continuous Crisis
The OODA leadership and analysts have decades of experience in understanding and mitigating cybersecurity threats and apply this real-world practitioner knowledge in our research and reporting. This page on the site is a repository of the best of our actionable research as well as a news stream of our daily reporting on cybersecurity threats and mitigation measures. See: Cybersecurity Sensemaking
OODA’s leadership and analysts have decades of direct experience helping organizations improve their ability to make sense of their current environment and assess the best courses of action for success going forward. This includes helping establish competitive intelligence and corporate intelligence capabilities. Our special series on the Intelligent Enterprise highlights research and reports that can accelerate any organization along their journey to optimized intelligence. See: Corporate Sensemaking
In 2020, we launched the OODAcast video and podcast series designed to provide you with insightful analysis and intelligence to inform your decision-making process. We do this through a series of expert interviews and topical videos highlighting global technologies such as cybersecurity, AI, and quantum computing, along with discussions of global risk and opportunity issues. See: The OODAcast