
One aspect of the Russian invasion of Ukraine that has been interesting to observe is the impact that modern technology has had on the conflict.

I don’t say this to make light of or trivialize the matter, but as someone who has the privilege of being in a safe place during this massive crisis, I have been able to observe a few instances where social media and other contemporary technological advances have clearly shaped the nature of this conflict, how it is covered, how we perceive it, and so on.

In particular, I want to talk a bit about deep fakes. It’s been reported that a deep fake video of Ukrainian President Volodymyr Zelensky is making the rounds, telling those fighting to put down their arms; by most accounts, the fake is less than believable.

For those who are not aware, The Guardian defines deep fakes as being fake images and videos that are created using artificial intelligence “deep learning” through the analysis of existing media. Deep fakes are all over the internet, and chances are you have seen at least one. Whether it was covered in the news or made the rounds on TikTok or on other similar apps, deep fakes are everywhere. 

They are also all over films and television these days, with actors being “brought back to life” on screen or dramatically de-aged increasingly often. A notable recent example was in The Mandalorian and The Book of Boba Fett, which both feature Luke Skywalker not played by Mark Hamill, but instead re-created, at least in part, by the use of deep fake technology.

The main point of contention with deep fakes centers around consent, and rightfully so. The vast majority of deep fakes that are created are pornographic, most often featuring female celebrities engaging in sex acts without the involvement or permission of said celebrity. This is flagrantly immoral and is a clear form of gender-based violence, and that fact should be obvious to everybody.

With that said, I think this instance in the Russian invasion of Ukraine represents one of the most prominent examples, if not the most prominent, of this type of technology being deployed for political purposes. As I mentioned earlier, the deep fake was unconvincing and was discovered immediately, but the fact that something like this was attempted should tell us it won’t be long before this technology can be used reliably, making the fight against disinformation that much harder.

Clear, seemingly unedited video evidence is perceived by many people to be the most reliable form of evidence available, even beyond eye-witness reports. Video, as we’ve all come to believe, can’t lie, nor can it “misremember” what really happened. If something has been recorded on video, we are quick to assume that that recording tells us the whole, unbiased story. With deep fakes becoming increasingly commonplace and effective, that will no longer be the case in the very near future.

Efforts will need to be made to guarantee the validity of all communications, but especially official government communications. To further complicate things, the form of validation will have to come in a way that cannot be replicated by any artificial intelligence. As a result, it’s incredibly difficult to say what that should look like, simply because this is such a new problem to have to try to address.
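To make the idea concrete, here is a minimal sketch in Python of what tamper-evident tagging of official video could look like. It uses a shared-secret HMAC from the standard library purely as a stand-in for the public-key signatures a real system would use (efforts along these lines already exist, such as the C2PA content-provenance standard); the key and file contents here are entirely illustrative, not any real scheme.

```python
import hashlib
import hmac

# Hypothetical signing key held by the publisher. A real deployment would use
# public-key signatures so anyone can verify footage without holding a secret.
SIGNING_KEY = b"example-key-not-for-production"

def sign_video(video_bytes: bytes) -> str:
    """Produce a tamper-evident tag for a video file's raw contents."""
    return hmac.new(SIGNING_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, tag: str) -> bool:
    """Check that the video still matches the tag published alongside it."""
    expected = sign_video(video_bytes)
    return hmac.compare_digest(expected, tag)

original = b"...raw bytes of an official broadcast..."
tag = sign_video(original)

print(verify_video(original, tag))         # untouched footage checks out
print(verify_video(original + b"x", tag))  # any edit breaks the tag
```

Note what this does and does not solve: an AI can fabricate convincing pixels, but it cannot forge a valid tag without the key. The genuinely hard problems, and the reason this remains an open question, are distributing and trusting the verification keys, and getting viewers to actually check.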

If we do get to a point where deep fakes are visually indistinguishable from real life, how are we supposed to parse out truth from fiction, especially given the current diffuse media landscape? Even if one source can properly validate that the videos they show from politicians, government officials and world leaders are real, what stops any other media outlet from showing the fake ones?

Clearly, the potential political risks of deep fakes are immense. Without a proper infrastructure in place to validate audio and video from our governments, bad actors (be they untrustworthy states, individuals, or organizations dedicated to misinformation) will be able to benefit immensely from successfully being able to confuse, divide, or potentially compromise a population.

The Russian invasion of Ukraine has caused nations the world over to reconsider their commitment to defense efforts, including here in Canada, where we are apparently gearing up for a defense spending increase. Hopefully the political awakening we are seeing to the realities of modern warfare does not end there.

With things being less safe and stable, and bad actors being more brazen and offensive than in the last couple of decades, we need to make sure that we adequately prepare for the risks associated with deep fakes and other modern technology. If left unchecked, deep fakes in particular could quickly become a serious political threat.