And while, in real life, Ukraine’s president Volodymyr Zelensky has been staunchly defiant, that may not be the case online.
Officials in the Ukrainian government are concerned Russia will start circulating deepfake videos of their leader appearing to surrender.
Warnings posted by the Ukrainian Center for Strategic Communications and Information Security on Facebook urge soldiers and civilians not to believe any video they see showing President Zelensky announcing a surrender.
The warning says that a deepfake video from Russia would be ‘almost impossible to distinguish from the real ones’ that Zelensky has been recording since the start of the invasion.
‘You see it, you hear it – so it’s true. But this is not the truth. This is deepfake technology,’ reads the Facebook warning when translated.
‘This will not be a real video, but created through machine learning algorithms. Videos made through such technologies are almost impossible to distinguish from the real ones.
‘Be aware – this is a fake! Its goal is to disorient, sow panic, make citizens disbelieve and incite our troops to retreat. Rest assured – Ukraine will not capitulate!’
The threat of deepfake videos became apparent during the 2020 election campaign in the US and, as a result, Facebook banned them from its site.
Tech giant Microsoft has also warned of the possibility of Russia deploying the tactic.
In a blog post outlining its involvement in conflict, Microsoft said it was trying to stop ‘state-sponsored disinformation’.
‘The past few days have seen kinetic warfare accompanied with a well-orchestrated battle ongoing in the information ecosystem where the ammunition is disinformation, undermining truth and sowing seeds of discord and distrust,’ the blog post said.
The term ‘deepfake’ is a portmanteau combining the words ‘deep learning’ and ‘fake’, referring to simulated videos of human beings.
The tech uses machine learning or artificial intelligence to combine digital images with audio files to make convincing-looking fake footage of people uttering words they’ve never actually said in real life.
When done well, it can be incredibly difficult to tell fact from fiction.
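The classic face-swap architecture behind many deepfakes pairs one shared encoder with a separate decoder per person: both people's faces are compressed into a common latent space, and swapping decoders at output time re-renders one person's expression with the other's appearance. The toy sketch below illustrates only that structural idea, using random numbers in place of real face images and simple least-squares fits in place of trained neural networks; it is not a working face-swap system, and all names and sizes in it are illustrative assumptions.

```python
import numpy as np

# Toy illustration of the shared-encoder / per-person-decoder idea behind
# classic deepfake face swaps. Random vectors stand in for face images;
# least-squares fits stand in for trained neural decoders.

rng = np.random.default_rng(0)

dim, latent = 64, 8                      # stand-in "image" and latent sizes
faces_a = rng.normal(size=(200, dim))    # stand-in footage of person A
faces_b = rng.normal(size=(200, dim))    # stand-in footage of person B

# Shared encoder: projects BOTH people's faces into the same latent space,
# so the latent code captures pose/expression rather than identity.
encoder = rng.normal(size=(dim, latent)) / np.sqrt(dim)

def encode(x):
    return x @ encoder

# Per-person decoders: map latent codes back to each person's appearance.
dec_a, *_ = np.linalg.lstsq(encode(faces_a), faces_a, rcond=None)
dec_b, *_ = np.linalg.lstsq(encode(faces_b), faces_b, rcond=None)

# The swap: encode a frame of person A, then decode with B's decoder,
# yielding a frame with A's "expression" in B's appearance space.
frame_a = faces_a[0]
fake_b = encode(frame_a) @ dec_b

print(fake_b.shape)
```

In a real system the encoder and decoders are deep convolutional networks trained jointly on thousands of frames of each person, which is what makes the resulting footage so hard to distinguish from genuine video.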