European parliamentarians fall victim to fake meetings with the Russian opposition
When Rihards Kols, chairman of the Foreign Affairs Committee of the Latvian Parliament, received an email in March from Leonid Volkov, chief of staff to opposition leader Alexei Navalny, no alarm bells went off. There seemed to be no reason for them to: Volkov simply wanted to consult with him. “Requests like these are normal,” says Kols. Not much later they actually held a short video call, discussing, among other things, the annexation of Crimea and Russian political prisoners. Volkov also wanted to thank Latvia for its support and firm stance on European sanctions.
When Kols closed his laptop, he still sensed nothing out of the ordinary. “The conversation was vague and rather brief,” he later recalled, “but perhaps that was the only unusual thing about it.” Only weeks later did the politician realize he had been deceived, when he heard from Ukrainian colleagues about a video call with the fake Volkov. Unlike in the conversation with Kols, his behavior in that call had been very strange and ‘openly provocative’.
Kols appears to have had a video call with a so-called deepfake: an AI-generated animation that is, as it were, pasted over someone else’s face. In this case, Volkov’s face was placed over that of the man Kols was actually talking to. Politicians from Estonia, Lithuania and the United Kingdom have also been approached in this way.
Critics have warned for years about the rise of such tactics, which could seriously undermine trust in politics: from post-truth to post-reality. We have long had doubts about the truthfulness of statements, and now we cannot even be sure that a statement in a video was ever actually made. With deepfake techniques, it is possible to make any well-known person say anything in a video. There are plenty of examples, and they are becoming increasingly convincing. At the same time, all kinds of accessible apps are coming onto the market (Wombo, for example) that anyone can use to try this out on themselves.
New step
What has now happened with the fake Volkov is a worrying new step, says Jarno Dorsma, a technology trendwatcher who experiments extensively with deepfakes himself. “We all knew this was coming one day, and now it turns out we are in the middle of it. This was done very convincingly.” The difference with Wombo-like apps for home use is that those are just for fun: “Everyone immediately sees that something is off. That is also what makes them fun.”
And unlike earlier examples featuring Obama or Trump, the fake Volkov was not prefabricated: he took part in a live, real-time video call. That is precisely what makes it so alarming, Dorsma warns: “We live in a world of video calls. Politics too. You can no longer assume that the person in the meeting is actually who they claim to be.” According to Dorsma, this is the work of specialists: “This must have been worked on for a long time, possibly by government agencies.”
“Putin’s Kremlin is so weak and afraid of Navalny’s strength that they hold fake meetings to discredit his team. They got through to me today. They won’t broadcast the parts where I call Putin a murderer and a thief, so I’ll put them here.”
Tom Tugendhat, April 21, 2021
British politician Tom Tugendhat also alludes to this: “Putin’s Kremlin is so afraid of Navalny that they are now organizing fake meetings to discredit Navalny’s team.” Kols calls it a “painful lesson. Unpleasant and uncomfortable.” But he also writes: “Perhaps we should thank the fake Volkov for that lesson.” (The real) Volkov, finally, remarks on Facebook that the deepfake does look a lot like his real face.