
Interviewing Tim Weber: Russia, Disinformation, and the Anger in Us


Greta Scott, Aria Guevara, and Paulina Frank interviewed Tim Weber, Head of Editorial and Executive Director at Edelman and former Business and Technology Editor for the BBC News Website, on Russian disinformation and the role of social media in the war on Ukraine.



Greta Scott (GS): Russia has invested billions of roubles in its disinformation campaign. I saw online that you have described both PR and journalism as “storytelling” activities. But how does disinformation work? And how effective are disinformation and communication strategies?


Tim Weber (TW): Disinformation is storytelling, that's the whole purpose of it. Disinformation is about putting out multiple narratives – they can be completely contradictory – to sow confusion, to create distrust, to make people think, “Well, not everything is quite certain – this person says this, that person says that,” and so on. There was a time when most countries had one fairly agreed narrative – that's because they listened to the BBC, they watched the Tagesschau in Germany at 8 PM, they watched Good Morning America, and the whole nation came around this one source of news and truth. For their part, people in the Soviet Union might have read Pravda (Truth) or Izvestia (The News). In Soviet times, there was this joke: there's no pravda in Izvestia and there’s no izvestia in Pravda.


So the whole point of disinformation is usually not to convince people of something else – there's no one out there who wants to convert the world to Putinism, or very rarely will they try to do that. At the start of the war, when they still thought they would win in three days or maybe three weeks, there was an attempt to make Russia look strong, or Putin – the man on his horse with his naked upper body and virility and all this – and they were showcasing how Russian guard regiments were approaching Kyiv. But very, very quickly, as the campaign bogged down, they flipped and went back to the original disinformation campaign. And it's one that goes way back. It goes back to 2014, of course, to the attack on Crimea, when there was a disinformation campaign about what was really happening on the peninsula. And it goes back even further, to when Russia was spreading disinformation about the West, about Ukraine, and about all these states.


So what is the point of disinformation? It is putting out so many different competing stories that people feel they are no longer certain about the truth. Once you have so many rival stories, the truth gets lost in this avalanche of fake storylines. The cleverest thing about storytelling is that every story, even every fairy tale, has a nugget of truth in it. That nugget of truth can be a prince and princess, it can be some nasty miller or some wicked old person who's perceived to be a wizard or a witch, but there's always something that people can relate to. Here again, in a disinformation campaign, they take something that is seemingly real or is at least acknowledged somewhere, blow it up and spin a tale out of it.


Here, some forms of western journalism become unwitting accomplices of disinformation. If you look at, say, the Brexit campaign in the UK, a lot of media organisations said, “Well, some people say this is happening, some people say this might happen, who knows?” and they were obviously spinning it one way or another. Some people say “The Ukrainians are Nazis”, others say “No, they're not.” If you have this false equivalence, instead of asking, “OK, what's really going on?”, then the disinformation campaigners can say, “Oh look, even the BBC is saying, ‘It could be that they're Nazis.’” And from there, they spin the disinformation. There's a good quote, it's not mine: “When a journalist is writing about the weather, they shouldn't say, ‘Some people say it's raining and other people say it's not raining, the sun is shining.’ It's their job to look out of the window and check what the weather is like.” The same is true of how journalists should deal with disinformation: they should check, and as they report it say, “Some people are saying this, but the reality is actually that.” We shouldn’t have this false equivalence, because false equivalence, this forced storytelling, becomes the core of the disinformation.


GS: If you don’t acknowledge the other side at all, do you risk sounding like you’re completely ignoring them – and could that stoke up anger even more?


TW: Yes, you have to quote the other side. That is OK, but if the other side blatantly lies, you just say so. Because otherwise, the next time the BBC covers the launch of, say, a NASA rocket to the moon, will they interview someone from the Flat Earth Society saying, “That's complete nonsense. That can’t be. The moon is just dangling from a fixed ceiling over the earth, and the earth is flat”? No, of course not. And yet that's exactly what the BBC did for climate change. They desperately sought something to balance it out, even though the evidence and the overwhelming majority of experts were on the other side. Journalism has a responsibility to be objective, to give all sides a fair hearing. But objectivity doesn't mean not countering lies and not putting things into context.


Context is the important thing. Nearly every fact can be seen in multiple ways. If you look at the anti-vaxxer campaign (which, by the way, was also dramatically boosted by Russian disinformation campaigns that wanted to undermine western society), you can of course interview a distraught mother whose daughter died shortly after vaccination. And the vaccination may have caused it. You can either do that – devote a whole programme to this mother and give the impression that vaccinations kill, which they can – or you say: there is this very tragic case, but it is the absolute exception, and if we don't have the vaccination, then many people will die. For society as a whole, maybe a few people will die of vaccinations because of a reaction they have, but if you don't vaccinate, multiples of that will die. We have the numbers for that; we've seen this in countries where there were no vaccination campaigns. So journalism is about providing context and therefore reporting in the right way.


Again, this is how we can fight disinformation as journalists: by providing context. The problem, however, is that as a fight against disinformation, that is not enough. A quote attributed to Churchill is: “A lie will be halfway around the world while the truth is still putting its trousers on.” That's exactly the problem with all the fact-checking. Fact-checking is important, absolutely vital, but it's always late. It is therefore important for journalists to work as close to real time as possible, to provide the context and check the facts. But most importantly, if we want to fight disinformation, what the Biden government did in the run-up to the Ukraine War was probably the most important thing ever: what's known as pre-bunking – not debunking disinformation, but pre-bunking it, anticipating what will happen and thereby defusing it in advance. For example, saying, “It's very likely that the Russians will use false flag operations to create a just cause for the war.” The Russians had staged a few terrorist incidents before, in order to justify sending troops into Donetsk and Luhansk.


And so pre-bunking, inoculating people against disinformation, helps. But a big problem is that this may work in the West, under governments which have some degree of credibility, but of course it doesn't work around the world. Here's the fundamental thing about disinformation campaigns: very often, they are so successful because they don't rest on absurd statements; they come across dressed in local clothing. They look like real people, like locals – sometimes they are locals, especially in Africa, sometimes they are paid. We know that the Wagner Group (Putin’s group of mercenaries) is paying influencer networks in Africa, especially in countries like Mali. But more importantly, they can be trolls or bots, yet they look like real people. This happens at every single level. For example, in the United Kingdom right now, there's a series of strikes by railway workers, nurses, and so forth. And there was quite a popular tweet making the rounds, from a guy who looked like me, in his 50s, who said, “OK, I'm done with this. Let's put the cards on the table. I'm a train driver, I earn that much, we just had pay rises, this strike is truly ridiculous.” These numbers were all wrong. They were way too high – the pay rises weren’t that high, the salary wasn’t that high. The funny thing is, they found at least eight different accounts where seemingly normal-looking people said verbatim exactly the same thing, claiming to be train drivers. These accounts are the same accounts that were spreading Brexit lies.


So somebody either on the far right, or maybe somebody in Russia itself – we know a lot of the campaign was financed by Russia – is trying to sow division in society. I don't know whether it is Russia, but it is certainly happening, and it is happening at every level. You find similar things happening with the war in Ukraine. And here, interestingly enough, a lot of this messaging isn't actually about the war in Ukraine at all. It’s saying, “Why is the West bothered at all with Ukraine when they don't tackle this issue, don't talk about that issue?” This is what's known as “whataboutism,” and it is actually one of the most powerful disinformation techniques because it makes people disengage. “Why should I care about that, when nobody cares about something I feel equally strongly about? Why should I care about Ukraine, when nobody cares about what's happening in Tigray?” There, they play all sides: they play the Tigray side, they play the anti-Tigray side, and on and on. The point of the disinformation campaign is not to make people feel something about it; it is to confuse them, to sow doubt, to make people not act. And sometimes that has a devastating impact.


GS: I have read that tweets containing false information were 70% more likely to be retweeted than accurate tweets. Why are we so susceptible to disinformation? What would you say are our biggest vulnerabilities in terms of being exploited by disinformation?


TW: Disinformation preys on our prejudices. We know from analysis of Facebook, Twitter and so on that engagement is highest when something annoys us. This is why positive, good news does so poorly. There's a dopamine hit – “Oh, isn’t that a cute kitty?” – and then it goes away. You need a lot of cute, pretty pictures to stay engaged. But if there’s something that really annoys you, the reaction in your head is much stronger: you're much more engaged, and usually angry. This is why the disinformation campaigns in the run-up to Trump's victory, for example, were so successful. They were trying to polarise people. They were not trying to say to people, for example, “Black Lives Matter”; they were trying to make people upset about some alleged radical excess of Black Lives Matter, or make Black communities upset about some radical thing happening elsewhere. Again, it's about dividing society. Fake news takes a kernel of truth, exaggerates it, and plays exactly into whichever prejudices, whichever hot-button touchpoints there are.


They try all this out. A good troll will constantly fine-tune their messages and see where they get the most engagement. Once they’ve found it, they will go large with it. This is exactly what journalists do. There's a story you think should be performing well, but the headline isn’t doing well (because you see that live on the heat map of your website), so you change the headline a little bit and see if it now gets more clicks. If it does, you’ve learnt something and you use it the next time (hence all the clickbait headlines that have emerged). Troll bots have taken that to the extreme. Where do you get the biggest reaction, the most shares, the most engagement? That's what you go for, that's what you repeat. And the more troll accounts you have at your disposal, the more you can do that.


Aria Guevara (AG): On media platforms like Reddit, there are threads where people have posted uncensored images and videos of the war, making its gruesome realities accessible to anyone, but some argue that this could contribute to desensitising people. Do the benefits of having images and videos public in this way outweigh the potential consequences?


TW: Reddit isn't the problem. The problem is platforms like Telegram, because most of these videos actually originate from places like Telegram. There are channels on Telegram with hundreds of thousands of subscribers that publish the most gruesome images you can imagine, and some of them do it for what they claim to be a good cause. For example, an account called “Spook” posts pictures of what he calls “Cold Boys” and “Burnt Boys,” which refers to dead or burnt Russian soldiers. What he wants to do is what's called “open-source intelligence”: to document what is really happening in the field. Because of that, we know to some extent the true size of Russian losses, both in hardware and, to some extent, in people. I do not recommend that you go to that Telegram channel, but I've had a look at it and it is gruesome. It shows war as it really is. And so maybe that is useful for a few people with romantic notions about what war is, because it isn't [romantic]. It is really gruesome. For that purpose, something like this is important, so there is a positive side.


But at the same time, we should never underestimate the potential depravity of humans. If you look at some of these channels, there are, for example, images published of dead Ukrainian civilians, dead Ukrainian children. The way that is celebrated by Russian speakers on these channels, with likes and heart images, commenting on a two-year-old who died somewhere in Kyiv – that is just absolutely sickening. So, like everything, it's double-edged. I believe that the people who go through these channels for open-source intelligence probably all need counselling, because they will have post-traumatic stress disorder, just like war correspondents and, more importantly, soldiers do. But there are sadly a few people out there who kind of get off on this kind of stuff. Should you ban it? I doubt it. People have tried to ban pornography for ages and it still gets through. So I think it's impossible. It is, just like everything else, a double-edged sword.


Paulina Frank (PF): How do you interpret the strong support for the Russian military? Would you say the intense use of propaganda in Russia is the sole factor behind the strong support for, and within, the Russian military?


TW: I think Russian society is a very special case. Russian society has been subject to disinformation since Soviet times, and this goes back decades. Over decades, Russian society has had drilled into it the superiority of Russian culture and the idea of Russia as a benign empire (even though Russia was a colonialist power across the whole of Asia and, of course, into Ukraine) – that it was never colonialist, that it was always benign, but that it should be an empire. That was drilled into people. Russia has for decades (and it goes all the way back to Stalin) been quite dismissive of Ukrainians, seeing them as a kind of inferior group of people. If something is deeply entrenched in a culture, it is really difficult to take it away. So right now, what we see is that when Russians are protesting, they're not protesting against the war – the war is perfectly fine, because “how dare these Ukrainians not want to be part of Mother Russia” – the only thing they're upset about is that their men don't have enough equipment, have old weapons, and are given food that was produced a decade ago.


If you look at German society up to 1945, and actually even beyond, it took a devastating shock to get some of the German imperialism and German anti-Semitism out of society. Even today it's not gone. 3,000 police officers conducted raids across Germany to catch a far-right terrorist cell that was planning a violent overthrow of the government. They’re anti-Semites, far-right, mentally stuck in a time before even the Nazis. It's 2022, and they literally want to go back to 1871 – it is insane. But it’s really difficult to get these big cultural strands out of the mentality of people. For Russia, I wouldn't even say it's disinformation campaigns. It is embedded deeply in the culture; it has been instilled through propaganda over generations. It goes all the way back to Pushkin, to these people who were “Great Russians” (“Great” as in they wanted a Greater Russia).


PF: News coverage of the war on Ukraine is heavily restricted in Russia and the state media reflect the official position of the Russian government. Could the partial mobilisation of soldiers in Russia encourage people who have newly acquired first-hand experience of the war to rethink their position on the invasion?


TW: It hasn't had an effect on their support for the war in principle. The only thing that they're upset about – seemingly in all the demonstrations, but also in other statements – is that their boys aren't equipped well enough. So support for the war may be undermined, but not for the reason one might hope. The war aims seem to be perfectly fine with the majority of Russians; it's just the conduct of the war they’re unhappy with. Some of the analysts on state television bemoan the lack of weaponry or hope for the arrival of some miracle weapons that will destroy the West. They're still fantasising that Russia should occupy Warsaw and Berlin and obliterate London with nuclear weapons, and all that nonsense. They're not capable of it anymore; they probably never were. I think a lot of that is reflected in the Russian public: they don't want to die themselves, but if they had good weaponry, then I think they would happily march into war, sadly.

