Ex-East StratCom Task Force stalwart Jakub Kalenský on EU efforts vs Russian disinformation
Jakub Kalenský was among the first to join the skeleton staff of the East StratCom Task Force, the European Union’s first direct initiative to identify, debunk and counter Russia’s disinformation campaigns. For the first year or so after the Task Force was established in the summer of 2015, the Czech former journalist was also the only team member devoted solely to that monumental task.
The situation was such that, less than two years into the project, in March 2017, more than 120 people from across government, media, security and civil society signed an open letter accusing the EU of being “irresponsibly weak” on disinformation.
In a wide-ranging interview, Mr Kalenský shared his insights into the nature of Russia’s disinformation efforts and the varying degrees of awareness in Europe about the extent of the problem. I began by asking him how – on a shoestring budget – the team managed to make any difference at all.
“Well, it’s true that the beginnings were a bit tough. We were a team of seven, and we were supposed to cover the three objectives in the action plan approved by the European Council conclusions in 2015 [in which] the 28 heads of state agreed to challenge the ongoing Russian disinformation campaign.”
“How do we do that? These were the three objectives in the action plan. The first of them was better communicating EU policies in the Eastern Partnership countries [Armenia, Azerbaijan, Belarus, Georgia, Moldova and Ukraine] or ‘Neighbourhood’. The second objective was supporting independent media in this region. And only the third objective was to raise awareness about Russia’s disinformation campaign.”
“So, the majority of my colleagues were working on objectives one and two. And it’s true that in year one, it was only me who was fully dedicated to objective three – although I was usurping quite a lot of their time, because I really needed some help in promoting the products.”
“When you don’t have the resources, you have to adjust to the situation. There was the option that we would only be window dressing – and we didn’t want to be that – so we cooperated a lot with partners outside of Brussels. We tried to identify those people and organisations who were already working on countering disinformation.”
“The role model was definitely the Ukrainian group Stop Fake, who are really the pioneers of countering disinformation via exposing it. And when you expose the disinformation, you have to say why it is disinformation, so there is the element of debunking, fact-checking. So we tried to basically ‘copy-paste’ this model and raise it to the European level. We weren’t 100 percent successful, but it was a step in the right direction, I believe. And we got a lot of praise for that.”

What was a typical day like for you in that first month or two? And how did the work evolve over time? You got more help, as you say…
“Well, over the first month we were discussing what we were going to do. In the second month, after the implementation, I think three or four days a week were dedicated to producing the DisinfoReview, our flagship product where we tried to collect and expose as many disinformation messages as we could.”
“If you highlight disinformation when it is still only in the Russian-language media space, you can actually warn other language spaces – we heard it many times from Lithuania, from the Czech Republic, from the UK – that we had shown them disinformation before it was translated into their languages, so they were prepared that it was coming.”
“So that was three or four days per week, and the rest was raising awareness, so speaking at conferences and seminars, publicly trying to engage with the member states; very often they were reaching out to us for advice, because it was a new problem for pretty much everybody in 2015. I think within the European Union, only the Baltic States were experienced enough.”
“We also gave a lot of background briefings for journalists – we didn’t talk to them on the record but were briefing them off the record. And obviously, you have to read quite a lot. Those days spent compiling the DisinfoReview were probably the hardest, because – and you know it yourself from doing journalistic work – verifying and fact-checking is quite time-consuming. And watching the aggressive TV shows on Russian state channels – that can be exhausting!”
Almost like entering a kind of Alice in Wonderland alternative universe, I imagine.
“Maybe more like George Orwell’s ‘Two Minutes Hate’ [from his novel 1984] – but it goes on for three hours!”
At the time you joined, was there already wide awareness of the now well-publicised Russian ‘troll factories’ – the [St. Petersburg-based] Internet Research Agency and such outfits?
“I believe that in the frontline states there was quite a high awareness about this phenomenon. The brave Russian journalist Lyudmila Savchuk had already infiltrated the Russian troll factory, and she was there on the day that Boris Nemtsov was killed, at the end of February 2015. And she was describing the pattern of work that we have seen so many times since then.”

“When Boris Nemtsov was killed, the bosses told them to stop everything and start spreading conflicting stories about what happened. So, person A will say he was killed by the Americans, to blame Putin; person B will say he was killed by the Ukrainians, to blame Putin; person C will say he was killed by the Russian opposition, to blame Putin.”
“And you see here that in these situations, the aim is not really to persuade you about one version of events but actually to exhaust your critical thinking, so that the average media consumer will end up saying: there are too many versions and I don’t know which one is right.”
By some estimates, the Kremlin devotes well over €1 billion a year to its propaganda machine – including via open official channels such as RT television, which has a presence in some 100 countries, and the state news agency Sputnik, which was launched a few months after Russia’s illegal annexation of Crimea from Ukraine and publishes stories in 33 languages.
On top of that, the now infamous “troll farms” that systematically spread disinformation are also heavily financed from the state budget, and their content replicated by thousands of purported “news” websites around the globe. By comparison, the EU’s East StratCom Task Force has an annual budget of under €1 million – a thousand times less.
EU Justice Commissioner Věra Jourová:
“This is not enough. In June this year, the European Council tasked the Commission and the External Action Service to develop an action plan on tackling disinformation. This Action Plan, which we are currently working on, will outline specific proposals for a coordinated EU response to the challenge of disinformation. We will build on existing initiatives, such as the work of the East StratCom Task Force, which was created in 2015 to address Russia's ongoing disinformation campaigns.”
The EU executive arm is now drafting an action plan to be presented to EU member states in December. It is expected to call for significantly bolstering the East StratCom team ahead of the next elections to the European Parliament, this coming spring.
“Mr. President, honourable members, I would like to thank the Parliament for this most urgent plenary debate. We will have elections to the European Parliament in May 2019, and 50 various elections will have taken place by 2020. We have to adjust the protection of the integrity of our elections to the digital age. We have to protect our democratic processes from new ways of manipulation by third countries or private interests.”

EU Commissioner Jourová, a Czech national and former MP, addressing a plenary session of the European Parliament in Strasbourg on 14 November, warned that the risk of interference in and manipulation of European elections “has never been so high,” stressing that the Cambridge Analytica/Facebook scandal sent shockwaves through democratic systems worldwide by showing how easily voters’ personal data and opinions can be manipulated.
The European Commission recently announced a code of practice on disinformation that tech platforms including Facebook, Twitter and Google have said they will abide by. But Commissioner Jourová says that in itself will never be enough. She is calling for greater funding for the European External Action Service – the EU’s diplomatic corps, which oversees the East StratCom Task Force and is drafting the action plan – and greater coordination with other bodies working to combat hybrid threats.
“Investigations are ongoing into allegations of dark financing from undisclosed, third country sources. The most cited source of activities interfering with elections in Europe is Russia…. Given the very nature of our Union, electoral interference in one Member State affects the EU as a whole. National authorities cannot address these threats by working in isolation, nor can private sector self-regulation solve it all.”
According to Jakub Kalenský, the East StratCom Task Force already cooperates with a volunteer network of some 500 NGOs, diplomats, think-tanks and other media professionals who regularly monitor and forward on examples of disinformation to the team.
“When you imagine that a third of people may believe a version of events that contradicts all the facts, and that someone is playing this audience like a violin, this can really be dangerous – especially when you keep in mind that during elections and referenda, you do not need to persuade 51 percent of the population.”
“When we look at the Brexit referendum, the U.S. presidential elections [in 2016] or even the Czech elections this year, these were all decided by a very small percentage of the population. And if you know how your audience is going to behave, if you can control what information it receives – and if you know that it doesn’t have to have anything to do with reality – this is a very dangerous cocktail.”

Speaking of the Czech Republic, this country has a special centre under the Interior Ministry that is devoted to disinformation and terrorism – linking them, putting them in the same basket, as it were. Is this unique to the Czech Republic? And to what degree did you work with this centre?
“It is true that you do not see it extremely often, that disinformation is in one basket with terrorism. It might be unique to the Czech Republic. I believe the most similar organisation I have seen, though it is a much bigger organisation, would be the MSB in Sweden – the Civil Contingencies Agency – which is preparing civil servants, but also politicians, decision-makers, for critical situations, such as disinformation campaigns during elections. But they are also working on protecting soft targets against terrorist attacks.”
“We were certainly in touch – obviously it helped the people in Prague to be in touch with someone in Brussels who can share the experience with them, and for me obviously it was also very useful to have someone here in Prague because otherwise I would rely only on information from the media – which I believe report actually quite well. And I think it is fair to say that the Czech outlets were much quicker in realising that this [disinformation] is a problem than many of the western European outlets. But it’s always better to have the primary source of information.”
Jakub Kalenský left the East StratCom Task Force in November to join the Atlantic Council think tank as a Senior Fellow specialising in pro-Kremlin disinformation campaigns. He will also lead the disinformation aspect of the Ukrainian Election Task Force. Apart from all that – and working to raise awareness of the gravity of the issue especially in countries not on the ‘frontlines’ – he hopes to build additional bridges between governments and civil society groups active in this area.
Although he expects to be working from Prague for about two weeks out of every month, he does not expect – nor does he see the need – to devote more time to the issue in his home country.
“If I evaluate the European countries, I don’t think that the Czech Republic is a country where you need to persuade a lot of people that more needs to be done. I think actually, it is among those states where the response is slightly better than average, and on the working level, I think it’s brilliant, because it’s the Ministry of Interior, the Ministry of Defence, the Ministry of Foreign Affairs, working on this problem.”
“Also the secret services are working on this topic – although I believe they could help the journalists more, for example, when it comes to tracking who is the owner of the [anonymous disinformation site] Aeronet and how they are getting their instructions. When you look for example at Estonia, KAPO, the Internal Security Service, is really exposing quite a lot of information, and the Czechs could also be doing more. Still, at least on the working level – and on the level of civil society – we do have some NGOs and think-tanks that are working on this topic. We have journalists working on this topic.”

“But it is not only these dozens and dozens of disinformation websites but the social media trolls you have mentioned; it’s also the chain emails that not many people in the West know about, but here in this region of Europe, they are extremely effective, because people over 50, 55, do not use social media – these chain emails are their social media.”
“It’s also the Russian state TV, which is crucial for the Russian-speaking minority – we don’t have so many Russian speakers here in the Czech Republic, but it is a very important audience not only in the Baltic States but also in Germany, as we have seen in the last German elections. So, they are trying as many tools as possible to have a good chance of achieving the result.”
“We can see that actually the Russians are really good at adjusting the message for the right audience, at finding the right target audience. They do have different messages, not only for different countries but also for different socio-economic groups. They have different messages for a high-ranking diplomat and for a poorly educated senior living in the countryside.”
“The development I see in micro-targeting of the messages is scary. It is fashionable to talk about the so-called deep fakes – the new technology tools that allow you to fake a video, which are becoming so cheap that they will probably be quite easy to use. But I’m a bit more scared about this micro-targeting, because it seems to me that the development there is heading to a point where you can have a different disinformation message for so many audiences that we simply won’t be able to catch up.”