NATO Says Russian “Robotrolling” is Becoming More Sophisticated, but the West is Not Giving Up

The West Takes Aim at Putin’s Fake News Machine

[caption id="attachment_55254501" align="aligncenter" width="4957"] An employee walks behind a glass wall with machine coding symbols at the headquarters of Internet security giant Kaspersky in Moscow, October 17, 2016. (Photo credit: KIRILL KUDRYAVTSEV/AFP/Getty Images)[/caption]


by Maia Otarashvili*

At this point, hacking and disinformation are tried and true hybrid-warfare tools for Russia. Its fake-news propaganda in the West is going strong and, worse, becoming more sophisticated. While the uncertainty of the Trump administration’s Russia policy continues to alarm observers, the United States Congress has made it very clear that it intends to punish the Putin government for its actions against the U.S. and its European allies. Moreover, NATO continues to take Russian disinformation efforts very seriously. The U.S. and its allies may not yet have found the perfect formula for combating Russian disinformation campaigns, but recent developments, in terms of both state-sponsored and NGO-led anti-fake-news efforts, give reason for optimism.

NATO TRACKS RUSSIAN “ROBOTROLLING”



The Baltic states of Estonia, Latvia, and Lithuania, together with neighboring Poland, have found themselves under exceptionally high pressure from Russia in the form of disinformation. The region, with its large ethnic-Russian populations and proximity to Russia, is a prime target for the spread of fake news. However, the Baltic states are also EU and NATO members, which means that NATO resources can be used to study and combat disinformation there. A NATO-sponsored study recently found that Russian disinformation efforts in the Baltics are becoming more sophisticated. The research, which refers to Russian disinformation propaganda on social media as “robotrolling,” has yielded some astounding results.

According to the report, two in three Twitter users who write in Russian about the NATO presence in Eastern Europe are “bot” accounts (robots), rather than human users, and these bots generate 84 percent of all Russian-language messages. In the English-language space, one in four active accounts was found to be a bot, producing 46 percent of the content.

Alarmed by the research results, the Latvian foreign minister, Edgars Rinkēvičs, recently said that the West is failing to come to grips with Russian hacking and fake news. Rinkēvičs is correct: the West has not responded to Russian robotrolling with adequate force or speed, though the situation has begun to improve.

MAJOR CASH INFUSION FOR US ANTI-CYBERTERRORISM EFFORTS



In spring 2016, the US government created an interagency unit called the Global Engagement Center to replace the Center for Strategic Counterterrorism Communications. Based at the State Department, the Center has a staff of approximately 80 people who coordinate the government’s efforts to counter cyber-terrorism. Its initial purpose was to counter ISIS’s efforts online, but as U.S. national security priorities evolved throughout 2016, its mission expanded to include state-sponsored disinformation campaigns emanating from China, North Korea, and Russia. Congress allocated $80 million to fund the Global Engagement Center’s efforts. However, the Center’s new work on combating Russian propaganda has not yet begun in earnest: the new Secretary of State, Rex Tillerson, was slow to request the funds, which had been temporarily parked at the Pentagon and required a formal request from the Secretary of State before they could be transferred to the Global Engagement Center. As Politico reported this summer, Tillerson initially resisted the pleas of State Department officials to spend the money once he had acquired it. Another Politico article explained that Tillerson faced criticism from angry lawmakers on both sides of the aisle over his unwillingness to spend the money and allow the Global Engagement Center to do its work.

In early September, however, State Department officials confirmed that Tillerson had finally requested a transfer of $40 million from the Pentagon to the Center. The nature of the Center’s anti-propaganda efforts and programs will become clearer in the coming months, but the fact that it has finally received the funds it needed is welcome news in the West’s so-far losing battle with Russian disinformation.

The European External Action Service East Stratcom Task Force is the EU counterpart to the US Global Engagement Center. The Task Force has been studying Russian disinformation since 2015 and just released the first searchable database of several thousand examples of disinformation. Its goal is to allow the public to fact-check what they read online.

These are just some examples of the major state-sponsored anti-disinformation efforts underway. Experts continue to scrutinize their potential effectiveness, often pointing to the experimental nature of these taxpayer-funded programs. This makes the anti-propaganda efforts of educational institutions, NGOs, and civil society organizations that much more important.

[caption id="attachment_55254504" align="aligncenter" width="900"] Exercises on cyberwarfare and security take place during the NATO CWIX interoperability exercise on 22 June 2017 in Bydgoszcz, Poland. (Photo by Jaap Arriens/NurPhoto)[/caption]

NON-STATE ACTORS AGAINST FAKE NEWS



Non-state actors also have a prominent role to play in combating fake news. According to Alexandra Sarlo, a Fellow at the U.S.-based Foreign Policy Research Institute, scholars, non-profit organizations, and educators bear a major responsibility:

“Encouraging independent media and thoughtful integration of Russian-language programming into mainstream sources will provide more credible alternatives for Baltic Russian speakers. In the longer term, an important tool for all countries facing propaganda and “fake news” is to increase education in media literacy, critical reading, and technical training to thwart hacks and other attempts to hijack information. A population trained to identify bias is the best defense against harmful propaganda.”

To this end, the work of the Alliance for Securing Democracy is very significant. A bipartisan, transatlantic initiative housed at The German Marshall Fund of the United States (GMF), the Alliance is developing comprehensive strategies to defend against, deter, and raise the costs of efforts by Russia and other state actors to undermine democracy and democratic institutions. One of its missions is “to publicly document and expose Vladimir Putin’s ongoing efforts to subvert democracy in the United States and Europe.”



The Alliance’s work is already coming to fruition. It recently launched the Hamilton 68 and Artikel 38 dashboards. These are innovative tools that “expose the effects of online influence networks and inform the public of themes and content being promoted to Americans by foreign powers.” The dashboards currently operate in English and German. They are updated live, and the format is interactive. Charts and graphs on the webpage display hashtags, topics, and URLs promoted by Russia-linked influence networks on Twitter. According to the dashboard’s creators, “The content it analyzes is not necessarily produced or created by Russian government operatives, although that is sometimes the case. Instead, the network often opportunistically amplifies content created by third parties not directly linked to Russia. Common themes for amplification include content attacking the U.S. and Europe, conspiracy theories, and disinformation. Russian influence operations also frequently promote extremism and divisive politics in Western countries.” The dashboard also lists top domains that amplify these messages, making it a useful tool for everyday users to find out whether or not what they are reading is part of a fake news propaganda campaign.

These dashboards are set to make significant contributions toward fostering greater media literacy among English-speaking and German-speaking societies, or as Sarlo describes it, “a population trained to identify bias.”


LOOKING AHEAD



Unfortunately, the existing plans to combat Russian fake-news propaganda and robotrolling do not offer an immediate solution to the problem. It is much easier for Western experts to deal with English-language robotrolling; when it comes to the Russian-language social media space, the West lacks resources and expertise. According to NATO’s robotrolling report, the English-language bot “population” accounts for approximately 28 percent of the Twitter users surveyed. That figure more than doubles for Russian-language bots, at 70 percent. Yet most Western anti-robotrolling efforts target English-language bots and fake news.

The report concludes: “The social media platform, Twitter, must bear some responsibility. Our impression is that non-English spaces are policed much less effectively, resulting in the toleration of behavior patterns that would normally result in account suspension. This is problematic: Around the world, authoritarian states coerce domestic media into compliance. Social media can offer citizens an alternative space to express their views. Twitter’s ability to serve this function is compromised if the volume of fake activity outweighs genuine content.”

This conclusion is a much-needed reminder that governments alone cannot be responsible for combating Russian robotrolling and fake news. Media companies like Twitter must also step up their efforts and take responsibility for their own shortcomings, particularly when their core functions are being compromised. As the NATO report states, if the content on Twitter, particularly the non-English-language content, becomes largely or even predominantly fake and/or bot-generated, Twitter’s goal of serving as an open space for people to express their views and access information will fail.

*Maia Otarashvili is Research Fellow and Program Manager of the Eurasia Program at the Foreign Policy Research Institute in Philadelphia. She holds an MA in Globalization, Development, and Transitions from the University of Westminster in London, UK. Her current research is focused on the post-communist countries of the Eurasia region, including the Black Sea and Caucasus states.
