24 July 2017

Is the Strategic Corporal on Your Twitter Feed?

Understanding the levels of conflict—strategy, operations, and tactics—can help policymakers mitigate the effectiveness of a disinformation campaign. Russia's operation against the 2016 U.S. election is a case in point. 

Melissa K. Griffith is a graduate student at the University of California, Berkeley, and Trey Herr is a post-doctoral fellow with the Belfer Center’s Cyber Security Project at the Harvard Kennedy School of Government.

When did your Facebook page become a weapon? The dissemination of targeted information and propaganda has been an enduring characteristic of local and international politics, yet Western discussions of cybersecurity have historically given it comparatively little emphasis. This changed with Russian efforts to undermine the legitimacy of the 2016 U.S. election. These efforts were neither the first nor the last use of information as a tool of conflict, but they were certainly among the most dramatic in recent memory.

In trying to structure an evaluation of information operations like Russia’s work to undermine confidence in the recent U.S. presidential election, analysts face the challenge of the strategic corporal in especially dramatic fashion: tactical behaviors can rapidly have strategic effects. Much as the actions of a corporal interacting with an angry crowd can resonate across oceans, a few users reposting on Twitter can spiral into a campaign of rumor and falsehood that shakes the faith of an electorate.

How then do analysts distinguish between levels of war in information operations and appropriately categorize means and effects? The traditional paradigm considers three levels—strategic, operational, and tactical. The strategic is the realm of state power: diplomatic tools mated with military force to advance national priorities. The operational level of war considers the combined engagements of an ongoing campaign, along with the logistics and planning supporting them (e.g. how an air tasking order is generated to manage the flights of bombers and refueling aircraft). The tactical level, in contrast, is concerned with the particularities of a single engagement (e.g. whether an armored vehicle’s main gun can penetrate a given thickness of armor).

In traditional conflict, these levels of war are debated but largely clear, the product of hundreds of years of martial tradition and scholarship. In cyber conflict they are far less clear—a subject of extensive discussion several weeks ago at the State of the Field of Cyber Conflict conference hosted by Columbia University. Why then apply the levels of war to cyber conflict? They provide an important means of parsing strategic purpose from the operational and tactical activities supporting it. Russian efforts to undermine the 2016 U.S. election generated a strategic shock, one which resonated throughout the electoral process and preyed on a tremendous rural/urban divide in the U.S. electorate. Responses ran the gamut, from demands that social media companies like Facebook and Twitter do more to prevent the spread of disinformation to calls for the United States to send more arms to the Ukrainians and “fry Russian computers”. A clearer categorization would be useful to distinguish between the Russian strategy, the operations carried out to achieve that strategy, and the tactics deployed within those operations.

Strategic Information

The Russian strategy appears to have centered on the twin goals of shifting public opinion against Clinton and undermining American confidence in the credibility of the democratic process. There is evidence that part of the underlying motivation for driving negative public opinion was retribution against Clinton for what Putin viewed as her role in fueling massive protests against the Kremlin in 2011. More broadly, efforts to undermine confidence in the electoral process also held the potential to affect future elections and fragment U.S. politics, paralyzing decision-making.

Operations of Influence

Russia pursued a multi-faceted campaign to achieve these two strategic goals, running a massive espionage and targeted-leak operation in parallel with one to compromise local voter records infrastructure and another to sow disinformation and spread rumors. The first consisted of espionage efforts against U.S. political organizations. The dissemination of the stolen information was tailored in terms of content, timing, and target population. Russia’s use of the Guccifer 2.0 persona as a shill and its collaboration with Wikileaks to disseminate materials from the Democratic National Committee (DNC) typified a tendency to use third parties and invented personas as the public face of these efforts.

A parallel Russian operation focused on compromising local election boards and voter registration databases. We are aware of no evidence in the public domain that these compromises, or any access obtained through them, had an impact on vote tallies at the state or local level.

A third operation focused largely on propaganda. This was a joint effort between Russian state-sponsored media and what the U.S. intelligence community described as a “network of quasi-government trolls” operating online. Together they created and disseminated content to domestic and international audiences. Examples include the consistently negative coverage of Clinton on Russia Today (RT) and the proliferation of fake news on social media platforms like Facebook and Twitter.

Tactics in Information Warfare

To produce these strategic effects, Russia employed a variety of tactics. Within its propaganda operation, Russia leveraged its network of quasi-government trolls, supplying them with content and with targeting information on potentially receptive users across different social media platforms. Organizations in the Russian government, in conjunction with a network of outside groups, used targeted data collection to acquire personal information from social media users in order to tailor content to particular demographics, create fake accounts, and alter or take over existing accounts. Using this information, these groups created new accounts and content, including fake news stories and highly viral memes. By distributing content to an army of trolls, the Russians could falsely amplify the perceived attention paid to these stories and memes, using newly created accounts to share, comment on, or ‘like’ disinformation across social media platforms. These efforts were further amplified and aided by real users commenting on, liking, and sharing the stories themselves.

Taken together, a levels-of-war analysis highlights how the tools and goals of information operations largely differ at each level. At the tactical level, Facebook has published a White Paper on information operations detailing the tactics deployed on its platform and the tactical responses it is building to mitigate the effectiveness of such tactics in the future. At the operational level, there has been little serious effort to moderate the pace of information flow in the media cycle, even as groups clearly recognize the risk of becoming a useful fool by repeating rumor and speculation without verification.

Using the levels of war to parse information operations like the Russian attempts to influence the 2016 U.S. election and the 2017 French election can help researchers and practitioners alike. It shapes the way both groups understand events and calculate responses. It highlights how, in cyberspace-facilitated information operations, tactical behaviors can have strategic outcomes, affecting the broader military and political environment. While Facebook may very well be on the front lines of information warfare at the tactical level, it does not necessarily follow that it is, or should be, on the front lines of determining broader U.S. strategy. Understanding these differences systematically can help drive new priorities in research and sharpen policymakers’ grasp of the options arrayed before them in response.
