23 April 2018

The {Cyber} Guns of August

Michael Senft

"History doesn't repeat itself but it often rhymes.”

-- Mark Twain

“Why did the lessons of Stuxnet, WannaCry, Heartbleed and Shamoon go unheeded?” the inquisitive student asked the doleful professor, whose withered, prematurely aged face bore witness to the shattering of a hyperconnected world. Today students ask the same questions about the Russo-Japanese War and the Spanish Civil War. Voluminous accounts detailed the terrible lethality of modern weaponry at the Siege of Port Arthur and the Battle of Mukden, which foretold the unimaginable bloodshed of the First World War. [1] Likewise, the Spanish Civil War was a harbinger of blitzkrieg warfare and the unspeakable carnage unleashed during the Second World War. [2,3,4] Despite insightful analysis and almost clairvoyant assessments, the lessons from both conflicts were largely ignored because they ran counter to prevailing views, established organizational structures and pre-ordained plans. Are we any different today?

The Shadow of Stuxnet

“Cyber activity is a means to an end for each adversary.”

-- Benjamin Runkle [5]

It may be a shock to some, but computer systems and software are less secure today than they were in the 1970s, due to exponential increases in complexity. [6] More complex systems have more lines of code and more interactions within that code, which increase the probability of security vulnerabilities. [7] Pervasive interconnectivity, complex systemic dependence on information technology, compressed technology development cycles and, ironically, cyber security research itself have converged to create a tinderbox of volatility. A mere spark has the potential to transform this tinderbox into a global conflagration. Although the world today is vastly different from that of 1914, contemporary geo-political events continue to foster a similarly fragile environment predisposed to a “Guns of August” event, in which offensive cyber activity initiates a chain reaction of escalation that shatters our hyperconnected world. The probability of such an event grows as systemic technical and operational vulnerabilities emerge from the shadows of decades of complacency and neglect. The proliferation of poorly secured Internet of Things (IoT) devices provides a rapidly recruitable bot army ready to execute cyber-attacks at a moment’s notice. [8] Speed of execution, technical abstraction and the complexities of attribution provide fertile ground for misconceptions, miscalculations and mistakes in responding to offensive cyber activity. [9] As highlighted in Attributing Cyber Attacks, “governments get to decide how to do attribution, and they get to decide when attribution is good enough for action.” [10]
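
The intuition that complexity compounds insecurity can be made concrete with a back-of-the-envelope count before returning to the question of attribution. The short Python sketch below is purely illustrative and not drawn from the cited sources; the component counts and the assumption that any two components can interact are mine. It simply shows that the number of potential interactions, and therefore the number of places a flaw can hide, grows far faster than the number of components.

    # Illustrative sketch only: assumes any two components in a system may interact.
    # The point is the shape of the growth, not the specific numbers.

    def pairwise_interactions(components: int) -> int:
        """Distinct component pairs: n * (n - 1) / 2."""
        return components * (components - 1) // 2

    for n in (10, 100, 1_000, 10_000):
        print(f"{n:>6} components -> {pairwise_interactions(n):>12,} potential interactions")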

Given the considerable challenges of attribution and the potential impact of offensive cyber activity, even a small-scale cyber-attack by nation-state or non-state actors can sow misconceptions, miscalculations and mistakes, which could lead to responsive actions by one or more impacted countries. It is easy to forget that the First World War did not start immediately after the assassination of Archduke Franz Ferdinand, but rather weeks later, following rejected ultimatums from Austria-Hungary for Serbia to submit to several unpalatable demands and from Germany for Russia to halt its mobilization of military forces. [11] A scenario in which one nation demands that another hand over an individual or group of hackers identified as responsible for a specific cyber-attack is quite plausible, as is the prospect of that demand being refused for political or other reasons.

Today, potent cyber-attack capabilities can be easily acquired from various legal and illegal marketplaces, just as the bomb thrown at Archduke Ferdinand’s vehicle and the pistol that fatally wounded him were readily available in 1914. [12] The secretive nature of these offensive cyber capabilities has created a global digital arms race in the shadows of cyberspace. Cyberspace provides an environment where the attacker possesses the initiative and maintains a distinct advantage over the defender. As a result, the inflexible mobilization timetables of the First World War have been replaced by the tyranny of microseconds to execute a first strike and retain the initiative.

Cult of the Offense

“The best defense is a good offense.”

-- Jack Dempsey

While countries publicly denounce offensive cyber activity, espionage executed via cyber means is treated as acceptable state behavior. Existing international law carried over to cyberspace allows that “responsible nations may carry out cyber espionage (violating a system’s confidentiality), but they may not carry out cyber-attacks (operations that violate a system’s integrity or availability)”. [13] Even though organizations such as the United Nations urge global rules for cyber warfare, the ability to violate a system’s confidentiality in peacetime under the guise of espionage all but ensures the digital arms race will continue to accelerate, since violating a system’s confidentiality also provides, in most cases, the ability to violate that system’s integrity and/or availability. [14]

This paradigm has created a cult of the offense in cyberspace. A cult of the offense also drove French strategy heading into the First World War, codified in Plan 17, which focused solely on the offensive because “the offensive alone is suited to the temperament of French soldiers.” [11] Despite having obtained an early version of the Schlieffen Plan in 1904, the French deployed their forces to mass superior numbers against German forces in Alsace and Lorraine instead of defending against the main German thrust through Belgium. Limited French success in Alsace and Lorraine quickly became irrelevant as German forces rapidly outflanked the French armies deployed along the shared border.

The contemporary cult of the offense is reluctant to acknowledge that the Internet has created an exposed, high-speed, low-cost attack vector against U.S. critical infrastructure and key resources, leaving them vulnerable to a wide range of adversaries seeking to harm U.S. interests. The authors of Plan 17 similarly fixated their hopes, strategy and training on the offense at the expense of defensive operations, refusing to acknowledge inconvenient realities that ran counter to their pre-ordained plans.

We Slept at Dawn

"Everyone has a plan until they get punched in the mouth.”

-- Mike Tyson

Cyber-attacks are enabled through the exploitation of vulnerabilities, not through the generation of force. [15] Investment in defensive cyber operations has been considerable, yet defensive concerns remain secondary to ensuring the availability of IT systems, and offensive cyber operations remain the priority for resources, despite overwhelming evidence that state and non-state actors are focused on disrupting U.S. advantages in communications. [16]

Investments in defensive cyberspace operations have centered on defense-in-depth, in which multiple layers of security controls are built into an IT system. Defense-in-depth is good in theory, but it provides little protection when a system can suffer a total compromise from a single unknown attack. [6] The security measures Apple and Google implement in their mobile operating systems are meaningless if an adversary can gain access to the system kernel via a vulnerability in one of a phone’s many sub-components, such as the Wi-Fi or Bluetooth controller. [17] Both our hardware and software are Frankenstein-like creations: sub-components sourced from a myriad of original equipment manufacturers (OEMs) operating on razor-thin profit margins, and millions of lines of code, often kludged together as quickly as possible to reach consumers, with security a secondary concern. [18,19,20] Even more insidious are fundamental hardware design vulnerabilities such as Meltdown and Spectre, which lay undiscovered for decades. [21,22] The likelihood of future unknown attacks capable of a complete system compromise brings into question the wisdom of a defense-in-depth focus.
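
A rough arithmetic sketch makes the point. The layer count and per-layer catch rates below are hypothetical assumptions for illustration, not figures from the cited sources; they show why layered defenses that look formidable against known attacks offer no protection against a single unknown attack that lands beneath them.

    # Hypothetical numbers for illustration only.
    # Against known attacks, independent layers multiply their effect:
    # three layers that each stop 90% of attempts leak only 0.1^3 = 0.1%.
    known_attack_leakage = 0.10 ** 3

    # A previously unknown exploit in a sub-component (e.g. a Wi-Fi
    # controller or the kernel) is stopped by none of the layers above it.
    unknown_attack_leakage = 1.0

    print(f"Known attack reaches the system:   {known_attack_leakage:.3%}")   # 0.100%
    print(f"Unknown attack reaches the system: {unknown_attack_leakage:.0%}")  # 100%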

The forts of Liège and Namur were the epitome of defense-in-depth in 1914, with disappearing gun turrets, interlocking fields of fire and 30-foot-deep moats. [23] Designed to hold out for weeks under continuous attack, these fortifications fell within days, victims of siege bombardment by previously unknown gigantic cannons. While the outcome of an unknown cyber-attack lacks the visceral impact of a 16.5-inch shell, the effect is the same: defenses are neutralized, enabling swift exploitation.

In his USENIX Enigma 2016 presentation “Disrupting Nation State Hackers,” Rob Joyce, former Chief of the National Security Agency’s Tailored Access Operations, highlighted: “If you really want to protect your network, you really have to know your network.” [24] How many organizations put in the time to know their network better than the people who designed it and the people who are securing it? Compare this with the level of effort being applied to gain the same depth of knowledge of target networks for offensive purposes.

John Boyd – Reloaded

“He who can handle the quickest rate of change survives.”

-- John Boyd

Observe, Orient, Decide, and Act. The OODA loop is the distillation of John Boyd’s voluminous work on understanding conflict and strategy. Regardless of warfighting domain, the combatant able to consistently execute the OODA loop more quickly gains a decided advantage as the adversary’s decision-making process becomes overwhelmed by continuous change. [25] In Patterns of Conflict, Boyd highlighted that the initial success of Blitzkrieg at the start of the Second World War was enabled by mission-focused Auftragstaktik, which gave subordinates freedom of action and produced “many (fast-breaking) simultaneous and sequential happenings to generate confusion and disorder”. [26, 27] The importance of Auftragstaktik cannot be overstated: it enabled German forces to get inside the French OODA loop, magnifying friction and producing a paralysis that ultimately brought about the French collapse, despite the French advantage in the number and quality of tanks and their extensive fortifications.

Patterns of Conflict also provides a useful template for analyzing conflict in cyberspace. At the strategic level, applying the OODA loop to the defense of U.S. critical infrastructure and national security systems exposes a complex labyrinth of legal authorities and byzantine Cold War-era command and control structures. Cold War-era assumptions and timelines raise questions about the ability of the U.S. government to effectively coordinate response and recovery actions for a wide-scale cyber-attack and the cascading effects that would result. At the operational and tactical levels, questions arise about the ability of DoD and other government agencies, operating under different authorities and chains of command, across interconnected networks configured with different cybersecurity systems, processes and procedures, to execute the OODA loop rapidly enough to respond adequately to adversary activity. The challenges of rapidly and accurately executing the OODA loop in cyberspace provide fertile ground for the same misconceptions, miscalculations and mistakes that sounded the guns of August.

Lessons from history highlight the importance of executing the OODA loop more rapidly than an adversary, regardless of the domain. Cyberspace is the most rapidly changing of warfighting domains, and observing adversary cyber activity within it demands fully knowing and understanding the devices, security controls and technologies that make up our information technology networks. Likewise, rapidly feeding, processing and fusing this information into an accurate orientation for decision making requires adaptive, agile and efficient processes and procedures.

The 2018 National Defense Strategy recognizes the need to evolve innovative operational concepts, a need that is acute in the execution of cyberspace operations. [28] Rapidly executing the OODA loop for cyberspace operations requires a paradigm shift in command and control, transitioning from a traditional top-down hierarchy to a more agile and adaptive construct grounded in disciplined initiative. One such concept is a wirearchy, defined as a “dynamic two-way flow of power and authority based on knowledge, trust, credibility and a focus on results.” [29] While the concept of a wirearchy seems radical, so was the widespread use of radios to execute military operations at the start of the Second World War. Pre-war French tactical doctrine emphasized conformance to planned maneuvers, with the result that only twenty percent of the French tank force was fielded with radios. This stands in stark contrast to the near-universal fielding of radios in German tanks, which enabled Auftragstaktik and Blitzkrieg. [30] Established organizational structures and pre-ordained plans within the French military derailed attempts to field radios more widely until it was too late. The French experiences in the First and Second World Wars are lessons that overemphasis on either the offense or the defense can result in strategic defeat, and they highlight the importance of agile and adaptive command and control structures capable of rapidly executing the OODA loop. Only time will tell whether operational concepts for organizing and employing cyber forces will evolve to meet the challenges of future conflict, or whether these innovative concepts will be stifled because they run counter to prevailing views, established organizational structures and pre-ordained plans.

Reflections on Trusting Trust

“You can't trust code that you did not totally create yourself.”

-- Ken Thompson

“It is difficult to get a man to understand something, when his salary depends upon his not understanding it,” [31] the professor wryly replied. “Who really benefits from secure computers?” he asked with a morose smile. “Consumers get less expensive systems and software with more features; companies are able to sell anti-virus and other cyber security products and services; and governments are able to more easily execute espionage and mass surveillance operations. Stuxnet, WannaCry, Heartbleed, Shamoon: these were all warning shots that were largely ignored, because resolving their root causes within an open Internet ecosystem required expensive, disruptive and problematic solutions involving tradeoffs between security, privacy and availability.” Perhaps only the trauma of catastrophe can ignite the paradigm shift necessary to overcome a half-century of cyber security complacency and embrace the difficult work of striking a balance between security and availability. Thirty days sealed the fate of a generation a century ago. Perhaps only thirty milliseconds will be needed to seal the fate of another.

The views and opinions expressed are those of the author and not necessarily the positions of the U.S. Army, Department of Defense, or the U.S. Government.

References

[1] Sisemore, J. (2003). The Russo-Japanese War, Lessons Not Learned. US Army Command and General Staff College. Retrieved from http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA430841

[2] Oppenheimer, P. (1986). From the Spanish Civil War to the Fall of France: Luftwaffe Lessons Learned and Applied. The Journal of Historical Review. Retrieved from http://www.ihr.org/jhr/v07/v07p133_Oppenheimer.html

[3] Corum, J. (1998). The Spanish Civil War: Lessons Learned and Not Learned by the Great Powers. The Journal of Military History. Retrieved from www.jstor.org/stable/120719

[4] Pisano, D. (2013). American Military Aviation in the Interwar Years and After: Some Historical Reappraisals. Smithsonian National Air and Space Museum. Retrieved from https://airandspace.si.edu/stories/editorial/american-military-aviation-interwar-years-and-after-some-historical-reappraisals

[5] Runkle, B. (2016). The Best Strategy for Cyber Conflict May Not Be a Cyber Strategy. War on the Rocks. Retrieved from https://warontherocks.com/2016/11/the-best-strategy-for-cyber-conflict-may-not-be-a-cyber-strategy/

[6] Garfinkel, S. (2016). The Cybersecurity Mess. Retrieved from http://simson.net/ref/2016/2016-12-14_Cybersecurity.pdf

[7] McCabe Software, Inc. (n.d.). More Complex = Less Secure.

[8] Greenberg, A. (2017). The Reaper IoT Botnet Has Already Infected a Million Networks. WIRED. Retrieved from https://www.wired.com/story/reaper-iot-botnet-infected-million-networks/

[9] Davis, J. (2017). Stateless Attribution: Toward International Accountability in Cyberspace. Santa Monica, CA: RAND Corporation. Retrieved from https://www.rand.org/content/dam/rand/pubs/research_reports/RR2000/RR2081/RAND_RR2081.pdf

[10] Rid, T. & Buchanan, B. (2015). Attributing Cyber Attacks, Journal of Strategic Studies, 38:1-2, 4-37. Retrieved from http://dx.doi.org/10.1080/01402390.2014.977382

[11] Tuchman, B. (1962). The Guns of August. London: Penguin, 2014.

[12] The Economist. (2013). The Digital Arms Trade. The Economist (US). Retrieved from https://www.economist.com/news/business/21574478-market-software-helps-hackers-penetrate-computer-systems-digital-arms-trade

[13] Libicki, M. (2017). The Coming of Cyber Espionage Norms. 2017 9th International Conference on Cyber Conflict (CyCon). Retrieved from https://ccdcoe.org/sites/default/files/multimedia/pdf/Art%2001%20The%20Coming%20of%20Cyber%20Espionage%20Norms.pdf

[14] Khalip, A. (2018). U.N. Chief Urges Global Rules for Cyber Warfare. Reuters. Retrieved from https://www.reuters.com/article/us-un-guterres-cyber/u-n-chief-urges-global-rules-for-cyber-warfare-idUSKCN1G31Q4

[15] Libicki, M. (2009). Cyberdeterrence and Cyberwar. RAND Corporation. Retrieved from https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG877.pdf

[16] U.S. Department of the Army. (2014). The Army Operating Concept: Win in a Complex World. TRADOC Pamphlet 525-3-1. Retrieved from http://www.tradoc.army.mil/tpubs/pams/tp525-3-1.pdf

[17] Greenberg, A. (2017). How a Bug in an Obscure Chip Exposed a Billion Smartphones to Hackers. WIRED. Retrieved from https://www.wired.com/story/broadpwn-wi-fi-vulnerability-ios-android/

[18] Mishkin, S., & Palmer, M. (2012). Foxconn survives on thin slices of Apple. Financial Times. Retrieved from https://www.ft.com/content/170a225c-0356-11e2-a284-00144feabdc0

[19] Schneier, B. (2014). The Internet of Things is Wildly Insecure – And Often Unpatchable. WIRED. Retrieved from https://www.wired.com/2014/01/theres-no-good-way-to-patch-the-internet-of-things-and-thats-a-huge-problem/

[20] The Economist. (2017). Why Everything is Hackable. The Economist (US). Retrieved from https://www.economist.com/news/science-and-technology/21720268-consequences-pile-up-things-are-starting-improve-computer-security

[21] Project Zero. (2018). Reading Privileged Memory with a Side-Channel. Retrieved from https://googleprojectzero.blogspot.com/2018/01/reading-privileged-memory-with-side.html

[22] McLellan, P. (2018). Why You Shouldn’t Trust Ken Thompson. Retrieved from https://community.cadence.com/cadence_blogs_8/b/breakfast-bytes/posts/why-you-shouldn-t-trust-ken-thompson

[23] Battle of Liège. (2018). In Wikipedia. Retrieved from https://en.wikipedia.org/wiki/Battle_of_Li%C3%A8ge#The_Fortified_Position_of_Li%C3%A8ge

[24] Joyce, R. (2016). Disrupting Nation State Hackers. Presentation at USENIX Enigma 2016. Retrieved from https://www.usenix.org/conference/enigma2016/conference-program/presentation/joyce

[25] Osinga, F. (2007). Science, Strategy and War: The Strategic Theory of John Boyd. London: Routledge.

[26] Boyd, J. (1986). Patterns of Conflict. Retrieved from http://www.dnipogo.org/boyd/pdf/poc.pdf

[27] Widder, W. (2002). Auftragstaktik and Innere Führung. Military Review. Retrieved from http://www.ramblemuse.com/rmtp/wp-content/uploads/2010/06/Widder_2002_Auftragstaktik_MilRevr.pdf

[28] United States Department of Defense. (2018). Summary of the 2018 National Defense Strategy of the United States of America. Retrieved from https://www.defense.gov/Portals/1/Documents/pubs/2018-National-Defense-Strategy-Summary.pdf

[29] Wirearchy. (2018). Retrieved from http://wirearchy.com/what-is-wirearchy/

[30] Larew, K. (2005). “From Pigeons to Crystals: The Development of Radio Communication in U.S. Army Tanks in World War II.” The Historian.

[31] Sinclair, U. (1994). I, Candidate for Governor: And How I Got Licked. Berkeley: University of California Press.
