“The enemies of liberal democracy hack our feelings of fear and hate and vanity, and then use these feelings to polarize and destroy.” – Yuval Noah Harari
Most of the discussion of the report of the Office of the Special Counsel has, understandably, focused on its political implications; interestingly, for a report prompted by foreign intelligence operations, it is not really a counterintelligence report. But Volume I does identify some interesting points about uses of networked information tools for expression that, in many respects, constitute hostile military/intelligence activity, or at least uses that blur the line between the military activity of a foreign government and the private activity of foreign persons.
The Internet, when considered under the “place you go” metaphor, has been likened to the Wild West, or an anti-sovereign kind of “place.” But recent events illustrate the value of considering information technology as tools rather than a “place” or even a medium. These events include the intelligence activities carried out by Russian forces during the 2016 election, attacks using tools known as Stuxnet and WannaCry, and attacks by Hamas (an attack to which Israel responded with an air strike, marking perhaps the first armed response to a cyberattack). What these events illustrate is that there is no “Wild West” free from the interests or activities of national sovereigns or their militaries. There is only the physical space we naturally occupy and electronic tools to send and receive information, for various purposes including warfare and related intelligence work.
These events also illuminate some important points about, among other things, ongoing challenges with asymmetry in cybersecurity. Asymmetry refers to the fact that attackers need only succeed once, while defenders must succeed at every engagement. With the resources of national militaries behind efforts that already have asymmetric advantages, achieving a desired level of security becomes more difficult and more expensive. And access to information tools that can be used with malice is much simpler than access to physical, chemical or biological weapons.
It’s encouraging that I’ve seen recent usage of the term “cybertools.” I don’t know, unfortunately, who to credit for this coinage, but it has appeared in articles on National Security Agency hacking tools, government technology for cybersecurity, and use of AI and the Internet for military offensive efforts. Considering technologies as “tools” that can be wielded by anyone, military forces included, produces a better mental picture, one more likely to prompt understanding of the appropriate urgency.
A “place” connotes something with boundaries, and possibly with some organizing principles or shared expectations of behavior. “Tools,” in contrast, prompts an understanding that there are no inherent limits on their use other than those imposed by their designers and their users.
The issues concern not just “cyberwarfare,” but also “information warfare” – a battle for dominance, or at least security and stability, among ideas and information. I’m grateful for a Lawfare article sent by a friend that described a “Jesuitical” debate within the Air Force in the 1990s as to whether networked information technologies required a “cyber” strategy or an “information” strategy. That is, should the military focus on the security of private information systems or the security of public information systems?
Hint: it's both, but the Air Force (and others) were slow to catch on to the latter strategy. Enemies, of course, also saw the potential for use of public information systems. And as that article notes, that is the topic of a recent book by Peter Singer and Emerson Brooking, who observed that “social media is being mobilized as an adjunct to kinetic combat.”
To be sure, the article also notes that military and political efforts aimed at disinformation and disruption of public opinion are not new. During the Revolutionary War, Benjamin Franklin created “fake news” intended to undermine British public support for colonial governance. “Disinformation” campaigns were features of Cold War espionage, as well as of fictional treatments of espionage.
In the political context, the election of 1800 between Thomas Jefferson and John Adams was peppered with scandalous and often false rumor-mongering in an attempt to win a battle of information and ideas. More recently, political operative Dick Tuck’s stunts – fake mimeos claiming candidate Barry Goldwater was upset at fluoride in his water – were a fairly benign form of political misinformation campaign, but Tuck’s pushing of the envelope did play a role in encouraging others to take up more aggressive misinformation endeavors.
These political efforts, like the Russian activity in the 2016 election, show the challenges of asymmetry in the information warfare arena. It takes only one person to publish a false narrative that becomes widely spread and changes the collective opinion for the worse. As the saying often attributed to Churchill goes, “a lie gets halfway around the world before the truth has a chance to get its pants on.”
The nature of information warfare also illuminates what may be an under-appreciated fact about intelligence work, which is that much of it presents as ordinary and benign activity. Attending a business conference, touring a factory, supporting a political cause, all of which are valuable (and ordinarily First Amendment protected) activities, can be reconnaissance and/or influence opportunities in an information war. The sentencing memo in the recent case of Russian operative Maria Butina illustrates plainly why ordinary activity can be of counterintelligence concern.
One practical conclusion from this analysis is that the security of our news and media information systems matters as much as the security of personal and commercial information systems. Information warfare shows harms can arise even when there is no unauthorized access, when the tools are used as intended, and when there’s no compromise of user privacy settings. In both domains, the threats are asymmetric, the tools readily available and useful for many productive and positive purposes, and threats are easily disguised as benign, whether in the form of a business conference used for economic espionage or a product order form email used to inject malware.
Fortunately, there are a variety of interesting proposals to respond to information warfare. Social media platforms have experimented with novel ways of “fact checking” – putting information in context as reputable journalists attempt to do. Cass Sunstein proposed application of libel law concepts. Professor Philip Howard is, among others, doing important work to understand the role of automated information tools (e.g., “bots”) in information campaigns.
These are important topics of conversation because there are few easy answers. Networked information tools are still important for those who seek to advance human rights, freedom and democracy, notwithstanding their use against such causes, and because even well-meaning responses to information warfare can threaten the very rights and democratic values they seek to protect.
My own contribution, as I’ve noted in this blog several times, is that the most reliable forms of defense involve critical and informed users. Robust resistance to phishing and social engineering, as well as defense against “fake news” and disinformation campaigns, is unlikely to be fully achieved through refinements to the tools (including information platforms). It will require designers and users, to the extent possible, to acquire greater understanding of themselves as sorters and interpreters of information.
Indeed, ideally, designers and users of tools would develop a stronger sense of themselves as the source of experience and thus of knowledge. Effective responses to cyberattacks and “information warfare” are thus very much the province of psychologists, as well as technologists and legal experts. If there is a territory to explore in addressing issues of information warfare, let it be the territory of the mind.
Licensed from https://cyberlaw.stanford.edu/blog/2019/06/tool-without-handle-guerilla-information-warfare