


The Roots of the United States’ Cyber (In)Security*

Christopher Fuller

In June 1983, a Harvard study on the impact of computers upon American political campaigns concluded that the technology’s penetration into the world of elections had brought significant advantages for candidates. Alongside their use for managing information bases, assisting in accounting and budgeting, and lightening the load of basic office management, the report identified “intra-campaign communication” and “nearly instantaneous [ . . . ] transfer of information among the widely dispersed staff” as a particular advantage.[1] For the first time, press releases, speech drafts, and schedules could be composed, revised, accessed, and printed via remote terminals. While the report identified that the Republican National Committee (RNC) held a significant advantage over their Democratic National Committee (DNC) opponents in the adoption and application of this new approach, it noted the Democrats’ intense efforts to catch up and duplicate the Republicans, revealing the birth of computer-driven campaign management.

Barack Obama’s 2008 election victory marked the maturation of the emergent techniques discussed in the 1983 report, as well as the DNC’s eclipsing of their Republican rival’s use of digital campaigning. The continued development of computational power, wireless networks, and the rapidly expanding commercial market for consumer data made it increasingly possible for tech-savvy campaigns to forecast which citizens would be likely supporters, and to target them.[2] For-profit companies, such as the California-based Catalist, had orchestrated huge data-driven psychographic voter targeting operations on behalf of the DNC and Obama campaign, helping register new supporters, raise vast sums via small online donations, and bring out the vote come election day.[3] In addition to demonstrating how computers could empower campaigns, the 2008 election hinted at a new danger their embedded use could pose. The 1983 Harvard study had made no mention of the risk of digital information leakage.[4] The technology discussed was new, and the attendant naivety surrounding its security flaws meant the authors overlooked such a possibility. A quarter of a century later, however, these risks were exposed when the FBI informed the Obama and McCain campaigns that a “foreign entity,” suspected to be of Chinese or Russian origin, had compromised their systems and stolen a “serious amount of files,” likely seeking to inform future negotiations with the next leader of the United States.[5] This was just a prelude to what was to follow.

In April 2016, staff in the DNC’s IT department identified unusual activity on their system. The Democrats employed CrowdStrike—a private cybersecurity firm—to determine if there had been a breach of their computer networks, and if so, how deep it went. The investigation revealed the DNC’s whole system had been compromised, and that data on contributors, opposition research, and day-to-day inter-office emails had been stolen. Worse, the analysts assessed the hackers had compromised the system in 2015, and had been engaged in the systematic theft of data for over a year.[6] On July 28, hackers attacked the Democratic Congressional Campaign Committee’s servers and successfully stole personal information from the party’s donor and supporter databases. The next day the Clinton campaign’s servers were attacked, with the hackers stealing over 20,000 emails from John Podesta, the campaign chairman. Former FBI agent and security specialist Ali Soufan has described what followed as “the overt and weaponized use” of stolen data. Published online through a combination of WikiLeaks, a site called DCLeaks, and a WordPress blog claiming to be that of a lone hacker going by the handle Guccifer 2.0, the leaks laid bare the inner workings of the DNC and Clinton campaign to devastating effect. They fomented civil war among Democrats over the committee’s treatment of the Sanders campaign; undermined confidence in the Clinton campaign’s competence; and provided the rival Trump campaign with an endless flow of attack points, drawn from sensitive internal discussions between Clinton and her team.[7]

On October 7, 2016, the Department of Homeland Security and Office of the Director of National Intelligence (ODNI) issued a joint statement declaring that the Russian Government had directed the cyberattacks, with the express intention of interfering with the 2016 presidential election.[8] The statement also warned of probes into election-related systems in a number of states, attacks the FBI had described as “highly sophisticated,” rated “eight on a scale of ten” in severity, and attributed to the Russian state.[9] Three months later, following Clinton’s surprise electoral defeat, the ODNI released the declassified portion of a joint report, compiled by the CIA, FBI, and NSA. It assessed with “high confidence” that Russia’s President Vladimir Putin had ordered the online influence campaign, the goal of which had been to “undermine public faith in the US democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency.”[10] In response, members of Congress called for the formation of a bipartisan select committee to “devise comprehensive solutions to deter and defend against further cyberattacks.”[11] This was not, however, the first time calls had been raised to improve the security of America’s computer networks, and, based upon the nation’s poor grasp of its own cyber history, it will not be the last. While professional and well resourced, the DNC’s attackers had not been invisible. Early in the fall of 2015, the National Security Agency (NSA) had detected the familiar signature of a foreign entity attempting to gain access to the DNC’s servers. Yet despite the vast protective power that Fort Meade boasted, the committee was a private political organization, and thus, based upon the result of a bruising legislative defeat Congress had inflicted upon the Reagan administration in the mid-1980s concerning the Pentagon’s authority to intervene in civilian computer networks, its servers were out of bounds for the NSA. Accordingly, Fort Meade’s engineers provided the DNC’s staff with all they were entitled to give—a heads-up and some advice on the potential threat.[12]

“Anyone skeptical of the growing role cybercapacity plays in the conduct of foreign affairs,” noted Jared Cohen, president of the think tank attached to Google’s holding company Alphabet, “just needs to look at the most recent presidential election, in which cyberespionage, leaks, and data security were central issues.”[13] Jason Healey, director of the Atlantic Council’s Cyber Statecraft Initiative, spelled out the problem with putting such events into a useful context, however, warning “the U.S. government and military have almost completely ignored cyber history.” A combination of factors has contributed to this: the regular influx of new entrants into cyber-related roles, whose high-tech, future-facing posture means they have little contextual knowledge of the past, and even less of an inclination to seek it. In addition, the non-intuitive nature of the cyber domain has created mutual misunderstanding between what Healey describes as the cyber “geeks” and policy “wonks.” Nevertheless, to accuse the government of ignoring cyber history presumes the opportunity to access it in the first place. Detailed accounts of America’s conduct in cyberspace are only just starting to emerge, and even now authors are constrained by the detrimental effects of excessive government secrecy in this field. Much of the official material is classified, leaving scholars to assemble their work from media stories, open sources, and frequently rejected FOIA requests.[14] Thus, policymakers have been denied important historical insights that can chart the path of America’s current cyber issues. This article begins to address this deficit by exploring how cyber security emerged as a policy issue, the Reagan administration’s efforts to utilize the NSA to protect domestic computer networks in response, and how broad opposition to this approach eventually led to Fort Meade’s exclusion from private domestic computer networks. In doing so, this article reveals that the enduring reason behind America’s seemingly new cyber insecurity is not technical but political, spawned from the perpetual debate over how best to balance national security with the freedoms necessary to preserve the world’s leading democracy. By contextualizing the cyber security debate, this article ensures contemporary policymakers are better placed to approach the issue, and seek a political solution which takes into account the fact that the NSA is quite rightly constrained by the very democratic ideals it was created to protect.

The emergence of computer security as a policy issue

Reagan came to office at a time when the importance of computer networks, and of what would come to be called cyberspace, was fully emerging. Investment by American government departments, the military, and private business ensured networked machines were no longer the isolated pursuit of the researchers and computer scientists who first developed the technology, but mainstream systems embedded in the day-to-day running of the nation. Four years earlier the U.S. government had adopted IBM’s Data Encryption Standard (DES) as the Federal Information Processing Standard for encryption of sensitive government and commercial computer data. It was the first encryption algorithm approved by the government for public disclosure, and industries such as financial services quickly utilized it as their business moved online. In 1982, the engineers behind the ARPANET, the packet-switching computer network first funded by the Department of Defense’s (DoD) Advanced Research Projects Agency (ARPA) in 1967, introduced the Internet Protocol Suite (TCP/IP) as the network’s standard protocol, establishing the technological foundations of the Internet.[15] The following year saw MILNET—the portion of ARPANET used for unclassified DoD traffic—separated and spun out to become the Defense Data Network. This worldwide set of networks ran at different security levels and carried all of the DoD’s digital communications until its eventual replacement by the Non-classified Internet Protocol Router Network (NIPRNET) and the more secure Secret Internet Protocol Router Network (SIPRNET) in 1995. The academic domain of the National Science Foundation Network (NSFNET) was spun out two years after MILNET, enabling research partnerships across the United States between publicly funded institutions and private industry. The NSF’s creation of the Computer Science Network, open to researchers from any institution willing to pay the annual subscription, resulted in further online growth as tens of thousands of new users signed up.[16]

The willingness of users to pay for access to computer networks saw the Internet enter its commercial phase by the mid-1980s. Budget limitations, combined with the Reagan administration’s neoliberal preference for privatization, impelled the government to withdraw from participation in the Internet’s structure, leaving private telecoms companies and public research institutions to collaborate on future expansion. Rapid proliferation of civilian computer networks followed, with corporations such as IBM and Sprint working alongside the non-profit Merit Network and the State of Michigan to operate the NSFNET, one of the major backbones of the Internet.[17] In 1988, NSFNET was upgraded to handle more than 75 million packets of data per day. The same year, AT&T, British Telecom, and France Télécom laid the first transatlantic fiber optic cable, capable of handling 40,000 telephone calls simultaneously. Continued innovation related to the speed and capacity of the Internet yielded further expansion, as smaller TCP/IP networks developed by research institutions across the world merged with NSFNET. As it stretched over borders, the architecture of the Internet as we know it today—a globe-spanning labyrinth of interconnected networks—emerged, with 159,000 hosts by 1989.[18] That same year, IBM funded a network link between Cornell University and the European Organization for Nuclear Research (CERN) in Geneva, which the computer scientist Tim Berners-Lee used in August 1991 to demonstrate the program he had created allowing users to access, link, and create communications in a single global web of information.[19] Superimposed on the Internet, the program’s protocols marked the birth of the World Wide Web, the culmination of collaboration between different strands of private and public sector innovation.[20]

Despite the rapid proliferation of America’s computer networks and formal adoption of encryption protocols, the 1980s began with Reagan inheriting a cryptologic sector in perilous shape. High inflation during the 1970s caused sustained real-terms funding cuts, leading officials to reduce the workforce charged with securing this emergent domain from 88,600 in 1970 to just 41,000 by 1979.[21] The Reagan administration quickly addressed the decline in cryptologic specialists, but not in response to the vulnerability of America’s increasingly networked society. Reagan, like many policymakers, had little interest in or understanding of the technical details of computer networks. He was, however, riveted by the possibilities that the NSA’s new technologies offered in terms of combating the Soviets. Keen to reverse their financial fortunes and cash in on the arms buildup Reagan had campaigned on, staff from Fort Meade briefed the president-elect on their recently developed electronic countermeasures. These tools, collectively referred to as counter-command and control warfare, could intercept, disrupt, or even sever an enemy’s communications, offering a strategic edge by degrading their ability to wage war. Theoretically, this new technology presented the opportunity for a quick and decisive victory should conflict break out with the USSR. The ability to break the superpower stalemate appealed to the Cold War hawk, while the strategic possibilities pleased his new Secretary of Defense, Caspar Weinberger. The exact sum invested in the NSA is classified, but total intelligence funding grew by 125 percent in real terms from 1980 to 1989.[22] The dramatic increase in staffing at Fort Meade provides some indication of the significant portion of that sum the agency secured, with its population rising from 19,018 in 1983 to 26,679 by 1990.[23]

While the Reagan administration was promptly sold on the utility of computer networks for offensive purposes, its awareness of the less glamorous but equally important job of protecting America’s systems lagged. For as long as there had been networked computers, engineers had warned of the vulnerabilities such technology introduced. At the April 1967 Joint Computer Conference in New York, Willis Ware, head of computer science at RAND and a pioneer of the technology since the 1940s, presented a paper titled “Security and Privacy in Computer Systems.” It warned of the risk of “information leakage” of classified material from new “on-line” networks like the ARPANET through accidents or espionage (figure 1).[24] This prompted ARPA to fund a Defense Science Board (DSB) task force on computer security, chaired by the RAND engineer himself. Published in February 1970, the Ware Report, the first formal government attempt to address the fundamental security problems facing a networked world, highlighted themes still recognizable today. It warned that hackers would seek to “obtain information, cause the system to operate to the advantage of the threatening party or manipulate the system to render it unreliable or unusable.” In an attached memorandum, Ware warned that “providing satisfactory security controls in a computer system” was in and of itself “a system design problem.” A combination of hardware, software, communication, physical, personnel, and administrative-procedural safeguards would be required for comprehensive network security.[25] The report concluded that, given the proliferation of networked computers, the task of securing computer systems would quickly transcend the DoD, meaning the computing industry would need to take responsibility for supplying hardware and software with appropriate safeguards. As a result, Ware urged the Pentagon to circulate the report as widely as possible.[26] The DoD’s obsession with secrecy in the computing field, however, saw the report remain classified for a decade. This self-defeating act denied those responsible for overseeing networks its valuable recommendations at the exact time when new standards and guidelines were required.[27]

Figure 1: Willis Ware, “Security and Privacy in Computer Systems,” RAND, April 1967.

The warnings contained in the Ware Report were not entirely overlooked, and as the 1970s drew to a close, a small group deep within the bureaucracy of the government sought to address the looming threat of computer insecurity. The Director of Information Systems in the Office of the Secretary of Defense, Stephen Walker, a member of the team that developed the ARPANET, reached out to the director of the NSA (DIRNSA), Bobby Inman.[28] Walker believed that Fort Meade, as the home of America’s best-resourced computer specialists, should be responsible for securing the federal government’s networks. Inman agreed, and appointed his science and technology chief, George Cotter, to head up the matter. Cotter and Walker worked closely to create the DoD’s Computer Security Center (CSC), formally established on January 1, 1981. Originally conceived as a separate entity with its own staff working directly for Cotter, the fledgling center instead operated out of borrowed space on Fort Meade’s campus after budget transfers prompted by the Reagan administration’s focus upon the NSA’s offensive capabilities. Despite receiving an upgrade to the Computer Security Evaluation Center (CSEC) by DoD directive in 1982, over-zealous secrecy, policymaker disinterest, and an eclipsing focus on the strategically attractive benefits of cyber offense ensured that the Reagan administration overlooked the issue of cyber defense, even as its agencies were purchasing computers and linking them to networks at a tremendous pace.[29]

The vulnerability of America’s computer networks came to the public’s attention in August 1983, when six Milwaukee youths aged between 16 and 22 made headlines after being arrested by the FBI for having orchestrated a sustained hacking campaign. The friends, who met as members of the local Explorer Scouts, had formed a hacking collective named the 414s after the city’s telephone area code—a parody of Milwaukee’s inner-city gangs who drew their names from the street numbers of their turf, and a nod to the group’s hacking tool of choice—the modem. Operating from their suburban houses on their parents’ home computers, the 414s successfully broke into more than 60 government and business computers across the United States and Canada, including the Los Alamos National Laboratory. The FBI eventually tracked and arrested the hackers by using a wiretap to trace the addresses associated with the offending modems, following a tip-off from the administrator of a system they had breached. The young men, whose digital trespassing was motivated by curiosity rather than malicious intent, were not prosecuted. However, the ease with which suburban youths working with rudimentary skills and inexpensive equipment had exploited security flaws in major computer networks was a wakeup call to U.S. policymakers. Congress was shocked into action, inviting Neal Patrick, a 17-year-old member of the 414s, to testify before the House in September 1983 (figure 2).[30] In a separate session, Deputy Assistant FBI Director Floyd Clarke told a House subcommittee that a computer can be used much like “a gun, a knife or a forger’s pen.”[31] Seeking to establish a deterrent, Congress enacted the first federal computer crime statute in March 1984. The bill linked hacking to America’s national security, making the unauthorized acquisition and disclosure of restricted data on national defense or foreign relations a criminal offense. It also outlawed attempts to access financial records, and the hacking of any computer used for government business.[32]

Figure 2: Left, Neal Patrick, courtesy of Patrick Family Archive; right, Neal Patrick testifying before the House, September 1983, courtesy of The 414s: The Original Teenage Hackers, dir. Michael T. Vollmann, CNN Films, January 2013.

The DoD’s shift of resources to its new MILNET coincided with the exposure of the security flaws across America’s networks. An internal security review of the estimated 16,000 computers across the department concluded that the majority processed information judged to be “of value to an adversary.” Furthermore, many of the DoD’s computers were networked with consoles at the 13,000 cleared defense industrial firms.[33] In the wake of the disclosures, anonymous Pentagon officials were quoted in the New York Times as being “increasingly concerned” about the “future security” of these networks. In the same piece, the Times cited an FBI special agent who monitored computer bulletin boards: “Some hackers spend 12 hours a day trying to break into computers at the CIA or the Pentagon,” he revealed.[34] With confidence among senior staff and policymakers shaken, Weinberger ordered a high-level investigation of computer security at all military and intelligence installations.[35] Assistant Secretary of Defense for Command, Control, Communications, and Intelligence (ASD(C3I)) Donald C. Latham was at the forefront of the review. Rather than downplaying the negative headlines, the former chief NSA engineer took the opportunity to talk up the risks posed: “Over the past decade there has been an incredible growth in the amount of classified information stored in computers,” he told the press. “There’ll be more of these hackers, and we’re going to have to deal with their increasing sophistication.”[36] Latham’s motivation in raising the stakes derived from his desire to win support for a new cyber security directive the president had, in a visionary move, personally ordered two months prior to the 414 scandal.[37]

Frances FitzGerald has argued that Reagan was partially inspired to pursue his Strategic Defense Initiative (SDI) by the plots of two Hollywood films.[38] Similarly, the motivation for the former actor to address America’s cyber insecurity has its roots in a movie. In 1980, Lawrence Lasker and Walter Parkes, two former Yale classmates in their late twenties, were undertaking research for their first screenplay, the techno-thriller WarGames, in which a high-school hacker nearly triggers nuclear war with the Soviet Union through an accidental intrusion into NORAD’s computers. Seeking to ensure the plot was at least somewhat plausible, Lasker, who lived just a few blocks from RAND’s Santa Monica offices, contacted the think tank’s public affairs office to see if he could question one of their computer scientists. Much to his surprise, the young screenwriter was put in touch with RAND’s deputy vice president, Willis Ware, who, sensing an opportunity, invited Lasker and Parkes in for a candid conversation. The veteran computer scientist allayed the writers’ concerns that the plot was too far-fetched, informing them that ports to DoD computers were frequently left open by officers who wanted to work from home, and that “anyone could get in, if the right number was dialed.”[39] Ware’s blunt honesty about the vulnerability of even the most sensitive military networks gave the writers the confidence to push ahead with their project.

When the film was completed three years later, Lasker called in a family favor. The screenwriter’s parents were in the movie business, and counted the Reagans as friends. Lasker exploited the connection to get the finished thriller screened at Camp David on June 4, 1983, the opening weekend of its release.[40] The screenwriter was not the only one exploiting the connection: Ware was a long-time member of the National Security Agency Scientific Advisory Board (NSASAB). The board’s ten-member panel, formed shortly after the establishment of the agency, was composed of outstanding technical consultants, recruited from university campuses, corporate research labs, and America’s burgeoning think tanks. Its purpose was to provide what the NSA described as “a valuable source of advice and assistance in solving special problems in the cryptologic field.”[41] Two decades after joining the board, Ware sought to leverage Hollywood’s dramatization of a hacking incident to grab policymakers’ attention and direct it toward the overlooked security flaws he had first warned of 16 years prior.[42] The unorthodox approach worked. The film—a departure from the light fare of comedies, adventures, and musicals Ronald and Nancy typically enjoyed together—had an immediate impact. Three days after the screening, Reagan interrupted a meeting with his senior national security staff, scheduled to discuss forthcoming nuclear arms negotiations, to describe the film’s plot before asking General Vessey, the chairman of the Joint Chiefs of Staff, whether something like that could really happen. Vessey told the president he would look into it, and tasked Latham, the Pentagon’s liaison with the NSA, with investigating. The assistant secretary, a close confidant of Ware who had served on the NSASAB with him, relayed the computer scientist’s warnings. One week after Reagan’s question, Vessey delivered an answer that prompted the Reagan administration into belated action on computer security. “Mr. President,” the general announced, “the problem is much worse than you think.”[43]

Figure 3: President Reagan and Nancy Reagan watching a film. Picture number C33562-32, courtesy of Ronald Reagan Presidential Library.

As the Pentagon staffer responsible for the management and oversight of all DoD information technology, the chief liaison with the NSA, and a former engineer at the secretive agency, Latham (figure 4) seemed an obvious choice to work up the administration’s new directive on computer security. His experience meant he was fully aware of the covert hacking operations the NSA had been conducting against the Soviets and Chinese, and the security flaws exploited to undertake them. Furthermore, since taking up his post as the Pentagon’s ASD(C3I) in July 1981, Latham had been pushing against what he described as “a lack of national will” to secure America’s information systems.[44] Weinberger promptly instructed him to create a comprehensive plan to address the concerns Reagan had voiced, an order that resulted in the creation of National Security Decision Directive 145 (NSDD 145). Unfortunately, rather than ushering in a period of greater security, the introduction of the directive marked the beginning of a sustained clash between the imperatives of computer security, government surveillance, and privacy. Latham’s time at the NSA gave him important insights into how the most enigmatic federal agency conducted its operations, but it also shaped his views on what the government’s response ought to be in a way that made NSDD 145’s recommendations extremely controversial, provocative, and ultimately untenable.

Prior to NSDD 145, Carter’s 1979 Presidential Directive (PD 24) had vested authority for protecting classified telecommunications in the DoD, while the Commerce Department safeguarded unclassified communications.[45] NSDD 145 signaled a major shift, extending the policy to cover computer as well as telecommunications systems, and centralizing control over both classified and unclassified federal systems under the NSA.[46] The logic behind Latham’s proposed reform was that the best way to defend America’s networks was to understand an adversary’s plans and methods of attack—a process that required getting inside their networks. Thus, as the NSA expanded its capabilities to hack foreign networks, it also enhanced its defensive proficiency. To capitalize upon this, the directive called for the transformation of the NSA’s previously under-resourced CSEC into the National Computer Security Center (NCSC), with the new hub serving as a fusion center bringing together staff whose work entailed exploiting foreign systems and engineers responsible for defending America’s networks.[47] In seeking to meld what eventually became categorized as Computer Network Attack (CNA) and Computer Network Defense (CND), Latham proved himself far-sighted.[48] It would not be until the mid-2000s that combined offensive/defensive measures in cyberspace would officially be recognized as a new category of operations, Computer Network Exploitation (CNE), but in employing the NSA as the primary defender of all federal computer networks, NSDD 145 revealed an early adoption of this approach.[49]

In addition to harnessing the NSA’s technical skills, Latham’s decision to place Fort Meade’s agents on the front lines of cyber defense revealed a further calculation. Fending off hackers was—and still is—an intrinsically difficult task. “Can my system really be penetrated?” asked an anonymous NSA author rhetorically in a March 1979 issue of an internal agency journal. “The inevitable answer is ‘Yes,’” continued the author. “Any computer system can be penetrated by a knowledgeable user.”[50] A primary reason for this is that the inherent nature of cyberspace favors the attacker. The domain has no formal boundaries. As the 414s’ young members had revealed, all a hacker requires is a computer with an Internet connection to obtain access to cyberspace, from where they can attempt to penetrate any network they wish. Much like terrorism, the balance of probability puts the defender, who must maintain the constant security of their network, at a disadvantage to the attacker, who only needs to be lucky once.[51] Hacking is, as the computer engineer Clifford Stoll warned following KGB penetration of a network he oversaw, a very “cost-efficient” form of espionage, “insulated from risks of internationally embarrassing incidents.”[52] By charging the NSA with the security of America’s networks, Latham sought to adjust the risk model. Adversaries would probe America’s networks with the knowledge that, if discovered, they themselves could be hacked by the NSA’s watchdogs. Like a less destructive form of mutually assured destruction, the risk of a counter-hack was intended to add an additional layer of security, raising the stakes and, in theory, deterring hackers.

The fact that the management of North America’s online networks spreads across many federal agencies, research institutions, and private sector providers adds a further layer of complexity to the task of mustering a coherent defense against cyber attacks. At the time of NSDD 145’s creation, the private sector controlled between 95 and 98 percent of the DoD’s online networks in the United States.[53] This interdependent relationship has made it necessary for policymakers to ensure that government agencies work in concert with private-sector organizations, despite often-divergent goals. Robert C. Hanyok, an NSA employee who spent two years in Fort Meade’s S Organization evaluating the security of various computer networks, shared his frustration with colleagues in a 1982 internal article titled “Some Reflections on the Reality of Computer Security.” IT officials would typically insist that their systems were “ironclad and invulnerable,” noted Hanyok, but “the reality,” he lamented, was that “computer security measures are often undercut by user practices and less-than-adequate implementation.” Careless use of shared consoles, failure to update protection protocols, simplistic or compromised passwords, and dozens of other errors, the engineer reflected, often caused security flaws.[54] Rather than being a case of incompetence, however, many of the grievances Hanyok cited about the private networks upon which the government relied were actually the result of conscious design decisions. In a discussion about data security at the Center for Information Policy Research in spring 1986, Robert Conley, the former deputy assistant secretary for advanced technology and analysis at the Department of the Treasury from 1983 to 1985 and an 18-year NSA veteran, provided further context: “Today’s computers are not designed for security.” Instead, manufacturers had concentrated on increasing the machines’ data storage capacity, and speed of access to that data. “The faster the retrieval a machine has,” Conley noted, the “easier [it is] to sell.” Consequently, many of the computers used by the private sector and research institutions in the mid-1980s did not even include security options.[55]

In order to intervene in the market-driven insecurity created by the computing industry’s focus upon speed over security, NSDD 145 formed the National Telecommunications and Information Systems Security Committee (NTISSC). Chaired by the Pentagon’s ASD(C3I) (Latham himself), the committee’s fourteen-member board included eight other senior DoD personnel and a permanent secretariat composed of NSA staff. This Pentagon-dominated body was responsible for “develop[ing] operating policies” on “technical matters” involving computer security. The committee’s remit extended beyond government, empowering it to review private sector systems that handled “sensitive non-governmental information” with the purpose of “assisting [ . . . ] in applying security measures.”[56] The directive defined sensitive information as anything that, while “unclassified in isolation,” could reveal “highly classified and other sensitive information when taken in aggregate.”[57] When questioned by Congress in June 1985, Latham explained that under NSDD 145 “sensitive information might include anything from crop forecasts to personnel records.”[58] Testifying again three months later, he seemed to scale back the scope of the NSA’s reach, stating specifically that non-national security-related information was not included in the purview of the NTISSC. In October 1986, however, an NSA memorandum, NTISSP No. 2, reaffirmed the agency’s authority to safeguard data “related but not limited to a wide range of government or government-derived economic, human, financial, industrial, agricultural, technological, and law enforcement information.” In addition, it extended the purview of Fort Meade’s staff to include oversight of “personal or commercial proprietary information provided to the U.S. government by its citizens.”[59] This gave the NSA near free rein to intervene in the security policies and technology adoption of any private sector organization that handled data deemed to be sensitive under NSDD 145’s vague, sweeping definition.[60]

Finally, Latham’s directive also sought to boost the NSA’s political influence by embedding computer security into the policymaking process. In order to ensure the matter would be a permanent fixture of White House security discussions, NSDD 145 established a senior-level steering group, chaired by the president’s national security advisor, to ensure the National Security Council was always kept “in the loop” on all computer security developments. In addition, the director of science and technology policy—the president’s most senior science advisor—was to chair an NSC special coordinating subcommittee on telecommunications protection. The reform also bestowed the role of “National Manager for Telecommunications and Automated Information Systems Security” upon the DIRNSA, a position which conferred the authority to “review and approve all standards, techniques, systems, and equipments for telecommunications,” significantly increasing the influence the director could bring to bear in shaping national computing policy.[61]

The Battle over NSDD 145

Officially introduced on September 17, 1984, as the “National Policy on Telecommunications and Automated Information Systems Security,” NSDD 145 was, in the words of the NSA’s own historical account, a “lightning rod for dissent both within Fort Meade and in the outside world.”[62] Alarmed spies balked at the idea of having to divert time from what they saw as their organization’s core business of espionage to becoming caretakers of America’s domestic systems. Meanwhile, concerned members of Congress questioned the wisdom of allowing an intelligence agency the primary role in securing federal and private sector networks. Within months of NSDD 145’s introduction, Representative Jack Brooks (D-TX), chair of the House Government Operations Committee and an active campaigner on information security and privacy matters since the 1960s, criticized the directive when he learned that under its auspices, the NSA was investigating a computer program widely used in both local and federal elections the previous year. Because it was an online system linked directly to the integrity of the political process, the NTISSC had deemed the software to be exactly the sort of critical system NSDD 145 was established to protect. Brooks countered that through its investigation, the agency had obtained detailed knowledge of how the software worked, and thus an understanding of how to exploit it. “In my view,” the lawmaker warned, “this is an unprecedented and ill-advised expansion of the military’s influence in our society.” “I’m really concerned about the NSA’s involvement,” echoed Congressman Dan Glickman (D-KS), the chairman of the House Science and Technology subcommittee concerned with computer security. He noted, “The computer systems used by counties to collect and process votes have nothing to do with national security.”[63]

The concerns voiced by members of Congress over the NSA’s expanded role in domestic affairs reflected the end of what Loch K. Johnson dubbed the “Era of Trust” between Congress and the intelligence community, brought on by Watergate and disclosures during the Church and Pike Committees’ mid-1970s hearings.[64] Initially established to investigate potential domestic abuses by the CIA, the committees quickly expanded their focus as further dubious practices by the wider intelligence community came to light. This included two NSA operations—MINARET and SHAMROCK—in which the agency had, respectively, intercepted and monitored the overseas phone calls of American citizens and collaborated with international cable companies to read citizens’ private messages.[65] A Justice Department investigation revealed efforts by Fort Meade’s agents to limit the scope of its surveillance, such as ensuring it was only intercepting “foreign communications,” i.e., those having at least one terminal on foreign soil, and careful filtering of telegrams and telex messages via watch list criteria.[66] Despite this, the investigations still revealed the NSA had listened to the private conversations of some 1,600 Americans, and was intercepting approximately 150,000 telegram and telex messages per month. While a fraction of the millions of calls placed, twenty-four million international telegrams, and fifty million telex messages that entered, left, and transited the United States each year, it was still a breach of the agency’s charter, which barred domestic surveillance. Of greater concern was the fact that the vast majority of the citizens caught up in this electronic dragnet were not readily identifiable as reasonable sources of foreign intelligence information, appeared to pose little threat to national security, and were not alleged to be involved in any criminal activity.[67]

Foreshadowing congressional concerns over NSDD 145, the Church Committee’s final report had warned that the “NSA’s potential to violate the privacy of American citizens was unmatched by any other intelligence agency.”[68] Speaking on NBC’s “Meet the Press” in August 1975, Senator Church addressed the double-edged nature of the agency, whose surveillance capabilities made it a vital tool in ensuring America’s national security, but which at the same time “could be turned around on the American people and no American would have any privacy left.” “If a dictator ever took charge of this country,” Church pontificated, channeling a post-Watergate tone, “the technological capacity that the intelligence community has given the government could enable it to impose total tyranny.”[69] The root of this danger and the cause of the NSA’s abuses, the Church Committee concluded, was the unsettled nature of the law relating to the rapidly evolving field of communications and privacy, and the subsequent absence of clear congressional or judicial guidelines. In reacting to what Latham described as “the information explosion” of the 1980s, the lesson Congress took from the Church Committee’s findings was that lawmakers needed to guard against the inevitable expansion of the NSA’s role in the face of ever-advancing technological change.

In response to the Church and Pike investigations, Congress initially reasserted its authority over the intelligence community, imposing new restraints on covert actions and drawing the oversight committees more deeply into the formulation of intelligence policy.[70] Nevertheless, upon the introduction of NSDD 145 some in Congress, such as Brooks and Glickman, already felt that the intelligence community, boosted by the Reagan administration’s determination to “unleash” its capabilities, was reasserting its independence.[71] In 1978, Congress had passed the Foreign Intelligence Surveillance Act (FISA), establishing the highly classified Foreign Intelligence Surveillance Court, sealed away behind a cipher-locked door on the top floor of the Justice Department. The court was to act as a check upon the sort of operations that had led to the NSA spying on its own citizens. Yet the judges had not denied even one of the 2,606 electronic surveillance applications requested by the government between May 1979 and December 1984, leading critics to accuse the court of being little more than a rubber stamp. What is more, by the time Latham sought to increase the NSA’s domestic role, the pace of surveillance requests and thus the scale of the government’s spying had increased dramatically, from 199 in 1979 to 635 in 1984.[72]

In addition to utilizing the loopholes and vague wording of the FISA statute, the Reagan administration overturned a number of post-investigation restrictions. On February 18, 1976, President Ford issued Executive Order 11905, boosting congressional oversight and limiting the most controversial activities of the intelligence community, such as assassination.[73] Two years later, Carter—who had referred to the CIA as one of three “national disgraces” (along with Vietnam and Watergate)—collaborated with Congress to further limit the intelligence community’s freedom to act covertly, signing Executive Order 12036 on January 10, 1978.[74] The Reagan administration, facing an unprecedented rise in terrorism-related threats, sought to remove these restrictions, which, according to Reagan’s director of central intelligence, William Casey, had brought covert operations “screeching to a halt.”[75] A working group chaired by CIA (and former NSA) general counsel Daniel B. Silver began drafting Executive Order 12333 (EO 12333), simply titled “United States Intelligence Activities.” The leaking of an early copy in March 1981 exposed the political sensitivity of the matter, with the government facing a storm of protest over its alleged intention to authorize intrusive intelligence collection techniques.[76] Nevertheless, after two more unsuccessful attempts, Reagan promulgated a modified EO 12333 on December 4, 1981. Primarily focused upon expanding CIA activities, sections of the order also related to the NSA. It placed the power to authorize physical or electronic surveillance, both “within the United States or against a United States person abroad,” in the hands of the Attorney General, providing he or she found “probable cause” to believe that the penetration would be directed against a “foreign power or an agent of a foreign power.” Furthermore, the order authorized the NSA to lend its full cryptanalytic support to any department or agency in the federal government and, “when lives are endangered,” to police departments.[77] So great was the agency’s empowerment to function domestically that, following Edward Snowden’s disclosure of NSA surveillance programs in 2013, John Napier Tye, an ex-section chief for Internet freedom in the State Department, wrote an op-ed identifying EO 12333 as the primary tool underwriting the surveillance of American citizens.[78]

In the context of these events, it is easy to see how some lawmakers interpreted NSDD 145 as a heavy-handed power grab by the intelligence community. But suspicion about NSA motivations was not confined to Congress. Many in the private sector had a lingering sense that the agency was seeking new ways to penetrate their communications and control the development of technology. In 1976, the National Bureau of Standards (NBS) consulted the NSA over the decision to adopt IBM’s encryption algorithm. Shortly afterward, as details emerged that IBM had altered the algorithm at Fort Meade’s request, rival firms accused the agency of deliberately undermining their bids. In the wake of the Church Committee’s revelations, competitors alleged that the NSA had promoted IBM’s algorithm over others because Fort Meade’s engineers had installed a “trapdoor” that they could use to eavesdrop on American citizens. A subsequent investigation by the Senate Select Committee on Intelligence found no evidence of wrongdoing. The committee’s report did, however, touch on the “potential capriciousness which is possible in ambiguous and uncertain situations,” such as when an agency predominantly focused upon espionage, with a track record of spying upon Americans, is involved in the protection of private communications. It recommended that the appropriate congressional committees needed to provide greater transparency by “clarifying the role which the Federal Government should have in policies affecting public cryptology,” something the classified discussions of NSDD 145’s newly established NTISSC did not address.[79]

While the Senate may have rejected accusations of NSA wrongdoing, the agency’s involvement in awarding IBM a lucrative government contract demonstrated the power Fort Meade’s engineers could wield. In the years preceding NSDD 145, the NSA increasingly began to exert its influence in this economic manner as its computer scientists grew progressively more concerned over the direction of the private sector’s interest and investment in computing.[80] On October 25, 1982, the DoD unveiled the NSA’s newest addition: the Computer Security Evaluation Center (CSEC), an evolution of the CSC established the previous year and a precursor to the NCSC formed by NSDD 145. CSEC’s purpose was to analyze computer hardware and software produced by private industry and rate products in terms of their potential vulnerability to penetration.[81] Although submission for evaluation was voluntary, Inman’s successor as DIRNSA, Lieutenant General Lincoln D. Faurer, had left little doubt that manufacturers ignored the center’s inspections at their peril. Addressing a meeting of the Institute of Electrical and Electronics Engineers in 1981, Faurer explained that the NSA would “significantly reward those DoD suppliers who produce the computer security products that we need.”[82] The NSA’s empowerment through NSDD 145 to review any private sector system deemed by the NTISSC to handle sensitive information was the formalization of the agency’s previous carrot-and-stick approach. The broad authority to assess all manner of systems employed in both the government and private sectors provided Fort Meade’s engineers the opportunity to address their frustrations with the computing industry. Under the threat of poor security reviews and the ensuing exclusion from lucrative government contracts, the NSA was able to exert enormous pressure upon manufacturers to prioritize security features over the rapid data access most consumers demanded. This fact did not escape the frustrated CEOs of America’s rapidly expanding technology companies, who opposed NSDD 145 as an excessive extension of government into the free market.

Academia too bristled at the implications of NSDD 145. Since the late 1970s, staff at Fort Meade had become increasingly concerned about the rising interest among academics in the previously exclusive domain of cryptology, triggered by the rapid proliferation of computers. A series of clumsy interventions had failed to address this concern, and had raised tensions between academics and the NSA in the process. University researchers and technology entrepreneurs jointly castigated the agency for undermining the American spirit of innovation through its use of the Invention Secrecy Act against a number of communication-related patents. Scientists and mathematicians accused the agency of censoring their academic freedoms when a Fort Meade employee threatened them with prosecution under export laws if they discussed their cryptography findings with foreign scholars. Moreover, academics collectively rounded upon the NSA for stifling scholarly creativity by pressuring the National Science Foundation to withhold funds from cryptography-related research grants.[83] Concerned about the impact on future recruitment from the “bum rap” the NSA had received, Inman took an unprecedented step and spoke directly to the press, using a 1978 interview to set out his agency’s concerns. The DIRNSA explained that American research into cryptography, coupled with academia’s widespread dissemination of resulting discoveries, raised the risk that America’s opponents would adopt the cryptanalytic techniques employed by the NSA, enhancing their defenses while undermining the security of America’s computer networks. “While some people outside NSA express concern that the government has too much power to control nongovernmental cryptologic activities,” Inman shared candidly, “my concern is that the government has too little.” He hoped that the NSA would receive the same authority over cryptology that the Department of Energy enjoyed over research into atomic energy—namely “born classified” control over all research in any way related to the field.[84]

Inman’s outreach to the academic community resulted in the establishment of the Public Cryptography Study Group. Created by the American Council on Education in the spring of 1980, it sought to address the NSA’s security concerns, while acknowledging the constitutional questions over what academics deemed to be the agency’s efforts to impose censorship upon their research.[85] The group released its report on February 7, 1981, proposing the compromise of voluntary censorship. The NSA reserved the right to notify any academics undertaking cryptologic research of its desire to review their findings prior to publication and, should the work trigger any security concerns, to request that those individuals refrain from publication. In case of disagreement, a five-member advisory committee appointed by the director of the NSA and the science advisor to the President would make the ultimate recommendation.[86] Predictably, researchers fiercely resisted NSA oversight, and take-up of the review system was incredibly poor. In response, Inman, having retired as DIRNSA and now serving as deputy director of the CIA, warned a 1982 meeting of the American Association for the Advancement of Science that the continued “hemorrhaging of the country’s technology” would cause a shift toward “legislated solutions” that were “likely to be much more restrictive [ . . . ] than the voluntary censorship system.”[87] Thus, upon its introduction, the academic community vehemently opposed NSDD 145, regarding it as the legislated solution Inman had threatened.

It was not just those working on cryptologic techniques who found their work swept up in NSDD 145’s expansive remit. By 1986, there were 3,200 electronic databases available worldwide through 486 online information services, seventy percent of which were produced in North America, many hosted by the university departments that had contributed to the original foundations of the Internet over the past decade.[88] In 1982, the CIA warned of the Soviet Union’s sustained theft of intellectual property from the databases run by these research institutions.[89] Late in 1985, Langley released an updated version of that report, listing the networks of sixty-two American universities targeted for scientific and technical espionage. The Soviets’ primary aim was the acquisition of information related to applied technology and engineering, although the report noted their agents also sought to gather “fundamental research for both Soviet military and civilian-related science developments.”[90] Further CIA analysis warned that Gorbachev’s industrial modernization plans, mixed with aggressive Western enforcement of strict export controls, could lead the Soviets to place “even greater emphasis on covert acquisition” of technological data.[91] Officials’ concerns, combined with NSDD 145’s broad definition of what constituted sensitive information, ensured that academic databases, computer networks, and the founding portals of the Internet forged in the spirit of scholarly collaboration fell under the NSA’s purview, confirming the academic community’s fears regarding government intervention and censorship, and prompting more resistance from academics to Latham’s directive.[92]

The biggest concern for privacy campaigners with NSDD 145 was not what was in the directive, but rather what was absent. The directive lacked safeguards, presenting enormous potential for the NSA to abuse its far-reaching powers. NSDD 145’s ambiguity over which private systems were connected to national security interests meant that the Pentagon enjoyed unprecedented access to civilian communications: commercial databases, computer networks, electronic links, and telephone lines. While nobody involved in the directive’s creation harbored plans for tyrannical domination, the question remained as to what could happen should a president in the mold of Richard Nixon be elected in the future, or an NSA director appointed who was willing to abuse their position in a manner akin to that of former FBI director J. Edgar Hoover. The United States had proven that its highest political offices were not immune to abuse, and as Senator Church had warned a decade prior to NSDD 145, the involvement of an intelligence agency as technologically capable as the NSA in domestic affairs could have terrible future consequences.[93] The government could “totally dominate the flow of all information in the United States,” noted investigative journalist Donald Goldberg in Omni, a popular science and technology magazine. Goldberg conspiratorially quoted an anonymous senior White House official: “Whoever controls communications, controls the country.”[94] Such a concentration of power, warned author and long-term NSA-watcher James Bamford, risked ushering in a “technotyranny.” The preservation of free speech, Bamford reasoned, demanded that alternative methods of secure communication remain beyond the control and oversight of the government.[95]

In an effort to address the myriad concerns over NSDD 145, Latham testified before a subcommittee of the House Science and Technology Committee on June 27, 1985. He began his address by clarifying the general climate that had led to the promulgation of the directive. Focusing upon national security, he told the assembled lawmakers: “Virtually every aspect of Government and private information is readily available to our adversaries.” “Unfriendly governments and international terrorist organizations,” the directive’s architect continued, “are finding easy pickings in the flood of unprotected telecommunications and automated data processing information afloat in this country.”[96] In a nod to the private sector, Latham added that the United States was “being bled to death by people taking our technology.”[97] Two days after Latham’s hearing, Reagan made his only public statement relating to the matter in a radio address focused upon counterintelligence activities. “We should begin by recognizing that spying is a fact of life,” the president noted matter-of-factly, primarily referring to Soviet espionage, while normalizing American surveillance too. “We don’t need to fight repression by becoming repressive ourselves,” Reagan reasoned, acknowledging the concerns raised by the opponents of his government’s computer policy, “but we need to put our cleverness and determination to work,” he countered. “There is no quick fix to this problem,” he warned, before exhorting NSDD 145’s opponents to “move calmly and deliberately together to protect freedom” without “hysteria or finger-pointing.”[98]

Latham’s appeal to Congress matched the classic definition of “securitization”: the process whereby an influential figure seeks to win wide acceptance that a specific state of affairs—in this case the United States’ cyber insecurity—poses an urgent, imminent, extensive, and existential threat.[99] This approach appeared justified two months later when a system administrator named Clifford Stoll, investigating a minor accounting discrepancy at Lawrence Berkeley Laboratory, inadvertently detected an illegal intrusion into the lab’s network.[100] The ensuing investigation revealed that the West Germany-based hackers had been paid by a KGB handler to penetrate DoD and contractor networks and steal data on American nuclear technology and the SDI program.[101]

Although the incident marked the dawn of the cyber espionage Latham had warned of, the United States’ response was sluggish and bureaucratic. Instead of launching a counterintelligence operation, as would befit Cold War-era spying, the novelty of the crime saw the ensuing investigation framed as a larceny by fraud case. When the four hackers were located and arrested, it was as criminals, not spies. Following the suicide of one of the group, the other three received suspended sentences ranging from fourteen months to two years, providing little deterrence to future digital incursions.[102]

In spite of Latham’s spirited defense, the timely discovery of Soviet hacking efforts, and Reagan’s brief intervention, NSDD 145 continued to encounter stiff opposition. The Congressional Committee on Government Operations, chaired by the directive’s long-term critic Jack Brooks, continued to organize hearings over the next two years. Its members listened to testimony from a broad coalition of interest groups, ranging from the American Bankers Association to the American Civil Liberties Union, all urging lawmakers to limit the DoD’s scope to intervene in private networks and restrict public access to online information.[103] In the midst of this battle over the future direction of America’s computer security, the Reagan administration became embroiled in the fallout of the Iran-Contra affair. As the president hemorrhaged political capital, Frank Carlucci, the new national security advisor (following the resignation of his predecessor John Poindexter for his role in the scandal), wrote to Brooks in the hope of reaching a compromise over the controversial directive. Carlucci informed the congressman that he had ordered his staff to “review immediately NSDD 145,” as well as instructing them to initiate the procedures to rescind the associated policy directive NTISSP 2.[104] Howard Baker, the president’s newly appointed chief of staff, followed up Carlucci’s letter four days later by assuring Brooks of the administration’s intention to “act promptly” on revisions to the directive. “It is my hope,” stated Baker, “that we can proceed in a cooperative fashion to reconcile with your committee whatever differences may remain.”[105]

The platitudes of Reagan’s staff were not enough to appease Congress. The last remnants of trust between the Reagan administration and lawmakers withered as the full details of the Iran-Contra affair emerged. Rather than securing a negotiated compromise over NSDD 145, the weakened president repudiated the directive. In what amounted to a total rejection of Latham’s vision for the future security of America’s private computer networks, Reagan had to sign Congress’s alternative, set out in the Computer Security Act of 1987. The new bill, sponsored by Brooks’ ally and fellow NSDD 145 critic Congressman Glickman, established that federal agencies did not have the authority to monitor or control the use of unclassified computerized information in the private sector.[106] It divided responsibility for securing the government’s networks between the NSA, which retained the classified national security networks, and the NBS, which was assigned responsibility for developing standards and guidelines for federal computer systems. Acknowledging the technical expertise gap between the two organizations, the bill instructed that the NBS was to be able to “draw upon the technical advice and assistance [ . . . ] of the National Security Agency, where appropriate.”[107] The implementation of this proved controversial. In 1988 the NSA signed a memorandum of understanding with the National Institute of Standards and Technology (formerly the NBS), outlining areas of necessary agency interaction. However, the exact lines of responsibility remained blurred, with the NSA accused both of seeking to overextend its influence in civilian cryptography and of withholding technology and expertise from its civilian partners.[108]

The Roots of Cyber Insecurity

The best way to catch a poacher, so the saying goes, is to employ one as your gamekeeper. The same adage applied to Latham’s thinking on placing the NSA at the heart of America’s computer network defense. What NSDD 145’s architect realized, and Congress did not, was that the dividing line between attacking foreign computer networks and defending domestic ones was already becoming obsolete. Only by understanding how to exploit security flaws could those responsible for protecting America’s systems truly know what to protect against, while also providing a degree of deterrence. Two decades after the rejection of Latham’s directive, Michael Hayden, serving as NSA director, argued that separating attack, defense, and exploitation in cyberspace made as much sense as America having three air forces – “one for reconnaissance, another for fighters, and a third for bombers.” Ultimately, Hayden concluded, “it was really all about control of the air,” just as NSDD 145 was all about control of cyberspace.[109] While CNE was not formally defined until the mid-2000s, the blurring of offensive and defensive measures it called for was, in many ways, nothing new for the DoD. American spy satellites photographed Soviet installations with the purpose of informing diplomatic negotiations, while also feeding into the Pentagon’s strategic planning. Air Force reconnaissance aircraft would purposefully penetrate Soviet airspace on surveillance flights in order to excite air defense radar and map its capabilities, and Navy attack submarines tapped undersea cables located near Russian naval bases to intercept communications and discover the deployment patterns of Soviet naval vessels.[110] The offensive or defensive character of any DoD asset thus depends upon the context within which it is deployed.

The primary concern Congress held over Latham’s efforts to amalgamate cyber attack and defense was that, unlike the pilots’ overflights or the submarine crews’ cable tapping, where the risks were limited to the military personnel on the mission, placing the NSA at the heart of America’s computer networks exposed society itself to the hazards of military intervention. Despite Latham’s agency background, his directive overlooked the fact that defensive measures remained a poor second to what the majority of Fort Meade personnel saw as their primary purpose—espionage. This created an irresolvable tension for the agency’s engineers, caught between their duty to identify and patch weaknesses in hardware and software, and their desire to exploit them for their own surveillance advantage.[111] The discovery that one of the most disruptive cyber attacks to date—2017’s WannaCry ransomware, which affected over 300,000 computers in more than 150 countries—emerged from the theft of one of the NSA’s own security exploits laid bare Fort Meade’s role in sustaining cyber insecurity, exposing the agency as something very different from the cyber guardian NSDD 145 presented it as.[112] Moreover, Edward Snowden’s theft of material from the NSA itself, the largest leak of classified material in the history of the United States, revealed that regardless of an organization’s technical prowess, there will always be what Ware described in 1968 as “human vulnerabilities.” Finally, those very leaks proved that lawmakers’ concerns that the NSA would end up conducting domestic surveillance upon the citizens it was charged to protect were not unfounded. As the Church Committee’s revelations over Operations MINARET and SHAMROCK had already hinted, without strict restraints the NSA is tempered in its pursuit of data collection only by the technological limitations of its engineers. “The NSA is an organization that’s like a hammer,” reflected the veteran government bureaucrat Richard Clarke following his part in the post-Snowden review of the agency’s activities, “and everything looks like a nail.” “If you’re not specific,” he continued, “an agency that bugs phones is going to bug phones.”[113] Seen in this light, Congress’s decision to keep Fort Meade out of private networks made absolute sense.

Private sector and academic opposition to NSDD 145 was also perfectly legitimate. In the spring of 1986, Inman spoke at Harvard’s Center for Information Policy Research. Recently retired from government, the former DIRNSA had taken the post of president and CEO of a joint research venture, the Microelectronics and Computer Technology Corporation, and shared his thoughts on how businesses could best adapt to the disruption wrought by inevitable technological evolution—what he labeled “the cost of change.” Interestingly, this was precisely what Inman had faced during his tenure at Fort Meade. The explosion in affordable computers and modems had seen between five and ten million devices purchased by the general public in the United States by the mid-1980s, transforming the economy and society.[114] The NSA’s loss of its monopoly over technological development and cryptology was an inevitable cost of this change. Ironically, for an agency tasked with operating at the cutting edge of the technological frontier, its response, eventually encapsulated in NSDD 145, was deeply conservative. George I. Davida, a computer scientist and one of the first victims of the NSA’s efforts to retain control over technological development through the Invention Secrecy Act, accurately accused the agency of “struggling to stand still, and to keep American research standing still with it.” This, he argued, would allow “the rest of the world” to “race ahead.”[115] Developed nations were on the cusp of the digital age. While presenting new risks, this technological transformation also offered unprecedented promise. NSDD 145 focused too exclusively upon the threats, without considering the impact this pursuit of security would have on the opportunities. Ultimately, the World Wide Web—a product of the exact sort of openness and collaboration NSDD 145 sought to contain—ushered in an unprecedented period of global communication and commerce, and with it, American soft power. Technological innovation has long been at the core of the United States’ national security, a paradoxical dependence offering both promise and peril. Rather than trying to constrain innovation, Davida offered a much more American solution: “the NSA can best perform its mission in the old fashioned way,” he advised, “stay ahead of others.”[116]

The NSA is the most secretive branch of the intelligence community. While its capabilities made it the ideal technical candidate to defend America’s networks, the unique combination of its opaque nature, its history of spying upon Americans, its efforts to limit technological innovation, and its primary goal of exploiting networks for surveillance raised questions about different layers and concepts of security. NSDD 145’s opponents voiced legitimate concerns that putting civilian and commercial networks under military control risked exchanging the fundamental tenets of the United States as a free society for more secure computer networks. The tensions between citizens, the private sector, and government that NSDD 145 caused, or in some cases heightened, forced political leaders to intervene. Congress’s rejection of the directive may well have left computer networks more vulnerable to foreign intrusion than they would have been had the NSA been given free rein. Regardless, lawmakers’ decision to limit the regulatory powers of the American state seems fitting for a president who so vehemently extolled the virtues of a free society over those of its authoritarian rivals. Balancing the freedom, privacy, and civil liberties of citizens against the security of online networks is not something the regimes in China, Russia, Iran, or North Korea must dedicate the same thought to as the United States. And while the cost may be that America’s cyber domain is less secure than theirs, it should be remembered that this vulnerability is the product of a choice—one based upon the preservation of a much grander idea of a citizen’s security and safety than keeping opportunistic hackers out of a campaign manager’s email account. The battle over NSDD 145 showed that policymakers had only just started to grasp the complexities of the new computerized world, but the endurance of America’s vulnerability reveals that the tensions between individual liberty, privacy, and national security are still being resolved.

-----------------------

* I would like to thank the organizers of the “Ronald Reagan and the Transformation of Global Politics” conference, hosted at the Clements Center for National Security, University of Texas at Austin, where an early draft of this paper was first presented. I am also grateful to Professor Kendrick Oliver for his constructive comments on the final draft.

[1] Thomas Baker, “Computers and Political Campaigns: A Look at Technology and Social Change,” Center for Information Policy Research, June 1983, accessed February 20, 2017, .

[2] Donald P. Green and Michael Schwam-Baird, “Mobilization, participation, and American Democracy: A retrospective and postscript,” Party Politics 22 (2016): 158–64.

[3] Marc Ambinder, “How Democrats Won the Data War In 2008,” Atlantic, October 5, 2009, accessed February 27, 2017, ; “All CFI Funding Statistics for the 2008 Presidential Primary and General Election Candidates,” Campaign Finance Institute, January 8, 2010, accessed February 27, 2017, .

[4] Baker, “Computers and Political Campaigns.”

[5] Lee Glendinning, “Obama, McCain Computers ‘Hacked’ during Election Campaign,” Guardian, November 7, 2008, accessed February 27, 2017, .

[6] Malcolm Nance, The Plot to Hack America: How Putin’s Cyberspies and WikiLeaks Tried to Steal the 2016 Election (New York, 2016), 125–26.

[7] Ibid., 144.

[8] “Joint Statement from the Department of Homeland Security and Office of the Director of National Intelligence on Election Security,” October 7, 2016, , accessed December 22, 2016, .

[9] Sean Gallagher, “Officials blame ‘Sophisticated’ Russian Hackers for US Voter System Attacks,” arstechnica.co.uk, August 31, 2016, ; Adam Entous and Ellen Nakashima, “FBI in Agreement with CIA that Russia Aimed to Help Trump Win White House,” Washington Post, December 16, 2016.

[10] “Russia’s Influence Campaign Targeting the 2016 US Presidential Election,” Office of the Director of National Intelligence, 1–5, accessed December 23, 2016, .

[11] “Joint Statement on Reports That Russia Interfered with 2016 Election,” Senate Committee on Armed Services, December 11, 2016, accessed December 23, 2016, .

[12] Nance, The Plot to Hack America, 124.

[13] William J. Burns and Jared Cohen, “The Rules of the Brave New Cyberworld,” Foreign Policy, February 16, 2017.

[14] Jason Healey, ed., A Fierce Domain: Conflict in Cyberspace, 1986 to 2012 (Washington, DC, 2013), 15–16.

[15] John Naughton, “The Evolution of the Internet: From Military Experiment to General Purpose Technology,” Journal of Cyber Policy, 1 (2016): 11.

[16] Ibid., 11; Gregory R. Gromov, “History of the Internet and WWW,” accessed January 2, 2017, .

[17] Michael Kende, “The Digital Handshake: Connecting Internet Backbones,” Federal Communications Commission Office of Plans and Policy, (2000): 4–5, accessed February 27, 2017, .

[18] Raphael Cohen-Almagor, “Internet History,” International Journal of Technoethics 2, (April–June 2011): 52.

[19] Robert H’obbes Zakon, “Hobbes’ Internet Timeline,” accessed March 15, 2017, .

[20] James Curran and Jean Seaton, Power Without Responsibility: Press, Broadcasting and the Internet in Britain (London, 2010).

[21] Thomas R. Johnson, American Cryptology during the Cold War, 1945–1989, Book IV: Cryptologic Rebirth, 1981–1989 (Fort Meade, MD, 1999): 271.

[22] “Preparing for the 21st Century: An Appraisal of U.S. Intelligence,” Report of the Commission on the Roles and Capabilities of the United States Intelligence Community, March 1, 1996, accessed December 27, 2016, .

[23] Johnson, American Cryptology during the Cold War, 271.

[24] Willis Ware, “Security and Privacy in Computer Systems,” RAND, April 1967, accessed December 30, 2016, .

[25] “Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security,” Memorandum for Chairman, Defense Science Board, February 11, 1970, vi., accessed December 21, 2016, .

[26] Ibid., viii.

[27] Healey, A Fierce Domain, 28.

[28] “Steve Walker bibliography,” ; Computerworld, XVI, (June 7, 1982), 122.

[29] Johnson, American Cryptology during the Cold War, 293.

[30] Myriam Dunn Cavelty, Cyber-Security and Threat Politics: US Efforts to Secure the Information Age (New York, 2008), 46.

[31] “Timeline: The U.S. Government and Cybersecurity,” Washington Post, May 16, 2003, accessed December 21, 2016, .

[32] “Counterfeit Access Device and Computer Fraud and Abuse Act of 1984,” H.R. 5112 - 98th Congress, March 13, 1984, accessed January 2, 2017, .

[33] “Keeping the Nation’s Secrets: A Report to the Secretary of Defense, Commission to Review DOD Security Policy and Practices,” 1985, Federation of American Scientists, accessed February 28, 2017, .

[34] William J. Broad, “Computer Security Worries Military Experts,” New York Times, September 25, 1983, accessed February 27, 2017, .

[35] William D. Marbach and Madlyn Resener, “Beware: Hackers at Play,” Newsweek, September 5, 1983, Technology, 42.

[36] Broad, “Computer Security Worries Military Experts.”

[37] Fred Kaplan, Dark Territory: The Secret History of Cyber War (New York, 2016), 19.

[38] Frances Fitzgerald, Way Out There in the Blue: Reagan, Star Wars and the End of the Cold War (New York, 2000), 22–23.

[39] Fred Kaplan, “‘WarGames’ and Cybersecurity’s Debt to a Hollywood Hack,” New York Times, Movies, February 19, 2016, accessed December 30, 2016, .

[40] Scott Brown, “WarGames: A Look Back at the Film That Turned Geeks and Phreaks Into Stars,” Wired, July 21, 2008, ; Ronald Reagan Presidential Library, “Films President and Mrs. Reagan Viewed,” accessed December 30, 2016, .

[41] James Bamford, The Puzzle Palace: Inside the National Security Agency, America’s Most Secret Intelligence Organization (New York, 1983), 427; Anne Brown, “The National Security Agency Scientific Advisory Board 1952–1963,” NSA Historian Office of Central Reference, September 1965, 5, accessed February 16, 2017, .

[42] Kaplan, Dark Territory, 11.

[43] Kaplan, “WarGames.”

[44] “Nomination of Donald C. Latham To Be Assistant Secretary of Defense,” May 17, 1984, in American Presidency Project, accessed March 9, 2017, ; Statement of Donald Latham, “Computer Security Policies,” Hearing before the Subcommittee on Transportation, Aviation, and Materials of the Committee on Science and Technology, House of Representatives, June 27, 1985, 31.

[45] NSC-24, “Telecommunications Protection Policy,” February 9, 1979, Jimmy Carter Library, accessed March 1, 2017, .

[46] Jerry Berman, “National Security vs. Access to Computer Databases,” First Principles, 12 (June 1987): 2–4.

[47] Kaplan, Dark Territory, 19.

[48] J. V. Gray et al., “Information Operations: A Research Aid,” Institute for Defense Analysis, September 1997, 68, accessed February 28, 2017, .

[49] “Information Operations (IO),” Department of Defense Directive o-3600.01, August 14, 2006, 9, accessed February 28, 2017, .

[50] “Computer Operating System Vulnerabilities,” Cryptolog 51 (March 1979): 13–16, accessed February 23, 2017, .

[51] Isaac R. Porche III, Jerry M. Sollinger, and Shawn McKay, “A Cyberworm That Knows No Boundaries,” RAND, (2011): ix–x, accessed February 23, 2017, .

[52] Clifford Stoll, “Stalking the Wily Hacker,” Communications of the ACM, 31 (May 1988): 489.

[53] “The AT&T Divestiture & National Security,” Cryptolog 91 (August 1985): 10, accessed March 1, 2017, .

[54] Robert J. Hanyok, “Some Reflections on the Reality of Computer Security,” Cryptolog 70 (June–July 1982): 23–24, accessed February 23, 2017, .

[55] Robert Conley, “Data Security in the Information Age,” Center for Information Policy Research, (Spring 1986): 39–41, accessed February 20, 2017, .

[56] Reagan Presidential Library, National Security Decision Directive 145, “National Policy on Telecommunications and Automated Information Systems Security,” White House, September 17, 1984, accessed January 3, 2017, ; Kaplan, Dark Territory, 20; Michael Warner, “Cybersecurity: A Pre-history,” Intelligence and National Security, 27 (2012): 786–88.

[57] NSDD 145.

[58] Latham, “Computer Security Policies,” 29–31.

[59] NTISSP No. 2, “National Policy on Protection of Sensitive But Unclassified Information in Federal Government Telecommunications and Automated Information Systems,” October 29, 1986, accessed March 2, 2017, .

[60] U.S. Congress, Office of Technology Assessment, Science, Technology, and the First Amendment, Special Report (Washington, DC, 1988), 62–63.

[61] NSDD 145; Latham, “Computer Security Policies,” 6; Kaplan, Dark Territory, 20; Warner, “Cybersecurity,” 786–88.

[62] Johnson, American Cryptology During the Cold War, 294.

[63] Donald Goldberg, “The National Guards,” Omni, May 1987, Federation of American Scientists, accessed December 12, 2016, .

[64] Loch K. Johnson, A Season of Enquiry: Congress and Intelligence (Chicago, IL, 1988), 253, 262.

[65] Ibid., 112–13; Kathryn Olmsted, Challenging the Secret Government: The Post-Watergate Investigations of the CIA and FBI (Chapel Hill, NC, 1996), 104–5; L. Britt Snider, “Recollections from the Church Committee’s Investigations of NSA,” CIA Library, accessed January 3, 2016, .

[66] “Report on Inquiry into CIA-Related Electronic Surveillance Activities,” Justice Department, June 30, 1976, 26–36, National Security Archive, accessed March 2, 2017, .

[67] Bamford, Puzzle Palace, 427, 806–7.

[68] Church Committee Final Report, Book II, 201; Latham, “Computer Security Policies,” 3.

[69] Senator Frank Church, NBC “Meet the Press”, August 17, 1975, YouTube, accessed February 23, 2017, .

[70] “Final Report of the United States Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities,” S.Rep. No.755, 94th Congress, 2d Sess, April 29, 1976 (Church Committee Report).

[71] Sherri J. Conrad, “Executive Order 12,333: Unleashing the CIA Violates the Leash Law,” Cornell Law Review 70 (June 1985): 968.

[72] Americo R. Cinquegrana, “The Walls (and Wire) Have Ears: The Background and First Ten Years of the Foreign Intelligence Surveillance Act of 1978,” University of Pennsylvania Law Review 137 (1989): 814.

[73] Robert Pear, “Intelligence Groups Seek Power to Gain Data on U.S. Citizens,” New York Times, Front Page, March 10, 1981, accessed February 22, 2017, .

[74] William J. Daugherty, Executive Secrets: Covert Action and the Presidency (Lexington, KY, 2006), 184.

[75] Joseph Persico, Casey (New York, 1990), 290; Loch K. Johnson, National Security Intelligence (Malden, MA, 2012), 263; Rhodri Jeffreys-Jones, The CIA and American Democracy (New Haven, CT, 2003), 229; Richard Immerman, The Hidden Hand: A Brief History of the CIA (Malden, MA, 2014), 57.

[76] Pear, “Intelligence Groups Seek Power to Gain Data on U.S. Citizens.”

[77] Bamford, Puzzle Palace, 472-473.

[78] John Napier Tye, “Meet Executive Order 12333: The Reagan Rule That Lets the NSA Spy on Americans,” Washington Post, Opinions, July 18, 2014, accessed February 22, 2017,

[79] Staff Report of the Senate Select Committee on Intelligence, United States Senate, Unclassified Summary: Involvement of NSA in the Development of the Data Encryption Standard, April 1978, 3-4, accessed December 5, 2016, ; Warner, “Cybersecurity,” 785.

[80] Bamford, Puzzle Palace, 448.

[81] DoD Directive 5215, “Computer Security Evaluation Center,” October 25, 1982, Federation of American Scientists, accessed March 6, 2017, .

[82] Bamford, Puzzle Palace, 456.

[83] Deborah Shapley, “Intelligence Agency Chief Seeks ‘Dialogue’ with Academics,” Science 202 (October 27, 1978): 407–10; Gina Bari Kolata, “Cryptography: A New Clash Between Academic Freedom and National Security,” Science 209 (August 1980): 995–96.

[84] Ibid.

[85] Robert Creighton Buck, “The public cryptography study group,” Computers & Security 1/3 (November 1982): 249-254, 249.

[86] “Report of the Public Cryptography Study Group,” February 7, 1981, Cryptologia 5 (1981): 130–42.

[87] Christopher Paine, “Admiral Inman’s tidal wave,” Bulletin of the Atomic Scientists 38 (March 1982): 3–6.

[88] Science, Technology, and the First Amendment, 64.

[89] Dan Morgan, “Stolen U.S. Technology Boosts Soviet Strength, Report Says,” Washington Post, November 15, 1982, A1.

[90] CIA, “Soviet Acquisition of Militarily Significant Western Technology: An Update,” September 1985, 21–24, accessed March 22, 2017, .

[91] CIA, “USSR Review: Soviet Industrial Modernization,” September-October 1985, v, 37, accessed March 7, 2017, .

[92] Irwin Goodwin, “Making Waves: Poindexter Sails Into Scientific Data Bases,” Physics Today 40 (January 1987): 52.

[93] Clarke et al., “Liberty and Security in a Changing World: Report and Recommendations of The President’s Review Group on Intelligence and Communications Technologies,” December 12, 2013, 53–57; Geoffrey Stone, Perilous Times: Free Speech in Wartime from the Sedition Act of 1798 to the War on Terrorism (New York, 2004), 487–500.

[94] Goldberg, “The National Guards.”

[95] Bamford, Puzzle Palace, 476–77.

[96] Latham, “Computer Security Policies;” Jerry J. Berman, “National Security vs. Access to Computer Databases: A New Threat to Freedom of Information,” National Security and Civil Liberties 12 (June 1987): 4.

[97] Latham, “Computer Security Policies,” 31; David Burnham, “Lack of Security In Computers Seen,” New York Times, June 28, 1985, accessed March 10, 2017, .

[98] Ronald Reagan, “Radio Address to the Nation on Counterintelligence Activities,” June 29, 1985, Reagan Library, accessed February 28, 2017, .

[99] Helen Nissenbaum, “Where Computer Security Meets National Security,” Ethics and Information Technology 7 (2005): 66.

[100] Stoll, “Stalking the Wily Hacker,” 484.

[101] Clifford Stoll, The Cuckoo’s Egg: Tracking a Spy through the Maze of Computer Espionage (New York: 1989); Healey, A Fierce Domain, 30; Warner, “Cybersecurity,” 786–88.

[102] Healey, A Fierce Domain, 29–30; Times Wire Services, “Hackers Guilty of Selling Codes,” Los Angeles Times, February 15, 1990, accessed May 4, 2017, .

[103] House Committee on Government Operations, Committee on Science, Space, and Technology, Computer Security Act of 1987, Report to accompany H.R. 145, June 11, 1987, accessed December 12, 2016, .

[104] Letter from Frank Carlucci to Jack Brooks, March 12, 1987, in Computer Security Act of 1987 Hearings, Computer Security Act of 1987, Hearings before a Subcommittee of the Committee on Government Operations, House of Representatives, H.R. 145, February 25, 26, and March 17, 1987, 386, accessed March 11, 2017, .

[105] Letter from Howard H. Baker to Jack Brooks, March 16, 1987, in Computer Security Act of 1987, 387.

[106] Linda Greenhouse, “Computer Security Shift Is Approved by Senate,” New York Times, December 24, 1987, accessed January 3, 2017, .

[107] H.R. 145, “Computer Security Act of 1987,” January 8, 1988, accessed January 4, 2017, ; Johnson, American Cryptology during the Cold War, 294.

[108] Susan Landau et al., Codes, Keys and Conflicts: Issues in U.S. Cryptographic Policy, Report of a Special Panel of the ACM U.S. Public Policy Committee, June 1994 (New York: 1994), 41–45, accessed 12 December, 2016, .

[109] Michael Hayden, Playing to the Edge: American Intelligence in the Age of Terror (New York, 2016), 134.

[110] Dino A. Brugioni, Eyes in the Sky: Eisenhower, the CIA, and Cold War Aerial Espionage (Annapolis, MD, 2010), 69; Sherry Sontag and Christopher Drew, Blind Man’s Bluff: The Untold Story of American Submarine Espionage (London, 2000).

[111] Kaplan, Dark Territory, 138–39.

[112] Russell Brandom, “The NSA’s leaked Windows hack caused more damage than just WannaCry,” Verge, May 17, 2017, accessed July 8, 2017, .

[113] Richard Clarke quoted in Lloyd Gardner, War on Leakers: National Security and American Democracy, From Eugene V. Debs to Edward Snowden (New York, 2016), 195–96.

[114] Latham, “Computer Security Policies,” 4.

[115] George I. Davida, “The Case Against Restraints on Non-Governmental Research in Cryptography: A Minority Report,” July 1981, reprinted in Cryptologia 5 (2010): 143–48.

[116] Ibid., 147–48.
