As technology has given us advanced means of creating, storing and communicating information, it has also made that information more vulnerable. Consider the example of our armed forces. Our armed forces are the most technologically advanced in the world. The Defense Information Infrastructure (the "DII") operates in support of the military's warfighting, intelligence, and business functions. The Department of Defense (the "DoD") is extremely dependent on computer systems to fly, fight, feed and track our troops. The protection of these systems is thus essential to national security.
For example, computerized logistic systems that direct supplies to the appropriate post or base must in time of crisis or war get the right number of bullets or gas masks to the military installation that needs them. If toothbrushes were to arrive instead of bullets, it would obviously have a dramatic effect on a military deployment, exercise or action. Or if a foreign enemy were able to track the movement of such supplies, strategic decisions would lose their confidentiality.
However, over 90% of the DII is composed of unclassified systems. An unclassified computer system is one in which every individual file on the system is unclassified. Yet, while each file individually is considered unclassified, these systems contain literally thousands of "sensitive"4 files, including research and development for warfighting systems, intelligence data, troop movements and weapons procurement.
In the days before computer systems, this unclassified information was far better protected. Each file was in a file cabinet that was probably locked. This file cabinet would be located in an office that was probably behind a locked door in a government building that might even have an armed guard. This government building would likely be on a military installation that had a fence and gate guards.

4 "Sensitive information" is defined as unclassified information "the loss, misuse, unauthorized access to or modification of which could adversely affect the national interest or the conduct of Federal programs, or the privacy to which individuals are entitled" under the Privacy Act (15 U.S.C. Section 278g-3(d)(4)).
To access all of this unclassified information, the adversary would have to get onto the military installation and into each building, each room and each file cabinet. Then, the adversary would have to somehow remove all of the paper documents or reproduce them without being detected. The DoD would never consider removing its perimeter fences, gate guards, door locks or file cabinets, nor would it consider allowing unauthorized personnel to roam its installations or to have access to its paper documents.
In the virtual world, however, all of these unclassified documents may be located on one server that is connected to virtually any other computer anywhere in the world. An intruder could electronically bypass the installation gate guard, enter the building and, with a few keystrokes, rummage through all of the file cabinets (or only those files needed, using a keyword search), make copies of all of the files, and leave without ever being detected.
Once in the electronic files, an intruder could also modify the information. The intruder could install "time bombs"5 that would destroy or change the information at a predetermined time or event. Some might do this as a prank, while others may have a more sinister purpose such as adversely affecting the readiness of military units.
5 A "time bomb" or "Trojan horse" is a hacker technique used to compromise or disrupt systems. It is usually a hidden function in a computer program that the user-victim is unaware of.
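The mechanics of such a "time bomb" can be illustrated with a deliberately harmless sketch in Python. Everything here is invented for illustration: the trigger date, the record layout, and the "payload," which merely blanks data in memory rather than destroying anything.

```python
from datetime import date

TRIGGER_DATE = date(1999, 1, 1)  # hypothetical trigger date, chosen for illustration

def process_record(record, today=None):
    """An ordinary-looking routine containing a hidden, date-triggered branch."""
    today = today or date.today()
    if today >= TRIGGER_DATE:
        # The hidden "payload": here it merely blanks the record in memory,
        # standing in for the destructive behavior a real time bomb might perform.
        return {key: None for key in record}
    return record

# Before the trigger date the routine behaves normally.
print(process_record({"qty": 500}, today=date(1998, 6, 1)))  # {'qty': 500}
# On or after the trigger date, the hidden branch fires.
print(process_record({"qty": 500}, today=date(1999, 2, 1)))  # {'qty': None}
```

The point of the sketch is that the hidden branch is indistinguishable from normal behavior until the trigger condition is met, which is why such code is so difficult to detect through ordinary testing.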
It is not merely the theft of information with which the DoD, or any other agency, must be concerned. Our military leaders must have confidence in the accuracy and integrity of their data and information. A changed mathematical formula could alter the flight path of missiles or aircraft. Shifted decimal points in the DoD's finance system could wreak havoc.
Moreover, the DoD must at all times be able to access its information. The destruction or denial of access to certain information could have severe implications for a unit's ability to carry out its mission.
In the physical world, our Defense Department would never allow its information to be at risk in the manner it is in the virtual, electronic world. Senior leaders and managers understand the threats in the physical world, but are only recently discovering the threat in the virtual world.
What is true for our armed forces is just as true for other parts of the government and the private sector. Identifying and addressing vulnerabilities is critical. What then are the major vulnerabilities of our information infrastructure? The Staff has observed vulnerabilities in three main areas: (1) software and hardware weaknesses; (2) human weaknesses; and (3) the lack of a security culture. Each of these vulnerabilities can be exploited to allow intruders unauthorized access to information systems, leaving the information or those systems subject to theft, manipulation, or other forms of attack.
Hardware and software flaws and weaknesses arise from product developers' basic assumption that all users can be trusted. Rarely is security a major consideration in the research and development of information systems. In addition, competitive pressure forces companies to field applications as quickly as possible, often without the benefit of comprehensive testing for inherent flaws. The industry relies on users to report product flaws; in turn, the industry will either fix the flaw or release a new version of the product. Of course, new versions of products may also have new flaws.
Hackers exploit these inherent flaws and are able to disseminate their techniques globally. Hackers are well connected and well organized, and share information about specific vulnerabilities regularly. There are forums for hackers that include physical as well as electronic meetings. Hackers publish glossy magazines in which they share vulnerabilities and techniques and trade "war" stories about their individual attacks. Phrack magazine, on-line since 1985, is one of the most popular hacker magazines, providing the hacker underground with information about different computer operating systems, networks, and telephone systems.
Hackers also meet regularly on what is called Internet Relay Chat ("IRC") for on-line conversations called "chats." Hacking tips and techniques are easily passed through these sessions. In addition, there are well-publicized hacker conventions all over the world at which face-to-face exchanges of techniques take place.
6 "Hardware" is the physical computer equipment; "software" is the program that runs computer applications.
Technology has made it much easier for hackers to exploit hardware and software flaws. In the early 1980s, only very technically competent individuals had the expertise to break into computer systems. Not only were there fewer hackers, there were not as many targets to attack.
This has changed dramatically in the past two years. The proliferation of computers has created a new universe of targets in government, the military and private industry. Much more of the population has access to computers at work and at home. The vast majority of people who buy computers today receive bundled software packages that give them Internet access.
Similarly, many more people today have the capability to develop hacker tools than fifteen years ago. Colleges, universities and technical schools graduate tens of thousands of computer experts yearly, many of whom are highly trained in methods to secure and exploit software programs. A small percentage, but nevertheless a significant number, of these people can and do develop tools and techniques to break into the computers and networks of others.
Unfortunately, while hackers' tools are becoming more and more sophisticated, they are also becoming more user-friendly, requiring very little expertise to operate. Point-and-click technology, in the form of graphical user interfaces, has given anyone with a computer, a modem, and access to the Internet the capability to break into someone else's computer anywhere in the world.
For example, point-and-click software such as SATAN ("Security Administrator Tool for Analyzing Networks"), which was disseminated on the Internet in April 1995, is a series of hacking tools that can be used by individuals with very little expertise. SATAN scans systems to find network-related security problems and reports whether the
vulnerabilities exist on a tested system without actually exploiting them. Although SATAN was intended for systems administrators and security professionals to analyze their networks for security vulnerabilities, potential intruders use this tool to identify and attack government and private networks.
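The basic operation SATAN automates, probing a host to see which network services are reachable and reporting findings without exploiting anything, can be sketched in a few lines of Python. The host address and port list below are placeholders chosen for illustration; SATAN itself tested for many specific, well-known vulnerabilities rather than merely open ports.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Report which TCP ports on `host` accept a connection.
    Like SATAN, this only observes what is reachable; it exploits nothing."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

# Scanning one's own machine for a few well-known service ports:
print(scan_ports("127.0.0.1", [21, 23, 25, 80]))
```

The same dual-use character the Staff describes is visible even in this toy: run against one's own systems it is an audit; run against someone else's, reconnaissance.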
Rootkit is a series of public domain software tools developed by hackers that allow an intruder to gain root access to networks. Root access is the ultimate access: that of a systems administrator. Someone with root access can read, alter, or destroy any and all data on a network.
Internet Protocol ("IP") spoofing is a technique attackers use to gain access to a system by masquerading as another Internet system that the targeted system trusts. IP spoofing can also conceal the attacker's identity even if the victim system determines that the intruder is unauthorized.
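The weakness that IP spoofing exploits is a trust decision based solely on a claimed network address. A minimal sketch of such a check, using an invented "trusted" address, shows why it is inadequate: the source address in a packet is supplied by the sender, so an attacker can simply claim the trusted address.

```python
# Hypothetical trusted peer, as in rlogin-style address-based trust.
TRUSTED_HOSTS = {"192.0.2.10"}

def accept_connection(claimed_source_ip):
    """Naive check: trust is granted purely on the packet's claimed source
    address. Because that address is attacker-controlled, a forged packet
    claiming to come from 192.0.2.10 passes this check."""
    return claimed_source_ip in TRUSTED_HOSTS

print(accept_connection("192.0.2.10"))   # True, whether or not genuine
print(accept_connection("203.0.113.5"))  # False
```

A sounder design authenticates the peer cryptographically rather than by address, since nothing in this check distinguishes the real trusted host from an impostor.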
These tools and techniques can be extremely effective. The Defense Information Systems Agency ("DISA") has been performing pro-active electronic "Red Teaming" of Department of Defense systems for over three years. DoD commanders can request and authorize DISA's team of computer security experts to attempt to electronically penetrate their systems. DISA's experts will only attack a DoD system using hacker attack software tools or techniques that are already widely available on the Internet.7
7 Furthermore, DISA, in a spirit of fairness, will only use hacker tools for which there is a published "fix" and for which DISA has published an official alert.
As of May 1996, DISA is able to electronically break into 65% of the systems it attacks using commonly available attack tools found on the Internet8. In other words, only 35% of DoD unclassified computer systems withstood these attacks. DISA officials have told the Staff that the 65% figure is conservative: it reflects an average one-week dedicated attack against a particular network. These officials report that, given more time to attack a targeted network, they could probably compromise upwards of 95-98% of the systems.
Another potential vulnerability in terms of software is in the use of commercial off-the-shelf software ("COTS"). Ten years ago software was developed specifically for the government and generally by the government. The government owned the programming code that ran the applications. The government also knew what was in the code. The government knew what the code was supposed to do and exactly what it did. If the government needed changes to the code, it would make the changes or hire a contractor to modify the code.
Today's environment is much different. The government no longer has very many mainframe computer systems that require specialized programming code. It is much more cost effective to buy off-the-shelf computer hardware and off-the-shelf computer software packages. The problem with commercial off-the-shelf software is that the software's programming source code is proprietary and usually a trade secret that the government cannot examine. The government only purchases a license to use the commercial software. The purchaser knows what it wants to use the software for, but may not know everything the software can do. Software packages can include features that are possibly undocumented9 and potentially unwanted.

8 This statistic is based on over 30,000 electronic penetrations performed as of May 1996. These statistics have improved over the last two years. Just prior to the Subcommittee's May 22, 1996 hearing, DISA reported it was able to attack DoD systems successfully 88% of the time. The improvement may reflect greater awareness among computer users within the Defense Department, or changes in DISA's vulnerability assessment protocol.
The typical user is completely dependent on what the vendor provides. As long as the software does what it is intended to do, it is not questioned. What if software purchased off-the-shelf contained a bug that was to be triggered on a certain date and was programmed to change or destroy a system's database? Would government or business be able to recover the information lost? This, unfortunately, is the great unknown that comes with commercial off-the-shelf products.
Perhaps the biggest source of information systems vulnerability is the people who use and manage computer systems and networks. The proliferation of computers and their ever-increasing ease of use have put incredibly sophisticated systems containing very valuable information under the control of millions of people who do not yet grasp the need to maintain security or the consequences of a breach of security.
Often the simplest conduct can create vulnerabilities. Leaving a machine on gives anyone who wanders by access; using easy-to-remember passwords affords intruders easy opportunities to access systems; and leaving a password with numerous office colleagues or writing it on the computer itself is also a security risk.

9 For instance, the recently introduced and highly popular Microsoft "Windows 95" operating system contained a built-in undocumented feature (known in the computer field as an "Easter Egg") of which the Microsoft Corporation was unaware until after production. When using this software, striking a certain combination of keystrokes causes the names of the Microsoft development team to scroll across the monitor. The Staff would emphasize that this feature was not sinister, only frivolous. Nonetheless, the data and software for this undocumented feature reside in a very significant number of the world's computer systems, and virtually no one knows about it.
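The danger of easy-to-remember passwords can be made concrete with a sketch of the simplest form of "dictionary attack." The word list, account name, and passwords below are toy stand-ins invented for illustration; real attackers use dictionaries of many thousands of common passwords.

```python
# Toy stand-in for a real cracking dictionary of common passwords.
COMMON_PASSWORDS = ["password", "letmein", "admin", "guest"]

def dictionary_attack(check_login, username):
    """Try each common password in turn against a login-checking function;
    return the first guess that works, or None if none do."""
    for guess in COMMON_PASSWORDS:
        if check_login(username, guess):
            return guess
    return None

# A victim account "protected" by an easy-to-remember password:
accounts = {"jsmith": "letmein"}
check = lambda user, pw: accounts.get(user) == pw
print(dictionary_attack(check, "jsmith"))  # letmein
```

An account whose password does not appear in any such dictionary survives this attack entirely, which is the practical argument for password rules that forbid common words.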
People's trust is also often a source of vulnerability. For example, a popular feature on the Internet is the "chat room," in which individuals anywhere on the Internet can join in and communicate with others through text transmission. Chat rooms, however, provide little assurance of the true identity of the participants: a participant could be a student, business person, computer enthusiast, saboteur, or foreign intelligence agent. Nevertheless, individuals share information with strangers that might include personal information as well as sensitive business or, in some circumstances, classified information.
One such example involves the case of the U.S. Air Force pilot who was shot down over Bosnia. After he was recovered, one of his fighter pilot colleagues went on-line with a very detailed account of the actual recovery of the downed pilot. Much of the information provided in this open Internet forum may have been classified, or at least very sensitive. Literally tens of thousands of copies of this fighter pilot's e-mail were read and forwarded to others, including the news media.
The trusting nature of individuals also leaves them open to a hacker technique known as social engineering. Social engineering involves hackers impersonating authorized users, customers, vendors, or others to persuade unwitting authorized employees to divulge critical information such as logons and passwords. Although very "low-tech," this technique continues to reap benefits for hackers. Social engineering exploits employees' lack of security training and awareness and the emphasis placed on customer service. It is the computer world's equivalent of the old-style "confidence game."
Another significant vulnerability is the inability of the managers who run systems to detect intrusions. Of the DoD systems compromised by the electronic Red Teaming performed by DISA, only 4% of the managers or users of compromised systems actually detected the intrusion. The primary reason systems administrators are unable to detect these attacks is the lack of a security culture within government and private industry. Even those entities that take security seriously, though, are hindered by the lack of adequate tools to help systems administrators and computer security professionals detect these invisible crimes.
Of the 4% of DoD systems administrators who did detect the electronic intrusion by DISA experts, only 27% (roughly one percent of all compromised systems) reported the intrusion to the appropriate security or law enforcement agency. Reasons for not reporting range from not knowing where or to whom to report, to being directed not to report out of embarrassment. Commanders are reluctant to report incidents for fear it may negatively affect their careers. The same is true for systems administrators.
Although these statistics are alarming, DoD is proactively identifying and trying to address its systemic deficiencies. Other agencies have no Red Teaming activity or only very limited plans to address their own vulnerabilities.10 The Staff conducted interviews with the computer security personnel at numerous government agencies. Most of these agencies quoted the DISA statistics, but few had conducted their own vulnerability assessments. Many of the computer security personnel interviewed from non-DoD agencies and departments believed Red Teaming was imperative but generally did not have the resources to perform their own assessments.

10 The National Institute of Standards and Technology (NIST) recently received an "innovation" grant of $4 million to establish, in the future, an incident response team within the non-DoD government that would, as part of its duties, conduct vulnerability assessments of government computers. Unfortunately, the response aspect of the team will operate on a "pay as you go" basis, so government agencies will pay for its services out of their own budgets. This may serve as a disincentive for agencies to bring their intrusions to NIST. Further, given the enormous number of computer systems and networks, it is doubtful that the grant will meaningfully address this problem.
Computer security professionals lack the resources to address the systemic problems of network vulnerability. In many government organizations, senior managers typically do not understand and, therefore, cannot acknowledge the vulnerabilities of their information systems. As the government downsizes and the private sector struggles to stay commercially competitive, it is inherently difficult to re-prioritize or re-allocate already scarce resources to a problem that is neither defined nor appreciated. A candid assessment by one mid-level information security professional was that, absent the "smoking keyboard," managers cannot be convinced to make the hard choice of taking resources from other areas or programs and applying them to computer security.
For example, currently in the government there is no Computer Security Specialist Career Field. Personnel are most often assigned the duties of computer security as an additional duty, not as a full-time computer security expert. The additional duty of computer security may be assigned to a non-computer specialist.
Generally, computer security personnel have virtually no computer security experience prior to the assignment and receive very little in the way of computer security training during their tenure. The Staff has found instances of secretaries and administrators being assigned these duties in an office because their computer expertise, although limited, was greater than everyone else's. Often, after two or three years as a computer security specialist, the duty is rotated to another person. This new appointee
will normally not have any background in computer security either. The government continues to rotate these additional duties and completely loses the institutional knowledge it has developed.
Our government has created a climate that is not conducive to fostering security. Clearly, in-depth knowledge and understanding of a very technical subject is a requisite for an information security officer. Unfortunately, specializing in a subject that lacks a career path is a disincentive for employees. Government employees who want to stay in these specialties must either accept little prospect of promotion or move from the government to the private sector, which is willing to reward specialists in this area with much greater monetary compensation. The end result is a brain drain of experts from the government to the private sector, which then turns around and contracts the same experts back to the government at a far greater price than if the government had given them career progression in the first place.
In the law enforcement arena the Staff has observed that almost all law enforcement agencies recruit criminal investigators from within their agency and then try to teach them computer technology. Generally, criminal investigators are assigned to computer crime investigations for a two to three year assignment and not as a permanent career choice. The result is a constant turnover of personnel with little to no corporate knowledge, and a constant pool of investigators with little "computer" expertise.
As with security personnel, a computer crime investigator who is allowed to stay in the specialty may suffer a negative effect on career progression, as law enforcement favors generalists over specialists.
Based on interviews conducted by the Staff with computer security experts from the private sector, the problem is generally the same outside of government as well. Computer security personnel in the private sector generally do not have a strong voice in the corporate and management decisions. In the private sector the computer security experts are at odds with the business leaders of their companies. Generally, the computer security function is buried in the administrative computer support area of a business. The pressure to automate and connect systems almost always takes precedence over the need to protect.
The Staff's own review of a number of federal agencies confirmed many of these vulnerabilities. For example, the Staff requested from various agencies the name of the individual or office in charge of computer security. Most agencies responded that they did not know who that individual was, that they did not know whether such a position existed, or that the responsibility was spread over numerous departments.
For example, the Staff found that the Department of Justice, though concerned about the security of its networks, takes a decentralized approach to organizing computer security. Within DOJ, each component is responsible for its own security. Very few of the components have a full-time security administrator; usually this task is assigned as an additional duty to a secretary within the component. This is partially due to resource constraints. Typically, security administrators are slotted in the range of GS-7 to GS-11. Attracting quality applicants, according to Department officials, therefore becomes a problem. A concern raised by some DOJ officials was that the "pressure to connect" with other networks and the Internet would increase their vulnerabilities.
The lack of clear authority for computer security was particularly acute at the Department of State. A recent Inspector General (IG) audit of the Department's unclassified mainframe security found that the Department had essentially no security plan. As a result, the IG found, the Department was not in a position even to know reliably whether information had been compromised. The IG also found that the lack of senior Department management involvement in addressing authority, responsibility, accountability, and policy for computer security had resulted in incomplete and unreliable security administration.
Inspector General officials also told the Staff that a major threat to the State Department's systems could be from outsourcing computer systems administration to foreign national employees. At foreign posts (with the exception of "critical threat posts"), the Department hires local nationals for computer systems administrators, primarily due to salary constraints. Once hired, these administrators have unlimited access to the post's unclassified computer systems. In Bangkok, for example, the local system administrator designed his own software that embassy employees were using on their computer system. It gave user privileges to everyone regardless of their need for access.
In the Defense Department, the problem of intrusions and attacks into the unclassified but sensitive network is growing with an estimated tens of thousands of successful computer attacks occurring each year.11 While the existence of DISA and its aggressive vulnerability assessment program affirms a level of commitment, a particularly troubling assessment of the Defense Department's treatment of this threat was set forth by the House Committee on National Security in its report on H.R. 3230, the National Defense Authorization Act for FY 1997.
[The] Department is devoting woefully insufficient resources to protecting the Department's information systems.
The problem is a familiar one. Despite widespread recognition of a problem, there are no volunteers to provide funds to correct it. The senior DOD leadership is reluctant to impose a solution to a non-traditional threat, particularly when functional managers and information systems developers present plans that would require funding from outside their own budgets, and therefore entail difficult tradeoffs. In other words, the military services, and the managers of the logistics, medical, personnel, transportation, finance, and other functions within DOD have thus far chosen to maximize capabilities rather than sacrifice capabilities slightly in order to ensure minimum critical requirements are met in wartime conditions.
As a result, over the last two years, the DOD leadership has added only modest resources for information security. The level of funding was not based on a rigorous analysis of requirements, nor were funds limited because advocates failed to make a strong case for additional resources. Rather, the allocation appears to have been determined by the amount of funds that could be easily extracted from the overall budget for command, control, and communications after the normal budget review process.
The potential consequences are that DOD may not be able to generate, deploy, and sustain military forces during a major regional conflict in the event of information warfare attacks on critical support functions controlled by networked computers.

11 The recent GAO report, Information Security: Computer Attacks at Department of Defense Pose Increasing Risks, May 1996, GAO/AIMD-96-84, prepared at the request of Senators Sam Nunn and John Glenn, provides an excellent statement of the challenges confronting the Department of Defense.
The above language may overstate the extent of neglect in the Defense Department. The Staff would observe that in many ways DoD's self-initiated reviews are the reason for our appreciation of their need to address this issue more meaningfully.
In the Hollywood movie The Net, a hacker electronically breaks into the Bethesda Naval Medical Center (BNMC) computer network to access the Secretary of Defense's medical records and change them to reflect that the Secretary was HIV positive. The Staff contacted a senior Bethesda Naval officer to assess BNMC's actual vulnerability. That official indicated that although some management personnel did not see a great priority in securing the Center's medical files, because they could not imagine why anyone would want to break into them, she had conducted her own vulnerability assessment of BNMC's computer system. She found that she, and virtually anyone else, could break into BNMC and access and change the medical records of our government's leaders. Since then, BNMC has aggressively and proactively addressed this vulnerability in its records.
The Staff also interviewed officials with the Federal Aviation Administration (FAA) who stated that they were quite confident their systems were relatively safe from intrusion. This is not, they explained, because they have instituted a healthy security program. Rather, they indicated it is because their aircraft control systems are so antiquated, and consist of so many separate and incompatible systems, that they are more resistant to modern hacking tools. Further, because the current systems, especially power sources, are unreliable, air traffic controllers are prepared to work without computers. Once the FAA upgrades its systems, they will be more vulnerable: first, because the operating systems will be compatible with most other computer systems, including those used by hackers; second, because controllers may become unaccustomed to providing guidance without computer support.
The "pressure" to connect was commonly mentioned by security personnel within government as a great concern and challenge for the future. Several of these professionals were troubled not by current vulnerabilities but by the anticipated vulnerabilities that will come with greater interconnection to the Internet and other networks.