Risks in our information infrastructures

Ubiquity, Volume 2000, Issue May (May 1 - May 31, 2000) | BY Peter G. Neumann

The tip of a titanic iceberg is still all that is visible.



Written testimony for the U.S. House Science Committee's Subcommittee on Technology hearing on 10 May 2000, introduced into the record on my behalf by Keith Rhodes of the General Accounting Office.

This is my third round of testimony for your committee on the subject of computer-communication security risks, following earlier testimony in 1997 and 1999. My 1999 testimony was less encouraging than my 1997 testimony, which in turn was less positive than the assessment I gave in 1995 (Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995). To summarize the most recent 12-month period, the situation has in many respects again become even worse -- relative to the increased threats, risks, and system/network vulnerabilities. System security and dependability are still seriously inadequate. Vastly more people are now relying on the Internet, and most of them are oblivious to the risks. Overall, the situation is grave. The commercial marketplace is not leading. The Government is not exerting enough driving force. This is a really ridiculous predicament, and would be a very bad joke if it were not so serious.

A year ago, your committee's concern was with the Melissa e-mail Trojan-horse virus, which spread rather widely among Microsoft Outlook users via Word macros. This year it is the ILOVEYOU e-mail Trojan horse that propagated itself even more widely, using executable Microsoft Outlook scripts. Cost estimates of ILOVEYOU are already in the billions of dollars. Ironically, these attacks required minimal technical sophistication, and copycat attacks have subsequently been essentially trivial to perpetrate. Clearly, any executable content in an e-mail message on a system without significant security precautions represents a threat, and yet people blindly invoke dangerous attachments using underprotected operating systems.

Moral: The e-mail of the PCs is more deadly than the mail (USPS).
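To make the missing precaution concrete, here is a minimal sketch -- assuming a Python mail-gateway hook and a deliberately short, illustrative list of risky extensions, neither of which comes from the testimony itself -- of how incoming messages might be flagged for executable attachments before anyone can blindly open them.

import email

# Illustrative, far from exhaustive, list of attachment extensions
# that indicate executable or script content.
DANGEROUS_EXTENSIONS = (".vbs", ".exe", ".js", ".scr", ".bat", ".com", ".pif")

def flag_executable_attachments(raw_message):
    """Return the filenames of attachments that appear to be executable."""
    msg = email.message_from_string(raw_message)
    suspicious = []
    for part in msg.walk():
        filename = part.get_filename()
        if filename and filename.lower().endswith(DANGEROUS_EXTENSIONS):
            suspicious.append(filename)
    return suspicious

if __name__ == "__main__":
    # The attachment name below is the one actually used by ILOVEYOU;
    # the rest of the message is a contrived example.
    sample = (
        "From: someone@example.com\n"
        "Subject: ILOVEYOU\n"
        "Content-Type: application/octet-stream\n"
        'Content-Disposition: attachment; filename="LOVE-LETTER-FOR-YOU.TXT.vbs"\n'
        "\n"
        "(script body omitted)\n"
    )
    print(flag_executable_attachments(sample))  # ['LOVE-LETTER-FOR-YOU.TXT.vbs']

A filter of this sort is of course no substitute for fixing the underlying systems; it merely illustrates the kind of elementary precaution that is so often absent.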

A fundamental observation is that the underlying technology is riddled with security vulnerabilities, and not enough effort has been devoted to improving the technology. The risks go far beyond what has happened with Melissa and ILOVEYOU, which were relatively crude attacks. Annoying e-mail problems are small peanuts in light of an endemic lack of security and reliability in computer-communication systems overall. It is important to realize that the damage could easily have been much greater -- for example, with massive system crashes that could have hindered subsequent reboots in the absence of extensive human intervention, or coordinated unfriendly-nation attacks on our national infrastructures. Much subtler attacks are also possible that might never be detected until it is much too late, such as planting monitoring Trojan horses, routinely stealing sensitive information, and systematically compromising backups. Considering the widespread deployment of seriously flawed commercial off-the-shelf proprietary software throughout the U.S. Government, Department of Defense, electronic commerce, and elsewhere, the risks of much greater devastation are widely recognized in technology communities. Depending on inherently undependable software is clearly an invitation to disaster: the scale of potential threats is enormous, and unlike anything that we have seen yet. Besides, many brittle software systems seem to fall apart on their own, without any attacks. Also, in all too many cases, development efforts fail before completion (FAA, IRS, etc.).

Moral: Never send a child to do an adult's job. Mission-critical systems need highly disciplined development and operation.

These problems existed long before the Y2K remediation, and need to be addressed urgently. In a real sense, information system insecurity is an even more pervasive problem than Y2K was, and the necessary fixes are more far-reaching, even though there is no fixed deadline as there was with Y2K. Perhaps the nation's biggest challenge in this area is to gain a pervasive awareness of the problems and to create some real incentives for remediation. Many approaches exist in the research and development communities, but unfortunately they are not finding their way into practice. Attackers increasingly have huge leverage: a small amount of effort can cause major problems, including billion-dollar losses by ill-prepared e-business infrastructures and outages in the national infrastructures. It would be prudent for us as a society to invest much more in better protection and detection methods and to make robustness and security top criteria in research, development, and acquisition decisions related to information technology.

Moral: An ounce of prevention is worth a pound of cure.

There is a serious risk in monocultures. When many different systems come from the same developer, they all tend to exhibit the same or similar types of vulnerabilities. In such cases, the devastation from attacks is even greater than in the presence of diverse approaches. But there are also dangers in incompatible multicultures, and thus a culture of cooperative interoperability is highly desirable. The open systems and source-available movements have each shown some promise in this direction. Competition is healthy -- particularly if there are incentives to interoperate. In a monoculture, there are no such incentives.

Moral: Don't put all your eggs in one basket. But don't have too many baskets that have to be coordinated.

Some people believe there are tradeoffs between system features on the one hand and dependable robustness on the other -- for example, security, reliability, and survivability in the face of all sorts of adversities. The mass-market software industry is overly concerned with features; it tends to be long on fancy features and to ignore critical requirements such as rudimentary robustness. However, robust features can be achieved with good design and good programming practice, rather than the business-as-usual practice of sloppy development and a rush-to-market mentality. If automobiles were recalled as often as computer system flaws are detected, we would still have horses and buggies.

Moral: Until software developers and system purveyors are liable for the failures of their products, there will be no real motivation to develop robust systems. It is not just the "hackers" who are at fault.

Despite the increased threats and realistically increased perceptions of risk, there have actually been some advances in the technology. Some people in the U.S. Government have increased their awareness -- for example, as a result of the distributed denial-of-service attacks, various e-mail fiascos, and the Wen Ho Lee and John Deutch cases. The Department of Defense is slowly beginning to take information security more seriously, particularly with respect to public-key infrastructures. The Defense Advanced Research Projects Agency has an important R&D program in information assurance, covering various defensive aspects of information warfare. However, government funding of far-sighted research and development is neither adequate nor directed precisely enough toward high-payoff advances. On the whole, research on dependable critical systems is not finding its way into the marketplace. As a result, the commercial information infrastructures on which we all depend are not adequately sound.

Moral: The long-term costs of not investing properly in the future can totally swamp any short-term savings from not doing the right thing now.

So, what can be done? There are many short-term and long-term steps that need to be taken. Mass-market software developers need to seriously reconsider the lack of security and robustness in their systems and networks; some new approaches are absolutely essential, particularly with respect to system security, authentication, and accountability. The Government needs to revamp its procurement process for systems and networks, insisting on better development practice and better products, and perhaps assessing penalties for noncompliant systems and insisting on liabilities for disasters. The R&D funding process is also in need of improvement, with more well-integrated, coherent long-term efforts. Overall, Congress needs to focus less on short-term legal solutions and more on the long-term problems, whose solutions require technology, education, awareness, and social change.

The Science Committee should be most interested in encouraging further technological improvements that are desperately needed: for example, security and robustness in operating systems, applications, networking, cryptography, authentication, accountability, source-available software that can, through collaborative efforts, be improved more readily than closed-source software, and systems that are easier to administer sensibly. You should also be considering pervasive educational efforts, certification of programmers and companies that develop critical systems, and incentives for better progress -- together with financial disincentives for poor performance.

Moral: Remember the tortoise and the hare. We must have the foresight to develop and procure systems that are needed to meet stringent critical requirements. Planning for the long run is essential.

In contrast, Congress and legislatures need to resist the desire for short-term fixes that in the long run are likely to be counterproductive. One horrible example is the Digital Millennium Copyright Act, which appears to criminalize some fairly commonplace computer security techniques, such as reverse engineering used in an effort to eliminate security problems. Another example is the Uniform Computer Information Transactions Act (UCITA), which is working its way through state legislatures; among other things, it absolves software developers of liability for consequential damages resulting from bad software.

The House Judiciary Committee will undoubtedly continue to seek legal measures. However, given fundamentally deficient systems, legal measures are always going to be limited in their usefulness -- especially considering the difficulties in tracing sophisticated attacks on lame infrastructures mistakenly used for critical applications, and especially when those attacks are launched by perpetrators in unfriendly countries or by suitably masquerading insiders. Laws may provide a few deterrents, but better systems are more fundamental.

Moral: Building more jails and trying to arrest more "hackers" is not an effective strategy for prevention.

My participation in the General Accounting Office Executive Committee on Information Management and Technology has been fascinating. I hope that you will continue to allow the GAO to provide Congress with realistic evaluations of what is wrong, what needs to be done, and how to do it. Their work in the past has been vital. The problems that must be confronted are severe and urgent. As Yogi Berra might once have actually said, "It gets late early."

Expanding on this brief testimony, my Web site http://www.csl.sri.com/neumann contains numerous examples of what has gone wrong in the past and lessons that must be learned, plus detailed considerations of how to develop and configure systems that are substantially more secure, reliable, and survivable. Some of that background may be useful to you and your staff, and I hope you will pursue it. See also http://www.pfir.org for some position statements on relevant issues.




Peter G. Neumann is a Principal Scientist in the Computer Science Laboratory at SRI (where he has been since 1971), concerned with computer system survivability, security, reliability, human safety, and high assurance. In addition to his research and his book, Computer-Related Risks (Addison-Wesley, 1995), his involvement in assessing and responding to risks includes moderating the ACM Risks Forum (comp.risks) and chairing the ACM Committee on Computers and Public Policy. This testimony is available on-line at http://www.csl.sri.com/neumann/house2000.html.
