
Expect the unexpected

Ubiquity, Volume 2001 Issue October, October 1 - October 31, 2001 | BY John Gehl 



Peter G. Neumann talks about out-of-the-box thinking, the events of Sept. 11, and breakfast with Einstein.



Peter G. Neumann ([email protected]), who holds doctorates from Harvard and Darmstadt, is Principal Scientist in the Computer Science Laboratory at SRI International, a not-for-profit research and development institute in Menlo Park, California. After 10 years at Bell Labs in Murray Hill, New Jersey, in the 1960s, he is now in his 31st year in SRI's Computer Science Lab.

Neumann is concerned with computer systems and networks, security, reliability, survivability, safety, and many risks-related issues such as voting-system integrity, crypto policy, and human needs including privacy. He moderates the ACM Risks Forum, edits CACM's monthly Inside Risks column, chairs the ACM Committee on Computers and Public Policy, and co-founded People For Internet Responsibility (http://www.PFIR.org). His book, "Computer-Related Risks," is in its fourth printing.

Neumann is a Fellow of the ACM, IEEE, and AAAS, and is also an SRI Fellow. He is a member of the U.S. General Accounting Office Executive Council on Information Management and Technology, and the National Science Foundation Computer and Information Science and Engineering Advisory Board. He has taught at Stanford, U.C. Berkeley, and the University of Maryland. See his Web site (http://www.csl.sri.com/neumann) for more bio info.


UBIQUITY: Let's start with the events of 11 September. The common expression is that they "changed everything." What are your thoughts on that?

PETER G. NEUMANN: The most frustrating thing for me is that I've been talking about threats and risks and vulnerabilities for 25 or 30 years now, and that so many people were asleep at the wheel. We've been talking for years about risks to the critical infrastructures, including vulnerabilities and risks in aviation and transportation, as well as telecommunications, power distribution, pipelines and government continuity. Those vulnerabilities all continue to exist.

UBIQUITY: Would it be unfair to say that there is such an extraordinary number of conceivable risks that it's impossible to prepare fully for each one of them?

NEUMANN: I think that's part of the problem. There are so many risks that people tend to ignore all of them. In fact, some people like to pretend that risks simply don't exist.

UBIQUITY: Why do they do that?

NEUMANN: If they admitted they had a problem, they'd have to do something about it. This is an old problem in computer security. Way back in 1977, I had a meeting in the Pentagon with somebody who was high up in the Air Force, and I said, "You've got an enormous computer-communication security problem." He said, "There is no recognized security problem in the Air Force." I said, "You're crazy." He said, "No, if I were to recognize it I would have to do something about it, and there's nothing I can do about it." This is the stick-your-head-in-the-sand ostrich phenomenon, where if you stick your head in the sand and pretend that the problem doesn't exist, maybe you'll be lucky and nothing will happen. I think that has changed.

UBIQUITY: What else has changed? Have you changed?

NEUMANN: I haven't changed in the sense that I still do the things that I've always done, which is to try to lead a sane and reasonable life that is detached from the financial greed and, if you will, retaliatory stances of many people in this country. My perspective has not really been altered except for the fact that I realize that the crisis that has been latent is now suddenly very real.

UBIQUITY: How would you comment on the different nature of the security issues involved with (a) insufficiently secure software as compared to (b) attacks from outside that have nothing to do with software?

NEUMANN: To say that software has nothing to do with it is a little naive, in the sense that computers control all aviation systems. The autopilot is computer-controlled. The air traffic control system is completely computer-controlled. In principle, we could conceive of systems that would, for example, take over an airplane and land it automatically in a crisis. I've just written a column with my colleague Lauren Weinstein for the November 2001 Communications of the ACM on "Risks of Panic" (http://www.csl.sri.com/neumann/insiderisks.html). It's really about the risks of believing in technological solutions to non-technological problems.

UBIQUITY: And what is the gist of the article?

NEUMANN: The gist of it is that it is terribly naive to believe that technology is going to solve our world's problems -- poverty, hunger, repressive governments, for example -- because there's much more to it than that. There are many people involved, and people do not follow the assumptions that everybody else might want to attribute to them.

UBIQUITY: Well, the order of analysis needed to cope with security in the age of suicide attackers is certainly different. To use an analogy, if two people are playing chess and person A develops a carefully designed chess strategy, whereas person B doesn't even bother to have a strategy but simply knocks over the board and kills the opponent, that's the kind of risk that is completely beyond the rules of the game.

NEUMANN: That's right. That's out-of-the-box thinking. The trouble is that a lot of folks who are in government or computers or communications think inside the box. The big change is that more people must think outside of the box.

UBIQUITY: We're sure you don't want to condone the targeting of thousands of unsuspecting civilians, so let's turn to some more benign forms of out-of-the-box thinking. In general, how do you help train people to think out of the box?

NEUMANN: I have taught courses that force you to think out of the box. I taught a course at Stanford in '64 that was totally out-of-the-box thinking, and I taught a course twice in 1970-71 at U.C. Berkeley on operating systems that was very much out-of-the-box thinking.

UBIQUITY: What makes them examples of out-of-the-box thinking?

NEUMANN: I don't use textbooks that adhere to a particular narrow solution that says, "If you do step A, step B, and step C, you'll be fine." I force students to think in much more global terms. I taught a course at the University of Maryland in 1999 that was based on a project I did for the Army Research Lab on how to build truly reliable, secure, highly survivable systems and networks. That was clearly one that stepped back and said, "Forget about everything you know about how things are supposed to be. Let's look at how they really ought to be." In my own personal teaching, I've always tried to stimulate out-of-the-box thinking by never letting the students regurgitate what they think they might have learned, but rather by forcing them to think new stuff. The development of Multics (joint with MIT, Bell Labs, and Honeywell, from 1965 to 1969) was certainly an example of radical thinking in computer security. The system that we designed at SRI from 1973 to 1980, called the Provably Secure Operating System, was a radical departure from all previous computer systems. It ultimately led Honeywell into the strongly typed security that they've been doing, which then became the Secure Computing Corporation. Many of the projects that I've been involved with in my 30 years at SRI have been out-of-the-box projects, trying to get away from the stereotypes of where the mass market is going.

UBIQUITY: Can you give an example of one of those projects?

NEUMANN: We did a multilevel secure database management system that was heretical at the time, but now seems like a very logical way to go. I designed, with one of my colleagues, a multilevel secure system in which none of the end user workstations is multilevel secure. It uses highly trusted servers and everything else is not trusted for multilevel security. At the time, it was absolutely radical because it didn't satisfy the standard evaluation criteria of the Department of Defense. Yet it's the only sane way to build that kind of a system. This is traditionally the way all of my research has gone. I'm always looking for far-reaching solutions that are different from the run-of-the-mill stuff.
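[Editor's sketch, not the actual SRI design: a minimal C illustration of the principle Neumann describes, in which the mandatory check lives in one small trusted server, so the single-level workstations need no multilevel trust. The levels, names, and function here are invented for illustration.]

    #include <stdio.h>

    /* Toy clearance levels, ordered from low to high. */
    typedef enum { UNCLASSIFIED, CONFIDENTIAL, SECRET, TOP_SECRET } level_t;

    /* The single trusted component: the server decides whether a
       single-level client may read an object ("no read up").
       Nothing on the workstation side is trusted to enforce this. */
    static int server_may_read(level_t client_level, level_t object_level) {
        return object_level <= client_level;
    }

    int main(void) {
        /* An untrusted SECRET-level workstation requests two objects;
           only the trusted check above decides the outcome. */
        printf("SECRET client, CONFIDENTIAL object: %s\n",
               server_may_read(SECRET, CONFIDENTIAL) ? "allowed" : "denied");
        printf("SECRET client, TOP_SECRET object: %s\n",
               server_may_read(SECRET, TOP_SECRET) ? "allowed" : "denied");
        return 0;
    }

The design point is that only the server-side check needs to be evaluated for multilevel security; everything else can be ordinary, untrusted software.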

UBIQUITY: You are certainly different from the run-of-the-mill professor...

NEUMANN: This could be true.

UBIQUITY: ... so I'm wondering what percentage of this kind of thinking is science and what percentage is art?

NEUMANN: That's a very good question. I think that I always try to find the best research that's out there. If you look at my 200-page Army Research Lab report on survivable systems on my Web site, you'll see I've tried to incorporate all of the good principles, all of the good theory, all of the good past experience, and yet challenge the reader to say, "Don't assume that what you know is complete, because it isn't." Challenge your mind. Step out of the box, if you will, and try to see what new solutions you can come up with that are seriously based on solid software engineering, solid research, and solid theory. Building really good systems that can withstand attacks on their critical requirements, and that don't fall apart on their own simply because they're not robust, is an enormous challenge.

UBIQUITY: What is needed to meet the challenge?

NEUMANN: It requires experience. It requires knowledge. It requires a commitment to a goal that is far in excess of what you get out of the commercial marketplace. Remember, I have worked for a not-for-profit research institute for 30 years. That makes a huge difference because I'm not tied to any particular ideology in the sense of technology or any particular dogma in the sense of the way things have to be, or any particular vendor or any particular protocols. If the established protocols are wrong-headed, then we fix them rather than say, "Oh, well, let's slop it through and maybe it'll be OK."

UBIQUITY: You've taught a lot of fine computer scientists. Are they all up for this job?

NEUMANN: Oh, no. Way back in 1968 there was a paper by Sackman, Erikson, and Grant at the System Development Corporation. This was at the beginning of time-sharing, and they wanted to see whether time-sharing was more effective than batch processing. Time-sharing was very primitive at the time. They set out a bunch of tasks, such as giving a programmer a specification and asking him to write the code, or giving the programmer a piece of code with a bug in it and seeing how long it took him to find the bug. The participants were all Fortran programmers who had a couple of years of experience. The study discovered a 30-to-1 difference in the amount of time it took the fastest programmer versus the slowest programmer to discover the flaw in the program. There was a 28-to-1 difference in the amount of time it took the most skilled of the experienced programmers to write the program satisfying the specification, in comparison with the least gifted programmer. So we're talking about 30-to-1 differences in the competence of the programmers. The same thing applies to competence in terms of computer science theory and in terms of programming practice, even among skilled programmers who have many years of experience.

UBIQUITY: How do you explain the competence gap among programmers? Does it relate back to their education?

NEUMANN: When it comes to students, the average students who come out of an American educational system, or probably also an Indian, Chinese, or Japanese system, are taught to regurgitate the material that has been crammed down their throats. Again, we get back to the out-of-the-box thinking. Most students are taught by teachers who are victims of a purely left-brain educational system -- that is, logical, linear, structured learning where you're trying to memorize all kinds of stuff. There's very little emphasis on right-brain thinking -- the intuitive, the creative, the imaginative -- and, in particular, thinking out of the box.

UBIQUITY: You wrote a paper years ago called "Zen and the Art of Computing." Tell us about it.

NEUMANN: It made the point that, in order to be a truly creative computer person, whether it be a programmer or a designer or a theoretician, you need to have a strong left-brain component and a strong right-brain component, and you need to be able to integrate the two of them. There are many people who are wonderful left-brained folks. There are people who are tremendously undisciplined right-brained folks. What is needed is the discipline to coordinate the two. If you don't have that coordination and that discipline, you're always going to be either building beautiful, elegant little things that might work, or building huge systems that don't work. The challenge, I think, is to balance the creative with the structured. It's extremely rare. I write lots of letters recommending people for tenure or promotion or for job applications, and it is very rare that somebody has that balance.

UBIQUITY: Since modern life is getting so complex, do the Luddites and the know-nothings and the people who want to return to the 11th century have a point in saying, "Technology is the problem"?

NEUMANN: Well, they have a small point, which I would think is overwrought. Technology by itself is not the answer to any problems. You've heard quotes such as "If you think that cryptography is the answer to your problem, you don't understand cryptography and you don't understand your problem." Technology is, in fact, full of risks. It's the folks who believe that technology has all the answers that are a very large part of the problem.

UBIQUITY: What about the folks who believe that technology has all the problems?

NEUMANN: Well, the folks who believe that technology has all the problems are also living in a troglodytic world. If you look at the way things have progressed in the last 30 years or so, computers have become a part of life. There are all kinds of things we can do now that we couldn't do 30 years ago. You can ask, "Are all those things good?" Some of them are, some of them aren't. Technology in itself is not necessarily evil, but there are many applications of it that can be considered evil. Anti-technology by itself is not an answer either. Again, we're back to the balance problem.

UBIQUITY: How do you evaluate your fellow computer scientists? I don't mean at SRI, but in the United States, in the world.

NEUMANN: It gets back to out-of-the-box thinking. There are a great many pressures in academia, for example, to write a lot of papers. So you get a lot of papers on microscopically narrow topics that have no applicability in the real world. The "publish or perish" phenomenon does this to a lot of folks. But you've also got professors who are teaching in their own image. The enlightened professors are the ones who are able to give you a realistic view of the technology and the risks involved. There are a bunch of people like that: Jerry Saltzer at MIT, Virgil Gligor at Maryland, Brian Randell at Newcastle, and various others. There are quite a few around, but they are, sadly, in the minority.

UBIQUITY: You show on your Web site that you take mentoring very seriously. Tell us about some of the mentors who have been important to you.

NEUMANN: Well, that list is rather extraordinary. Different people have contributed wonderfully different things. Part of the reason I'm as broad as I am, I think, has to do with the fact that I had so many different mentors in different areas who, for some reason, took an interest in me and gave me perspectives just as a result of conversations. There are some musical mentors who meant a great deal to me. There are social mentors who helped in the formation of my socioeconomic and political belief systems. There was my English teacher at Rye High School in New York, Marsden Dillenbeck.

UBIQUITY: Pick one who was especially important to you.

NEUMANN: My little vignette with Einstein meant an enormous amount to me. I had a two-hour breakfast with him in November '52. My mother had done a mosaic portrait of him. So, when I was going to Princeton for the Harvard-Princeton football game, she said, "Please call up his secretary, Mrs. Dukas, and see if you can at least meet him." And Mrs. Dukas said, "Oh, he adores your mother. Why don't you come by and have breakfast?" And so we spent two hours talking about everything imaginable -- structure, complexity, and his quote about "everything should be as simple as possible but no simpler." The ability to talk about music and mathematics and physics with him was extraordinary. We were talking about structure and complexity in Bach and Mozart. I loved Brahms at that point in my life, and I still do, for all of the inner voices and cross-rhythms and complexity that still sounds beautiful and simple. I said, "Tell me, what do you think of Brahms?" And Einstein said, "I have never understood Brahms. I believe he was burning the midnight oil trying to be complicated."

UBIQUITY: That's a great quote.

NEUMANN: I love that quote. Here he was, talking about the importance of making things as simple as possible, but no simpler, and here is Brahms, who is making them very complex, where you try to play three against two and four against three, with the cross-rhythms and the inner voices doing something completely different from the main voices. Brahms is complicated, but his music has the appearance of something that's very elegant and beautiful.

UBIQUITY: Just out of curiosity, do you like more modern music?

NEUMANN: Not really. There's some modern music that's fine. Stravinsky is wonderful stuff. But basically I like everything from Gregorian chant to maybe 1950 or so.

UBIQUITY: How do you explain the disconnect?

NEUMANN: I think the disconnect comes from trying so hard to be different from everything that went before it.

UBIQUITY: Do you think that carries over into every kind of human activity?

NEUMANN: I think it does. If you want to write a computer program that's different from everything that's ever been done in the past, you're missing something, because you're going to make new mistakes, and you're going to make all of the old mistakes. There is a great deal to learn from the past. There's a series of quotes from Henry Petroski, whose first book was "To Engineer Is Human"; he points out that people tend never to learn from their successes, but they have a chance to learn from their failures. The whole history of the Risks Forum goes even further. It says that people don't even learn from their failures. They certainly don't learn from their successes, because there are so few real successes, and the successes are never documented well enough that anybody else can learn anything from them. But the failures sometimes get written up. And we keep making the same mistakes over and over again. We're still stuck with buffer overflows. They've been around for four or five decades.
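[Editor's sketch, not from the interview: the classic bug Neumann refers to, in C, alongside the bounded alternative that has been known nearly as long. The function names are invented for illustration.]

    #include <stdio.h>
    #include <string.h>

    /* The decades-old mistake: strcpy() copies until it finds a NUL,
       with no regard for the 16 bytes actually available in buf.
       Calling this with a longer name overwrites adjacent memory. */
    void greet_unsafe(const char *name) {
        char buf[16];
        strcpy(buf, name);                    /* classic buffer overflow */
        printf("Hello, %s\n", buf);
    }

    /* The long-known remedy: bound the copy to the buffer's size.
       snprintf() truncates instead of overflowing. */
    void greet_safe(const char *name) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", name);
        printf("Hello, %s\n", buf);
    }

    int main(void) {
        /* The same oversized input that would corrupt memory in
           greet_unsafe() is merely truncated here. */
        greet_safe("a name far longer than sixteen characters");
        return 0;
    }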

UBIQUITY: Why don't they learn from failure?

NEUMANN: It's a combination of lack of education, lack of teaching, lack of commitment, lack of programming languages that would avoid some of those problems, complexity, and lack of development tools.

UBIQUITY: What would you prescribe for computer science education?

NEUMANN: Several things are missing in computer science education. One is real attention to systems engineering and software engineering. Another is an understanding of security and reliability. A third is awareness of the social implications: privacy and risks and all of the issues that we've been talking about in the Risks Forum for all of these years. We need a broad-based approach that produces students who are not merely regurgitating what they've had crammed down their throats, but rather are able to think out of the box. The fundamental thing here is that there are no easy answers, and that, as a society, we always seem to look for easy answers.

UBIQUITY: What else would you like to say as we end this interview?

NEUMANN: There are other mentors whom I would love to highlight in some ways. My friend Dave Huffman, who died recently, was very special to me. A lot of folks who are still alive and kicking are doing wonderful things, like my friend Fernando Corbató from the Multics days.

UBIQUITY: Who helped you most in learning to think outside the box?

NEUMANN: Well, Einstein said that whenever he was hung up on a problem he would go walk in the woods or play the violin or do something totally different to clear his mind. I've done a lot of tai chi, for example, which has a very similar effect, where you empty the mind and all of a sudden become lucid in ways that you hadn't thought about being before. It produced a huge subconscious jump in my piano technique! Ted Glaser was another out-of-the-box thinker. He had an enormous effect on me in the early days of Multics because, given the fact that he was blind, one had to learn how to communicate with him much more carefully than otherwise. He was very important to me in terms of getting me to think out of the box.

UBIQUITY: Can you give a specific example of how he helped you learn to think in this way?

NEUMANN: One event that I remember: one summer I'd been up in Massachusetts and we had a meeting for Multics at Bell Labs. I flew down with him from MIT to Bell Labs and rented a car. We were driving back to the airport after the meeting, and a guy behind us started honking his horn like crazy and pointing to the rear right wheel of the car. So I pulled over, checked the wheel, and discovered all of the lugs were loose on that wheel. And Ted said, "Check the other wheels." And indeed, all the lugs on all four wheels had been loosened. I tightened them up with a wrench and got back to the airport. While we were waiting in line to turn in the car, a guy was yelling at the car rental agent, saying that the interior roof upholstery of his car had fallen down on him while he was driving. It became obvious that they had a psycho who'd been let go from the company a few days before, and he'd gotten back into the lot and sabotaged cars left and right. So the idea of Ted saying "Check the other wheels" was real out-of-the-box thinking. Your first inclination is, "Well, gee, maybe this was badly mounted. This was an accident on the part of the guy who changed the tire." Ted immediately thought, "Maybe it's more than just an accident." That was 1967, probably. That one event, now that you bring it up, really had an impact. That was, perhaps, one of the things that catalyzed all of my thinking about what could go wrong and trying to, as I say in my book, expect the unexpected.
