John P. Hayes on the next killer app, entangled states, and the end of Moore's Law.
John P. Hayes teaches and conducts research in the areas of quantum computers, computer-aided design, testing and verification of digital systems, VLSI design, fault-tolerant computing, and embedded computer systems. Since 1982, he has been a professor of EECS at the University of Michigan. Previously he was on the faculty of the University of Southern California. He also worked in industry for several years and has held visiting positions at several universities. Hayes is the founding director of Michigan's Advanced Computer Architecture Laboratory and the author of several books. He recently received an ACM Citation for outstanding contributions to fault-tolerant computer architecture and to logic design and testing.
UBIQUITY: What are you most interested in these days?
HAYES: My interests are broadly in the area of computers and digital circuit design, and computer architecture is a special case of that. I've always been interested in how the hardware (and, to a lesser extent, the software) of computers works. I've pursued this at several levels, from integrated circuits at the lowest level up to system architecture, and in recent years I've been pursuing not only traditional architecture but also computer-aided design techniques and quantum computing. I guess design is the basic theme. A secondary theme that I've always been interested in is reliability and testing. What makes computers function -- or I should say, how do computers go bad? What makes them fail and how do you test them after they've failed?
UBIQUITY: Is there a general answer to the question of what makes computers fail?
HAYES: A general answer is that no matter how reliable the components are, failures will occur, and no matter how good the designers are, design mistakes will be made. The solution is to build enough redundancy into the design to take care of failures, or to have techniques that check the quality of a circuit or system and that detect, diagnose and recover from any failures that do occur. You should assume that failures are inevitable and include something in the design process to account for them.
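One classic embodiment of this design-for-failure philosophy is triple modular redundancy (TMR), where three copies of a module run in parallel and a voter masks any single faulty output. A minimal sketch in Python (the doubling "module" and the fault-injection model are invented here purely for illustration):

```python
import random

def majority_vote(a, b, c):
    """Return the value that at least two of the three replicas agree on."""
    return a if a == b or a == c else b

def tmr(module, x, fault_rate=0.01):
    """Run three replicas of `module` on the same input and vote on the
    outputs, masking any single faulty replica."""
    outputs = []
    for _ in range(3):
        y = module(x)
        if random.random() < fault_rate:  # inject a transient fault into this replica
            y = ~y                        # corrupt its output
        outputs.append(y)
    return majority_vote(*outputs)

# Hypothetical module: doubling a number stands in for any combinational logic.
print(tmr(lambda x: 2 * x, 21))  # almost always 42, even when one replica faults
```

The standard caveats apply: the scheme masks one faulty replica but not two, and it assumes the voter itself does not fail.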
UBIQUITY: Has the nature of the problems changed much with the advent of the Internet?
HAYES: In some respects the problems are constant, despite the very dramatic changes in technology. They all have to do with complexity. In a sense, the underlying technologies keep getting better, but they're being put into more complex systems. So you are now concerned about failures in the Internet involving thousands of computers, whereas 10 or 20 years ago you would have been concerned about failures in the single, relatively unreliable computer that you happened to be using. We're dealing with much bigger systems, so there are more sources of failure. Software is probably a bigger part of the equation at this point: almost anything you do relies on thousands, if not millions, of lines of code, so the potential for failure is very high. Of course, at the same time, even a single processor now contains hundreds of millions of transistors, and the chances of failure there, too, are non-zero. So it's a big and complex problem that hasn't gone away. It's changed somewhat in its nature, but fundamentally, people are still very much concerned about failures, testing and reliability issues in general.
UBIQUITY: Do hardware and software have a dynamic relationship in which one is always getting ahead and the other is always catching up? Some people have said that functions show up first in software and then get embodied in hardware. And sometimes the opposite happens. Is that a true statement?
HAYES: Yes. I think that's a fair statement. There's probably a symbiotic relationship between the two. The long-term trend has been to higher levels of abstraction, both in hardware and software design. Generally, the more complex things have been done in software. So new features and new concepts tend to be implemented in software first, and if they turn out to be very basic and generally useful, they end up -- perhaps years later -- being incorporated into hardware.
UBIQUITY: Do you follow developments in software very carefully?
HAYES: I wouldn't say I follow them very carefully, but I think a hardware designer should be aware of trends in software. One thing that's become very evident in recent years is that the similarity between hardware and software design has increased. Hardware designers now tend to specify their designs in a hardware description language, such as Verilog or VHDL, which is very similar to a programming language. All of the traditional software problems of program verification and compilation have their counterparts in hardware.
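To make the similarity concrete: in a structural HDL description, a circuit is a text of interconnected components, much like nested function calls. The sketch below is plain Python rather than Verilog or VHDL (real HDLs add concurrency, timing and synthesis semantics), but the gate-level full adder it describes maps almost line-for-line onto a structural Verilog module:

```python
# Plain Python, not an HDL: real Verilog/VHDL adds timing, concurrency
# and synthesis semantics. The structure, though, is the same netlist.

def xor_gate(a, b): return a ^ b
def and_gate(a, b): return a & b
def or_gate(a, b):  return a | b

def full_adder(a, b, cin):
    """Gate-level full adder, wired the way a structural Verilog module
    would instantiate and connect the same three gate types."""
    p = xor_gate(a, b)
    s = xor_gate(p, cin)
    cout = or_gate(and_gate(a, b), and_gate(p, cin))
    return s, cout

print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = binary 11
```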
UBIQUITY: At the University of Michigan, electrical engineering and computer science are in one department. Other places have different configurations. Do you have any particular thoughts on what makes the most sense? Is that where computer science ought to be?
HAYES: I like the structure at Michigan where all of computer science and computer engineering are in the same program, which is called the Computer Science and Engineering Division. We share a larger department with electrical engineering. That has its benefits and its drawbacks. The benefit is that it eliminates the fuzzy boundary between hardware and software. It makes it easier to educate students in both hardware and software, and makes it easier to deal with and discuss the similarities between the two of them. On the other hand, there are organizational and management issues that arise when you have a big department with a diverse faculty and student body.
UBIQUITY: Have you been particularly surprised at anything that has developed in hardware in the last five or six years? Is the specialty proceeding at a fairly predictable pace?
HAYES: The pace of hardware development is often characterized by Moore's Law, which in its basic form has predicted very successfully for about 40 years that the complexity of hardware would double roughly every 18 months. In recent years we've had some surprises in the pace at which computer performance has increased: it had been growing at a slow but steady rate, but in the last few years we've seen a dramatic acceleration.
UBIQUITY: You mentioned quantum computing as one of your primary interests. How do you see that developing?
HAYES: It's a totally new paradigm for computing. We never thought of computer science as having a foundation in physics beyond the physical devices that hardware is constructed from. Now we see that classical computer science and computation as such are based on classical physics. There's a different form of computation that can be based on quantum physics. That turns out to be a very rich area, at least from the theoretical viewpoint. It's particularly exciting because -- although it's at a primitive stage of development -- it has a killer application. Namely, there is an algorithm for factoring numbers that is exponentially faster than any known classical algorithm for that problem. Quantum computing raises the possibility of very fast factoring, which has an immediate application in code breaking. If we had fast or practical quantum computers, we could crack all the codes that are in use on the Internet right now. And that would have some interesting consequences.
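To see why factoring matters for codes: an RSA-style public key contains the product n = p * q of two secret primes, and anyone who can factor n can recover the private key. A toy Python illustration (the tiny primes here are for demonstration only; real moduli are 1024 bits or more, where trial division -- and every known classical method -- becomes infeasible, while Shor's quantum algorithm would still run in polynomial time):

```python
def trial_division(n):
    """Naive classical factoring: the loop runs ~sqrt(n) times,
    i.e., time exponential in the bit length of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

# Toy "RSA modulus": in practice p and q are secret primes of 512+ bits.
p, q = 251, 241
n = p * q                 # 60491 -- the public key material
print(trial_division(n))  # (241, 251): the secret primes fall out instantly
```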
UBIQUITY:: Is "interesting" a strong enough word?
HAYES: Well, for sure, the consequences would be very dramatic. As a result, there is a lot of interest now in both the theory and practice of quantum computing. It's extremely difficult to build quantum devices for this type of computation. So, we're still a long way from having anything that would be competitive with more conventional technology.
UBIQUITY: How far off do you think it is? Ten years, perhaps?
HAYES: That's a fair guess. The optimists hope to see quantum processors of some kind within ten years. The pessimists are looking out much further than that. I tend to side with the optimists. The basic phenomena have been demonstrated in the lab, and small prototype quantum machines have been built successfully. Furthermore, we're much closer to having applications in communication than in computation. I think we'll see practical applications of quantum communication circuits within a few years.
UBIQUITY: Let's concentrate for a moment on the code-breaking aspect. Are you suggesting that at some point in the future just about any code in the world could be broken?
HAYES: I'm suggesting that the existing codes that depend specifically on the difficulty of obtaining the prime factors of large numbers would become vulnerable if we had practical devices that could solve the factoring problem quickly. But I wouldn't worry too much about it.
UBIQUITY: Why not?
HAYES: Well, for one thing, it's a long way off. For another, quantum mechanics offers some novel techniques for cryptography that may provide even more secure methods for encoding information. In particular, the phenomenon of entanglement of quantum states is being explored as a technique for a very secure type of communication and that's fairly far advanced.
UBIQUITY: Is there a simple way of explaining what you mean by entanglement of states?
HAYES: Quantum states can be entangled, which means that there's a dependency relation between, say, two bits of information that are entangled. That relationship is independent of their separation in space. So, if two people share a pair of entangled bits, it's possible for them to communicate in ways that are very different from classical communication. Furthermore, if they do communicate and there's an eavesdropper, the eavesdropper has to perform a quantum measurement of some kind, and that has the effect of disturbing the states of the quantities being measured. This goes back to the Heisenberg Uncertainty Principle, which says, roughly speaking, that measuring a quantum state can affect that state. With this kind of arrangement, an eavesdropper on a quantum communication channel can be detected by the people who are communicating. That's an inherent property of the communication mechanism that has no counterpart in classical communication.
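The detection effect Hayes describes can be simulated classically. The sketch below models the prepare-and-measure BB84 protocol rather than an entanglement-based scheme, and reduces a qubit to a (basis, bit) pair, so it is a caricature of the physics; but it shows the same signature: measuring in the wrong basis randomizes the result, so an eavesdropper raises the error rate on matching-basis rounds from roughly 0% to roughly 25%:

```python
import random

def measure(bit, prep_basis, meas_basis):
    """Measuring in the preparation basis returns the bit faithfully; a
    mismatched basis gives a uniformly random outcome (the disturbance
    Hayes describes)."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def qkd_round(eavesdrop):
    """One BB84 round: sender prepares, optional eavesdropper intercepts,
    receiver measures. Bases are '+' (rectilinear) or 'x' (diagonal)."""
    basis_a, bit_a = random.choice("+x"), random.randint(0, 1)
    basis_f, bit_f = basis_a, bit_a        # the qubit "in flight"
    if eavesdrop:
        basis_e = random.choice("+x")      # Eve must pick a basis to measure...
        bit_f = measure(bit_f, basis_f, basis_e)
        basis_f = basis_e                  # ...and re-send, disturbing the state
    basis_b = random.choice("+x")
    bit_b = measure(bit_f, basis_f, basis_b)
    return basis_a, bit_a, basis_b, bit_b

def error_rate(eavesdrop, trials=100_000):
    """Error rate on rounds where sender and receiver happened to use the
    same basis: ~0% without an eavesdropper, ~25% with one."""
    errors = matches = 0
    for _ in range(trials):
        basis_a, bit_a, basis_b, bit_b = qkd_round(eavesdrop)
        if basis_a == basis_b:
            matches += 1
            errors += bit_a != bit_b
    return errors / matches

print(error_rate(False))  # ~0.0  -- clean channel
print(error_rate(True))   # ~0.25 -- eavesdropper leaves a visible signature
```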
UBIQUITY: What would advances like these in quantum computing do to Moore's Law, which is said to be nearly exhausted by now?
HAYES: The projections are that Moore's Law will come to an end maybe in the next 15 or 20 years, although most projections of the demise of Moore's Law have proven to be premature. My guess is that sometime around 2020, transistors will be getting down to the dimensions of an atom. Nobody's quite sure what will happen to transistors at that point. Quantum computing and other forms of so-called nanotechnology are seen as the likely next wave that could take over after the current transistor technology runs out of steam. The reason is that quantum devices operate at the atomic level, so they're inherently very small. If we could fabricate the kinds of circuits that we would like to, they would be essentially at the atomic level. In principle they could achieve component densities that pick up where Moore's Law and conventional technology leave off. They're very good from the viewpoint of high density and high speed. That's one of the motivations for the current interest in quantum technology.
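Hayes's timescale can be sanity-checked with back-of-the-envelope arithmetic. Assuming a 90 nm process around 2004 (both numbers are assumptions for illustration, not from the interview) and density doubling every 18 months, linear feature size shrinks by a factor of √2 per step, reaching a width of a few silicon atoms (roughly 1 nm) in the early 2020s:

```python
import math

feature_nm = 90.0   # assumed process node at the time of the interview
year = 2004.0       # assumed interview date
FEW_ATOMS_NM = 1.0  # a feature a few silicon atoms (~0.2 nm each) wide

# Density doubles every 18 months, so linear dimensions
# shrink by a factor of sqrt(2) every 1.5 years.
while feature_nm > FEW_ATOMS_NM:
    feature_nm /= math.sqrt(2)
    year += 1.5

print(round(year))  # ~2024 -- the same ballpark as Hayes's "around 2020"
```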
UBIQUITY: What will biology contribute to computation?
HAYES: That's another interesting direction, which also offers the possibility of extremely dense computing. At this point, it doesn't offer any non-classical type of technology and it doesn't seem to offer very high speed. But it does offer small size and the possibility of having very dense devices, which again can go beyond what conventional IC technology can do. So there is a lot of interest in various forms of biology-inspired devices. My observations are that these are in the early stage of development, but certainly are interesting and could have a big long-term impact.
UBIQUITY: Explain, if you would, why you regard quantum systems, but not biological systems, as non-traditional.
HAYES: I should qualify what I'm saying. The biological technologies that are being investigated are certainly non-traditional from a physical viewpoint. They are very different from silicon-based circuits, but they're based on classical physics, at least up to this point. What is different about quantum computing is that it's based on non-classical physics, namely quantum mechanics.
UBIQUITY: Talk a minute about software. Is there anything equivalent on the software side to Moore's Law?
HAYES: Well, a common feature of these developments is the movement to more abstract levels of design. One way this is achieved is by the building block approach, where more complex components are used to construct systems. This is driven by the complexity of the design problem, which is common both in hardware and software. If you want to call it a generalization of Moore's Law, there is a fairly steady movement towards increasing levels of complexity. It's been partly enabled by the building block approach where components are designed to be reusable. In a sense, this is nothing new, although it's been getting a lot of attention on the hardware side recently under the name of systems-on-a-chip where there's a strong focus on complex building blocks, which may come from multiple sources and vendors. These are being assembled into even larger systems. The level of abstraction has moved from machine language to assembly language to high-level programming languages and probably now more to application packages, which are yet higher levels of abstraction.
UBIQUITY: If one were looking for a general theory of complexity -- or a general theory of simplicity -- would one more likely find it on the hardware side or the software side? Or would there need to be two general theories?
HAYES: I'm not sure that there's a useful general theory for both sides. If you become too general or too abstract then you lose the specific knowledge and specific techniques that drive hardware on one side and software on the other. I see certain common aspects. One that I find quite striking is the increasing use of hardware description languages, which as I've suggested make hardware design look similar to writing programs in a high level language. The languages are different, but they have many similarities. I think as a result of that, hardware designers need to be familiar with software design techniques and programming, not just so they can write programs that will run in hardware, but so they can actually design the hardware on which traditional software executes.
UBIQUITY: Are intellectual property issues much of a problem in your work?
HAYES: I would say that they do loom as a problem in the hardware design industry, with implications that are technical, legal and commercial. They haven't had much impact on education as yet, but under the heading of system-on-a-chip design, we're starting to see a methodology develop that so far lacks a unifying theory but will probably become a bigger part of the design engineer's work in the future. I think that's going to have some long-term impact. In some ways it will make hardware design closer in spirit to system-level software design.
UBIQUITY: You've been a teacher as well as a researcher. Is there any issue or idea that is particularly important to make sure students understand?
HAYES: I think that theoretical foundations have changed slowly and have continued to be very important. I'm thinking of mathematics, hardware, electric circuit theory and basic digital design concepts. I encourage my students to get a good grounding in fundamentals, because I would argue that that's something that stays with you over the long term. The more applied concepts and ideas tend to change much more rapidly.
UBIQUITY: Do you find that students resist or find it hard to appreciate the necessity of concentrating on fundamentals?
HAYES: I think some find it hard to appreciate, and tend to feel that, as scientists or engineers, they don't need the more theoretical side of the curriculum. But I would argue that it is relevant, and nothing beats a solid foundation in the basics.
UBIQUITY: Thinking back on your own career, has it gone exactly the way you expected it to go when you were a young man in Dublin? Have there been any major surprises?
HAYES: I came to this country because I wanted to pursue the computer field, and at that time there were no opportunities in Ireland. I originally wasn't planning on an academic career, which is why I spent a couple of years in industry, but that experience convinced me that I wanted to pursue academe. I've certainly no regrets about doing that. Have there been major surprises? I think there have been lots of unexpected developments in the field of computer science and in technology. They've made this particular business very interesting. Just about every five to ten years something new and dramatic comes along that sends me scurrying back to the books -- back to the classroom, in a sense, as a student -- to learn new things. I think that makes for an interesting career, and I have no reason to believe that it's going to end anytime soon. Quantum computing is just the latest manifestation of that.
UBIQUITY: Five years from now, will you find something else of interest to you?
HAYES: No doubt I will. And it will be something that, at this point, we probably don't know anything about at all.