Ubiquity

Volume 2011, Number April (2011), Pages 1-6

An interview with Melanie Mitchell: On complexity
Richard T. Snodgrass

Melanie Mitchell, a Professor of Computer Science at Portland State University and an External Professor at the Santa Fe Institute, has written a compelling and engaging book entitled Complexity: A Guided Tour, published in 2009 by Oxford University Press. This book was named by Amazon.com as one of the 10 best science books of 2009. Her research interests include artificial intelligence, machine learning, biologically inspired computing, cognitive science, and complex systems.

[Mitchell received her doctorate in 1990 at the University of Michigan under Douglas Hofstadter. She has been on the faculty of the Santa Fe Institute and the Oregon Graduate Institute; she joined Portland State in 2004.]

In this interview we revisit some of the topics of this book and the research that Melanie is engaged in.

Ubiquity: A simple starting question: What is "complexity"?

Melanie Mitchell: I would call this a "deceptively simple" question—in fact, it is one of the most difficult questions of all! The field of complexity arose out of the strong feeling of some scientists that there are deep similarities among certain highly "complex" systems in nature, society, and technology. Examples of such systems include the brain, the immune system, cells, insect societies, economies, the World Wide Web, and so on. By "similarities," I don't mean that there is necessarily a single set of principles that governs these disparate systems, but rather that all these systems exhibit behavior that has been described as "adaptive," "life-like," "intelligent," and "emergent." None of these terms has a precise meaning yet, which makes a formal definition of "complex system" impossible at this time.

One informal (and somewhat circular) definition of a complex system is the following: A system with large numbers of interacting components, in which the components are relatively simple compared with the system as a whole, in which there is no central control or global communication among the components, and in which the interactions among the components give rise to complex behavior. Here, "complex behavior" refers to the informal terms (e.g., adaptive, emergent) that I listed above.
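
[To make this informal definition concrete, here is a minimal Python sketch; the specific rule (local majority voting on a ring) is an illustrative choice, not something taken from the interview. Many simple agents, each looking only at its own tiny neighborhood and with no central control, settle into contiguous domains of agreement.]

    import random

    N = 60          # number of simple components (agents) arranged on a ring
    STEPS = 30      # number of synchronous update steps

    # Each agent starts in a random binary state; no agent sees the global state.
    state = [random.choice([0, 1]) for _ in range(N)]

    def step(state):
        """Every agent looks only at itself and its two neighbors and adopts
        the local majority. There is no central control or global communication."""
        new_state = []
        for i in range(len(state)):
            neighborhood = state[i - 1] + state[i] + state[(i + 1) % len(state)]
            new_state.append(1 if neighborhood >= 2 else 0)
        return new_state

    for t in range(STEPS):
        print("".join("#" if s else "." for s in state))
        state = step(state)
    # Over time, contiguous blocks of agreement form: a simple stand-in for the
    # kind of decentralized, emergent organization described above.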

Ubiquity: Is there a science of complexity?

MM: I think of the field of complexity as a loose grouping of different disciplines that study such complex systems and seek to elucidate common principles among these systems. More than 100 years ago, the philosopher and psychologist William James said psychology is not yet a science but rather "the hope of a science." I think the same thing can be said today of complexity. I personally try to avoid the term "complexity science" and instead use the term "the sciences of complexity."

Ubiquity: The Santa Fe Institute (SFI) was created in 1984 as a center for the study of complex systems. How did you become involved with this institute?

MM: I was a graduate student in computer science at the University of Michigan, doing a Ph.D. under the supervision of Douglas Hofstadter. I took a course on "genetic algorithms" from Professor John Holland, who was one of the early members of SFI. I became very interested in this area of research, and Holland invited me to spend a summer visiting SFI. I got hooked on the place, and was very interested in finding a way to return there. A year or so after my summer visit, I received my Ph.D. It was then that a position opened to direct SFI's program on "Adaptive Computation." John Holland again recommended me for this position, and I ended up going there full-time and eventually joining the faculty of the Institute.

Ubiquity: What brought you to Portland State University?

MM: At the time I was there, the Santa Fe Institute's faculty positions had a non-renewable five-year term limit. After my term was up, I obtained a faculty position in computer science at the Oregon Graduate Institute in Portland, a city that my husband and I wanted to move to. OGI had recently merged with the Oregon Health and Science University. During my second year at OGI, due to a number of changes in OGI's mission, most of my department, including myself, was invited by Portland State University to move across town to join their CS department. It was a very interesting, unprecedented "merger" of the two departments, and it doubled the size of PSU's CS department.

Ubiquity: We'll get into computer science in a minute, but first, can you summarize the other disciplines that have contributed to the sciences of complexity?

MM: There are really a lot of different disciplines, and the list is growing! I would say nearly all the major branches of science have something to contribute to complex systems research, as do the major branches of social science, as well as history and philosophy. I'm not sure where to draw the boundaries.

Ubiquity: Your book examines in detail the interaction between biology and complexity, making connections between biology and computation. What role does information processing play in living systems?

MM: Answering this question could take up an entire book (and it is one I hope to write someday!). My very short answer is that information processing is an alternative way of describing the behavior of living systems; that is, alternative to what is typically discussed in the biology literature. The framework of information processing, or computation, can help unify some of the disparate features we see in living systems. In Complexity: A Guided Tour, I draw parallels among ant colonies, the immune system, and cellular metabolism by looking at their behavior as "computational"; this allows me to propose some common principles of information processing in these very different biological systems. However, it must be said that the whole notion of "information processing" or "computation" in living systems is still rather fuzzy—many people use such terms to describe biological phenomena, but there is generally no agreed-upon set of definitions or formalisms in this area. So often it's hard to get a clear picture of what people are talking about.

Ubiquity: What does our field of computer science have to offer to this general area?

MM: There are many contributions to be made by computer science. Computer simulation is a central methodology for studying complex systems. High-performance computer modeling tools and methods are absolutely essential for making progress in this area. But, in my opinion, computer science has an even more fundamental role to play by offering formal frameworks for thinking about information processing in nature. A hallmark of complex and adaptive systems is the ability to process information in sophisticated ways. For example, the notion of information processing is being used increasingly in all areas of biology as a framework for understanding adaptive behavior.

I believe that computer science, and more generally, the foundations of computation theory, can eventually offer ways to formalize these as-yet informal and hazy notions of information processing. I personally believe that information processing will come to be seen as one unifying framework for understanding living systems.

Ubiquity: Your doctoral dissertation involved a computer program you wrote that makes analogies. That's pretty neat!

MM: Thank you. This is a topic I am still working on.

Ubiquity: What does making analogies have to do with complexity?

MM: Making analogies is central to most aspects of cognition. I don't have the time or space to elaborate on this much here, but I can recommend my book Analogy-Making as Perception, and Douglas Hofstadter's Fluid Concepts and Creative Analogies, as well as his upcoming book (with Emmanuel Sander), Surfaces and Essences, all of which describe the many ways in which analogy-making resides at the core of intelligence. The relationship with complexity lies in the model Hofstadter and I developed of how people make analogies. This model involves some of the same key adaptive information-processing mechanisms seen in ant colonies, the immune system, and other complex living systems. There is a chapter in my latest book describing this connection.

Ubiquity: Where does the "science of networks" fit?

MM: The science of networks is an attempt to study networks across different disciplines and to identify common principles and methodology. Networks have of course been studied for hundreds of years in different guises: The field of graph theory is devoted to the mathematical study of networks; social networks have been studied by sociologists and social psychologists; technological networks such as the electrical grid and the internet have been studied by engineers; food webs by biologists; genetic regulatory networks by geneticists; and so on. However, for the most part, these various disciplines did not communicate with one another. More recently, people have been finding interesting commonalities among these disparate networks. For example, in the late 1990s, Watts and Strogatz gave a mathematical definition of a network property known as the "small-world" property, and showed that the electrical grid, the neural network of C. elegans (a type of worm), and a social network of actors all shared this property. The small-world property has been studied extensively since then and been found in networks across many disciplines.
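
[A rough illustration of the small-world property, using the Python networkx library; the parameter values below are arbitrary choices for illustration. Rewiring a small fraction of the links in a regular ring lattice sharply shortens typical path lengths while leaving local clustering largely intact.]

    import networkx as nx

    n, k = 1000, 10  # 1,000 nodes, each initially linked to its 10 nearest neighbors

    regular = nx.connected_watts_strogatz_graph(n, k, p=0.0)   # regular ring lattice
    rewired = nx.connected_watts_strogatz_graph(n, k, p=0.1)   # a few links rewired at random

    for name, G in [("regular lattice", regular), ("rewired (p=0.1)", rewired)]:
        print(name,
              "| average path length:", round(nx.average_shortest_path_length(G), 2),
              "| clustering coefficient:", round(nx.average_clustering(G), 3))
    # The rewired network typically has a much shorter average path length while
    # keeping most of the lattice's clustering: the "small-world" property.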

Also in the late 1990s Barabási and Albert put forth the notion of "scale-free networks" (in essence, networks with a fractal structure) and showed that many different natural and technological networks have a scale-free structure. Since many (perhaps all) complex systems can be usefully viewed as networks, in which individual agents (nodes) communicate (are linked) with limited numbers of other agents, the interdisciplinary study of networks has the potential to uncover many insights on complex systems in general.
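
[A small sketch of the Barabási-Albert growth model just mentioned, again with networkx and with arbitrary illustrative sizes: new nodes attach preferentially to already well-connected nodes, and the resulting degree distribution is heavy-tailed rather than falling off exponentially.]

    import collections
    import networkx as nx

    # 10,000 nodes; each new node attaches to 3 existing nodes, preferring high-degree ones.
    G = nx.barabasi_albert_graph(n=10000, m=3)

    degrees = [d for _, d in G.degree()]
    print("max degree:", max(degrees), "  average degree:", sum(degrees) / len(degrees))

    # Count nodes in power-of-two degree bins; in a scale-free network these counts
    # fall off roughly as a power law, so a few "hub" nodes have very high degree.
    bins = collections.Counter()
    for d in degrees:
        b = 1
        while b * 2 <= d:
            b *= 2
        bins[b] += 1
    for b in sorted(bins):
        print(f"degree {b}-{2 * b - 1}: {bins[b]} nodes")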

Complexity: A Guided Tour has several chapters about the content and impact of the science of networks. Barabási's book Linked and Watts' Six Degrees both give good overviews of this emerging science. A more technical introduction is the recent book by Newman, Networks: An Introduction.

Ubiquity: There has been a question of whether "scale-free" applies to the Internet. David Alderson, in a Ubiquity interview from 2009, said that the Internet, and many of its subsystems such as the Google cloud, are purposely engineered to be more resilient to failure than the scale-free model would predict.

MM: Alderson's examples (Traceroute data, Google servers) refer to the Internet (i.e., the collection of servers and communication links), which is quite distinct from my previous example of the World Wide Web (i.e., the logical structure created by hyperlinks), which has been claimed by a number of people to be scale free. But the larger question is valid; in general it is very difficult to determine whether or not a large network is actually "scale free" or has some other structure. To be very precise, the term "scale-free" refers to the mathematical property of having a power-law degree distribution. In the real world, this property can only be satisfied approximately, so there are no completely "scale-free" networks, just as there are no perfect fractals in nature. So the question is: to what approximation can we say the network is scale-free, and is this approximation useful in understanding the network's behavior? This has been the subject of considerable debate in the network science literature, but there have been many empirical studies that support the (approximately) scale-free nature of the World Wide Web, among many other natural and technological networks.
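
[As a sketch of what "approximately scale-free" means in practice, the following Python snippet estimates the exponent of a power-law degree distribution using the approximate maximum-likelihood estimator popularized by Clauset, Shalizi, and Newman. The degree sequence here is synthetic, standing in for degrees measured on a real network such as the Web graph; actual empirical studies use considerably more careful statistics.]

    import math
    import random

    def estimate_alpha(degrees, d_min):
        """Approximate discrete power-law MLE for the tail d >= d_min:
        alpha = 1 + n / sum(ln(d_i / (d_min - 0.5)))."""
        tail = [d for d in degrees if d >= d_min]
        return 1.0 + len(tail) / sum(math.log(d / (d_min - 0.5)) for d in tail)

    # Synthetic degree sequence drawn from an approximate power law with a known
    # exponent, standing in for degrees measured on a real network.
    random.seed(0)
    true_alpha, d_min = 2.3, 5
    degrees = [int(d_min * (1 - random.random()) ** (-1 / (true_alpha - 1)))
               for _ in range(50000)]

    # The estimate should land near the exponent used to generate the data.
    print("estimated exponent:", round(estimate_alpha(degrees, d_min), 2),
          "(generated with exponent", true_alpha, ")")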

Ubiquity: In your opinion, what should the overarching goal be for the sciences of complexity, including computer science?

MM: I think there are two related goals, both quite open-ended.

The first is to discover common principles underlying different complex systems that yield insight into these systems and new methods for analyzing them. One example of a common principle is the notion of scale-free networks, which we just discussed. And as one example of a novel analysis technique, biologists have adapted Google's PageRank algorithm (a computational method that exploits the link structure of the World Wide Web) in order to study the importance of different species in food webs and to better understand extinction risks. (An article on this work can be found at PLoS.)
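
[A toy version of that idea, with networkx: running a PageRank computation over a small, invented food web. The species, the links, and the direction convention are made up for illustration; the published ecological work uses real food-web data and a modified algorithm, so this shows only that the same link-analysis machinery can be applied to ecological networks.]

    import networkx as nx

    # Directed edge A -> B meaning "A is eaten by B" (energy flows from A to B),
    # so PageRank mass flows from prey toward the species that depend on them.
    food_web = nx.DiGraph([
        ("phytoplankton", "zooplankton"),
        ("phytoplankton", "small fish"),
        ("zooplankton", "small fish"),
        ("zooplankton", "jellyfish"),
        ("small fish", "large fish"),
        ("small fish", "seabird"),
        ("large fish", "seal"),
    ])

    ranks = nx.pagerank(food_web)
    for species, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
        print(f"{species:15s} {score:.3f}")
    # Which species come out on top depends on the direction convention chosen;
    # the point is only that link-analysis algorithms transfer to ecological data.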

A second, perhaps more ambitious, goal is to develop a mathematical theory that describes complexity in a general way and that can explain phenomena and make predictions across many different systems. For example, with such a theory it would be possible to nail down, in a formal way, the underlying mechanisms of dynamics, adaptation, collective decision-making and control, and "intelligence" that are shared by insect colonies, economic systems, the brain, and other complex systems. Such a theory would presumably bring together several different strands of theoretical work in dynamical systems theory, computation theory, statistical physics, stochastic processes, control theory, decision theory, and other fields. It is an open question whether such a theory even exists, not to mention what it would look like.

The mathematician Steven Strogatz has termed this goal the quest for a "calculus of complexity." The analogy is apt in some ways: Newton, Leibniz, and others were searching for a general theory of motion that could explain and predict the dynamics of any object or group of objects subject to physical force, whether earthly or celestial. Before Newton's time, individual pieces of such a theory existed (the notions of "infinitesimal," "derivative," and "integral," for example), but no one had figured out how all these pieces fit together to produce a completely general theory that explained a huge range of phenomena that were previously not unified. Similarly, today we have many different pieces of theory related to complex systems, but no one has yet determined how to put them all together to create something more general and unifying.

Ubiquity: Do you think such a calculus of complexity will in fact emerge?

MM: A general theory like this was the holy grail of the cybernetics movement of the 1940s and 1950s; in many ways the current field of complex systems is an intellectual descendant of that movement. Some critics believe that complex systems research will suffer a fate similar to that of cybernetics: the inability to go beyond suggestive metaphors and disparate pieces to a more rigorous and useful framework. I'm personally more optimistic, but who knows... Complexity is still a young field with lots of potential for new and innovative developments, and I hope it will continue to attract some of the most creative young scientists around!

Author

Richard Snodgrass, an Associate Editor for Ubiquity, is a Professor of Computer Science at the University of Arizona, working in the areas of ergalics and temporal databases.

Footnotes

DOI: http://doi.acm.org/10.1145/1967045.1967047

©2011 ACM  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2011 ACM, Inc.
