

On the emerging future of complexity sciences

Ubiquity, Volume 2006 Issue March | BY Kemal A. Delic , Ralph Dum 



Complexity as a phenomenon is omnipresent in natural, social, business, artificial, engineered or hybrid systems. Cells, organisms, the ecosystem, companies, supply networks, markets, societies, governments, cities, regions, countries, large-scale software and hardware systems, the Internet - all are examples of complex systems. Despite this omnipresence, there is no commonly accepted, crisp and robust definition or classification of complex systems, and one might ask why we would expect commonalities among such systems despite their obvious differences.

Figure 1. Complexity omnipresent in all areas

Broadly speaking, complex systems consist of a large number of heterogeneous, highly interacting components (parts, agents, humans, etc.). These interactions result in highly non-linear behavior, and these systems often evolve, adapt, and exhibit learning behaviors.

Emergence and unpredictability are hallmarks of such systems. Emergence is in essence the acknowledgment that systems as diverse as economies, cells, or ant colonies cannot be characterized by the behavior of their individual components - humans, chemicals, ants - but only by the higher-level organizations that grow out of them. One consequence is that such systems will be hard to control microscopically by using rules of simple deduction or linear reasoning.

Complexity research tries to identify general principles of emergent organization common to such systems across diverse areas, to understand the organizational structure of these systems in a coherent, possibly compact and rigorous way, and ultimately to simulate and optimize their behaviors.

Two Schools of Thought on Complexity

There is traditionally a dichotomy between the natural sciences - based on measurements and analytics - and engineering - based on making things work. Scientists look into complex systems across a variety of disciplines and problem areas trying to understand and elucidate general characteristics and concepts common to such systems, while engineers build and design working artificial complex systems. They deal practically with them, but often based on crude assumptions about their characteristics. This has led to different schools of thought that have their own language and priorities: one looks into complexity as an emerging phenomenon to be understood, while the other looks into complexity as an engineering problem to be tackled.

Recently, however, we see a convergence of these two schools. Increasingly, scientists study man-made systems in an attempt to apply models and concepts to artificial systems while engineers increasingly try to model the systems they build in an attempt to make them amenable to more detailed analysis.

Complexity research mainly happens at the borders between various disciplines and thrives on interactions between engineering and the sciences, thus creating unique but still fragile bridges. Indeed, the most common trigger of complexity is the encounter of natural/living systems with artificial, man-made systems. Our guess here is that ideas born out of complexity research will culminate in hundreds of practical applications, slowly articulating some major technologies on which mature businesses will thrive beyond the next decade or so.

Historical Prologue to Present Research

To set the stage for guessing about the future, we start with a short chronological, anecdotal and subjective history of complex systems research. It started approximately as the war-related work on large-scale system optimization and intensive simulations in nuclear research. Practical needs and problems evolved into academic work engaging some of the most brilliant scientists of that time.

The first academic paper on complexity sciences may be that of Weaver [Weaver48], pointing at different stages of research spanning 350 years, moving from 'simplicity' through 'disorganized complexity' to 'organized complexity'. Low-dimensionality problems (one or two variables) of classical mechanics represent the 'simplicity' era, from which several sciences were born and technologies deployed. Statistical mechanics covers 'disorganized complexity', in which high-dimensionality problems (two billion variables) are captured within a probabilistic framework. Finally, 'organized complexity' represents a middle region, often occurring in hybrids of living and man-made systems. Aspects of control, communication and adaptation in various systems were discussed early on by Ashby and Wiener [Ashby/Wiener 1956/1961].

Later, Herbert Simon, Nobel Prize winner in economics, published "The Architecture of Complexity" [Simon62]. He points at hierarchy as the distinctive structural feature of complex systems and at the property of "near decomposability" that simplifies the description of complex systems. P.W. Anderson, in his famous paper 'More is different' [Anderson72], emphasizes the role of emergence of global properties that cannot be directly deduced from microscopic properties of components. M. Gell-Mann contributed to the development via his paper 'What is complexity?' [Gell-Mann, 1995]. Murray Gell-Mann and P.W. Anderson were cofounders of the Santa Fe Institute [Santa Fe'84] in 1984, together with the economist Kenneth Arrow; this was certainly a milestone in the development of a science of complexity. One branch of complexity research deals with cellular automata as a model and paradigm of complex system behaviors [Wolfram 1988]. Closely related to this is Kolmogorov's quantitative measure of complexity, expressed as the length of the shortest (effective) description of an object or phenomenon.
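To give a concrete flavor of the cellular-automaton paradigm mentioned above, here is a minimal sketch (our own illustration, not code from the cited work) of Wolfram's elementary Rule 110, where an extremely simple local update rule generates strikingly complex global patterns:

```python
# Elementary cellular automaton (Wolfram's Rule 110): each cell's next
# state depends only on itself and its two neighbors, yet the global
# pattern that emerges is famously complex.

def step(cells, rule=110):
    """Apply one update of the given elementary rule to a ring of cells."""
    n = len(cells)
    return [
        # Encode the (left, center, right) neighborhood as a 3-bit number
        # and look up the corresponding bit of the rule number.
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single "on" cell and let the pattern unfold.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Each printed row is one generation; despite the three-cell update rule, the resulting triangle of patterns is irregular enough that Rule 110 is known to be computationally universal.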

Axelrod's work on the onset of cooperation in Prisoner's Dilemma-type games is a first hint at the usefulness of a complexity approach to business and the economy (see "Harnessing Complexity" [Axelrod, 2000]). Brian Arthur [Arthur, 1999] developed a theory of innovation based on positive feedback that grasps the nonlinearity of business cycles and marked a radical change from (equilibrium-based) classical economics.
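The mechanism behind Axelrod's observation can be sketched in a few lines (a toy illustration under our own assumptions, not Axelrod's actual tournament code): in the iterated Prisoner's Dilemma, a simple reciprocating strategy such as tit-for-tat sustains cooperation against itself while limiting its losses against a defector.

```python
# Iterated Prisoner's Dilemma with the standard payoff matrix.
# 'C' = cooperate, 'D' = defect; payoffs are (row player, column player).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    """Return the total scores of two strategies over repeated rounds."""
    score_a = score_b = 0
    hist_a, hist_b = [], []  # each strategy sees the opponent's past moves
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_b), strategy_b(hist_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b
```

Over 10 rounds, two tit-for-tat players earn 30 points each through sustained cooperation, while tit-for-tat loses only the first round against an unconditional defector; cooperation thus emerges from repeated interaction without any central coordination.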

The Ascent of Complexity Research in the IT Era

The rise of the Internet sparked increased interest in complexity research, with the emergence of very large-scale information infrastructures and service systems satisfying the diverse needs of a large number of users (best known are Google, eBay, Amazon, and Yahoo) and entangling technology, business, society and us, as users, citizens, and customers. Tomorrow's technologies and businesses will therefore have to be built on a deep understanding of the interaction between these different components and on a system view of the whole.

Figure 2. An Instance of Business Complexity

The dramatic drop in cost/performance ratios brings petaflop computing, petabyte storage and petabit networking into reach of even small start-up companies (grid on-demand services, desktop clusters, the Virginia grid, etc.). These systems are capturing, creating, digesting and managing torrents of data and enable us to study profoundly the behaviors of business and even social systems from the micro to the macro level [Huberman 1999]. In science as well, large-scale experiments (see e.g. the Human Genome Project) are now creating more data than we can realistically digest and interpret. The creation and calibration of accurate models will represent a major challenge in which methods from complex systems research will play a major role.

All this has led to a surge in funding by public research bodies, several private companies and academic institutions. The European Commission [FET03] and several national funding agencies set up funding programmes and centers of excellence that are thriving in Europe (for instance the ISI Foundation in Turin and Collegium Budapest as the most established ones). Another sign of the surge in interest is that the second European Conference on Complex Systems (ECCS) [ECSS05] in Paris attracted nearly 500 people.

Likely Future Developments

Exploratory research often starts by trying to cover as much territory as possible (as all early explorers and pioneers do). This leads to a risk of vagueness and lack of rigor and focus, but research will be narrowed and better focused as first results identify the most promising threads.

Figure 3. Meta Model: Roadmap of the Future

In a next step, the creation of a critical mass of publications, increased patent filings, and the presence of venture capital (VC) will help research enter maturity by allowing benchmarking of results and by further stimulating long-term research into general concepts in complex systems.

Compared with the time necessary to develop a new set of technologies from exploratory research to mature major business, we estimate that about seven years is the average duration of each characteristic phase (see Figure 3), based on observation of the time it took PCs, the Internet and mobile phones to mature from invention to ubiquity. This brings us back to 1984, when the Santa Fe Institute was founded, as the triggering milestone of articulated interest in complexity research. It may mean that we can expect some future technology to be mature around 10 years from now. This meta model of the future is based on the observed cyclical nature of several technologies and is correct in principle but inaccurate in details. Thus, we may err about the duration of certain phases or the number of principal technologies developed, but in the long run this model will fit reality.

We can liken current complexity research to past developments in Artificial Intelligence (AI). Someone compared the work of AI researchers to that of medieval alchemists who, trying to turn mud into gold, never succeeded in their original goal but in the process produced many useful chemical devices and advanced chemical processes. Indeed, AI was preceded by a centuries-long quest for mechanical life (musical and mechanical automata), provoking huge debates and controversies about its impossible objective (to imitate human intelligence). While still aiming at 'the impossible', AI researchers have created a variety of technologies humming in critical applications (credit ratings, medical diagnosis, embedded car and industrial robotic systems, to name a few). Some of the most advanced technologies today originated in AI research and, curiously enough, several Chief Scientists and Technologists of major new-wave companies have a deep AI background.


Complexity research will never become a single, encompassing theory-of-everything, or an independent discipline. It will thrive at the border between disciplines and in particular by interacting with engineering (thus approaching the 'science of the artificial' that Herbert Simon was promoting) and it will surely create several seed technologies.

By applying the above meta model, we believe that we are approaching the practical-applications phase, as more and more companies apply ideas from complex systems and small companies have been born out of complex systems research (in particular from the Santa Fe Institute). While the initial focus of applications was on financial services and bioinformatics as logical choices, there are more areas where complex systems research could have an impact. A well-documented history or benchmarking of these first applications and companies grown out of CS research could serve as instructive reading for subsequent (even more successful) attempts.

In our view, we need (at least) two essential ingredients for the success of CS research: (1) a fertile environment, which will attract brilliant minds from a wide variety of disciplines, and (2) multidisciplinary approaches to science and engineering. As an example, we point to the overall regional, cultural and economic context which made Silicon Valley and Route 128 famous for creating the very particular climate at the origin of blossoming trillion-dollar industries that thrive on reinvesting into further research.

This is somehow reminiscent of the Renaissance in Tuscany, which created an incredible volume of science, art and wealth. Of that period, Leonardo da Vinci was probably the last universal mind able to span multiple skills and disciplines as a scientist, artist and engineer. As a careful observer and interpreter of natural phenomena, he concluded five centuries ago that all that we know, we have learned from Nature [Note 2]. This is probably the best advice for the future generation of scientists who will face the great challenges of complexity.


[Weaver] - 1948 - W. Weaver, "Science and Complexity," American Scientist 36.

[Ashby/Wiener] - 1956/1961 - W. Ross Ashby, An Introduction to Cybernetics; Norbert Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine.

[Simon] - 1962 - H. A. Simon, "The Architecture of Complexity: Hierarchic Systems."

[Anderson] - 1972 - P. W. Anderson, "More Is Different," Science 177, 393.

[SantaFe] - 1984 - Foundation of the Santa Fe Institute, Santa Fe, NM.

[Wolfram] - 1988 - S. Wolfram, "Complex Systems Theory."

[Gell-Mann] - 1995 - M. Gell-Mann, "What Is Complexity?"

[Huberman/Adamic] - 1999 - B. A. Huberman and L. A. Adamic, "Growth Dynamics of the World-Wide Web," Nature.

[Brian Arthur] - 1999 - W. Brian Arthur, "Complexity and the Economy," Science.

[Axelrod] - 2000 - R. Axelrod and M. D. Cohen, Harnessing Complexity, Basic Books.

[ECSS05] - 2005 - Second European Conference on Complex Systems (ECCS'05), Paris.


1. A. Complexity is what stays after we exhaust all possible simplifications.
B. Complexity is an order of which we ignore the underlying mechanism: an order that exists in the absence of a blueprint; the mechanisms leading to this type of order are collaborative.
C. "Complexity science" looks for the simple causes of complex behaviors.
D. "Complexity research" investigates complexity as apparent simplicity.

2. "Those who took inspiration from other than Nature, master of masters, were laboring in vain." ("Quegli che pigliavano per altore altro che la natura maestra de' maestri s'affaticavano invano.") Leonardo da Vinci, Trattato della pittura, c. 1500.

About the Author
Kemal Delic is a lab scientist with Hewlett-Packard's R&D operations and a senior enterprise architect. His main interests are in enterprise architecture, analytics technologies and innovation.

Ralph Dum is a physicist with main interests in cold-matter physics, quantum computing, and complex systems research. He works as a programme manager on future and emerging technologies at the European Commission.

Source: Ubiquity Volume 7, Issue 10 (March 14 - March 20, 2006)

