Complexity and robustness

From almost every perspective, global interconnectivity is on the rise. Satellite networks and communication systems allow essentially instantaneous access to data and images worldwide; fluctuations in Tokyo's financial markets are felt in New York and London; and overpopulation, pollution, deforestation, and global warming are widely recognized as everybody's problem. Social, economic, technological, and environmental systems are simultaneously becoming increasingly entangled and interconnected. This globalization offers potential benefits, including a higher standard of living, increased access to information, more sophisticated healthcare, and economic incentives to protect environmentally sensitive areas. Simultaneously, it carries potentially catastrophic risks associated with cascading failures, which may propagate rapidly when systems are strongly interdependent and densely connected. Examples include delays in transportation and communications systems, as well as errant policies or technological blunders that initiate a whirlwind of economic, political, and environmental consequences.

Of course, engineers and policy makers work extremely hard to design systems that will not break down, despite a great deal of uncertainty in the environment in which they operate and in the components from which they are built. Few of us would get on an airplane if this were not the case. At the same time, we accept the fact that there are no guarantees. Similarly, biological evolution favors organisms that are tolerant to common variations in weather and nutrients, while selective pressure does little to protect advanced organisms against rare disturbances like large meteor impacts. The robustness architectures that stabilize systems and minimize the propagation of damage are so dominant and pervasive that we often take them for granted. At their best, advanced technologies and organisms combine complicated internal networks and feedback loops with sloppy parts to yield systems so robust that they create the illusion of very simple, reliable, and consistent behavior, apparently unperturbed by the environment. Nonetheless, failures occur, and with increased connectivity comes the potential for increased social, economic, and environmental cost. Can anything be done to predict these events and mitigate their damage?

A key insight comes from the observation that robustness involves tradeoffs among a broad spectrum of environmental influences that may initiate cascading failure events. Indeed, robustness can be viewed as the underlying mechanism leading to complexity. Complexity is not simply the number of component parts, but also the heterogeneity of the components and their organization into intricate networks. Robustness architectures dominate the genomes of most organisms and the distinct-part counts of modern technologies, and they provide opportunities for higher fitness and/or performance because of the stability they create. This is the essence of Highly Optimized Tolerance (HOT), the theoretical framework linking robustness and complexity which I introduced recently with my collaborator, John Doyle. HOT describes systems fine-tuned for high performance despite uncertainties in the environment and in their components. Its essential features are highly specialized, structured, hierarchical internal configurations and robust, yet fragile external behavior.

Fragilities arise when the very architectures that lead to robustness under one set of circumstances backfire, leading to extreme sensitivity in other cases. One example is the automobile airbag, which protects passengers in high-speed, head-on collisions, but poses a danger to small children riding in the front seat, even under accidental low-speed or stationary deployment. Another is the immune system, which protects people from common colds and viruses, but occasionally backfires, attacking an individual's own cells in autoimmune diseases. Such failure modes represent hypersensitivities that are intrinsically coupled to the robustness mechanism itself. While overall higher performance is achieved when the robustness architecture is included, under rare circumstances the outcome is worse than it would be if the architecture were not there at all. In many cases this eventually leads to additional layers of complexity, as in the case of increasingly sophisticated airbag technologies involving sensors to estimate passenger size. Iteration of this process culminates in a complexity spiral, in which new features are developed to mitigate sensitivities associated with previous layers. Since robustness is created by very specific internal structures, when any of these systems is disassembled there is very little latitude in reassembly if a working system is expected. Even the rare cascading failure that is the fragile side of HOT complexity reveals only a limited glimpse of a system's internal architecture. Nonetheless, studying a system's robustness mechanisms provides a potential pathway to anticipating its fragilities.

HOT was developed initially using the models of statistical physics, modified to include a primitive form of robust design. Imagine your goal is to design a toy forest on a checkerboard landscape, where occupied sites correspond to trees and vacancies correspond to firebreaks. Occasionally a spark hits the forest due, e.g., to lightning, striking some locations more frequently than others. If it hits a tree, the fire burns through a connected cluster of neighboring trees. If your goal is to maximize the surviving yield of trees, what is the optimal strategy? While this is not a realistic model of forest management, it serves to illustrate the basic tradeoff between maximizing functionality under ideal circumstances (represented by high tree densities) and the need to devote resources (space) to robustness architectures which protect the system from a spectrum of disturbances. HOT configurations consist of compact, high-density, cellular patterns of contiguous trees, separated by efficient, linear firebreaks. Optimal patterns are much more robust to fires than random configurations at similar densities, but are also extremely sensitive to changes in the spatial distribution of sparks, or to flaws in the firebreak patterns. Under many circumstances, the robust, yet fragile nature of HOT systems leads to heavy tails, or power law statistics, in the frequency vs. size distribution of failure events. HOT has been successful in quantitatively describing statistics of forest fires, World Wide Web traffic, and electrical power outages. Heavy tails reflect tradeoffs in systems characterized by high densities and high throughputs, where many internal variables are tuned to favor small losses in common events, at the expense of large losses when the system is subject to rare or unexpected perturbations, even if the perturbations themselves are infinitesimal.
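To make the tradeoff concrete, the following is a minimal sketch of the toy model, not the actual code used in this research: the lattice size, the exponential spark distribution, and the regular grid of firebreaks are all illustrative assumptions standing in for the genuinely optimized configurations.

    import numpy as np
    from scipy.ndimage import label

    def spark_distribution(n):
        """Spark probability peaked near one corner (assumed exponential falloff)."""
        x = np.exp(-np.arange(n) / (n / 5.0))
        p = np.outer(x, x)
        return p / p.sum()

    def expected_yield(forest, sparks):
        """Expected surviving tree density after a single spark.

        A spark landing on a tree burns its entire (4-connected) cluster;
        a spark landing on a firebreak burns nothing.
        """
        clusters, num = label(forest)                 # connected clusters of trees
        burned = 0.0
        for k in range(1, num + 1):
            mask = clusters == k
            burned += sparks[mask].sum() * mask.sum() # P(hit) x cluster size
        return (forest.sum() - burned) / forest.size

    n = 32
    sparks = spark_distribution(n)
    rng = np.random.default_rng(0)

    # Random planting at 90% density vs. a crude "designed" forest with
    # straight firebreaks every 8 rows and columns (a naive stand-in for
    # the optimized cellular patterns described above).
    random_forest = rng.random((n, n)) < 0.9
    designed = np.ones((n, n), dtype=bool)
    designed[::8, :] = False
    designed[:, ::8] = False

    print("random   yield:", expected_yield(random_forest, sparks))
    print("designed yield:", expected_yield(designed, sparks))

Even this naive grid of firebreaks beats the random forest: at 90% density a random planting almost surely contains one giant connected cluster, so a single spark destroys most of the trees. A true HOT design goes further, placing large cells where sparks are rare and tight firebreaks where they are common, and that specialization is precisely what makes it fragile to a shifted spark distribution.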

The research plan I developed in response to the James S. McDonnell Foundation's 21st Century Research Award for Studying Complex Systems extends the HOT framework linking complexity and robustness, and pursues applications in ecology, biology, engineering, and business. My objectives are loosely divided into four overlapping focus areas: (1) theoretical foundations and unifying themes, based on merging the perspectives of statistical physics and systems-oriented mathematics from engineering; (2) applications in ecology and evolutionary biology aimed at exploring robustness in scenarios relevant to these fields; (3) development of models for robust network flow structure and evolution, with applications to the Internet, financial networks, and food webs; and (4) detailed investigations of robustness, evolution, and human intervention in disturbance-prone ecosystems, focusing on fires on terrestrial landscapes.

HOT is motivated by biology and engineering, and builds on the mathematics of control, communications, and computing. While physics focuses primarily on universal properties of generic collections of isolated systems, control theory studies specific, often highly stylized systems in terms of their input vs. output characteristics. Control theory provides a mathematical framework for describing systems that are coupled to other systems, identifying sensitivities, and systematically determining the important internal variables for a system with particular objectives immersed in a variable environment. Suitably generalized, these techniques promise to be powerful, although their consequences outside the controls community are currently largely unexplored. HOT provides an appealing base for the development of a broad framework for characterizing complex systems. Questions related to robustness, predictability, verifiability, and evolvability arise in a wide range of disciplines, and they demand sharper definitions and new tools for analysis. If complex systems are intrinsically composed of extremely heterogeneous collections of objects, combined into intricate, highly structured networks with hierarchies and multiple scales, then HOT provides a means to develop common ground among the models, methods, and abstractions developed in different domains.
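One classical result illustrates the kind of hard tradeoff control theory can expose (this standard textbook example is offered here for illustration, not as part of the proposal): Bode's sensitivity integral, which says that feedback cannot reduce sensitivity everywhere.

    % Bode's sensitivity integral, for a loop transfer function L(s) = P(s)C(s)
    % that is open-loop stable and rolls off faster than 1/s at high frequency.
    % The sensitivity S(s) = 1/(1 + L(s)) measures how disturbances at each
    % frequency propagate to the output.
    \[
        \int_0^{\infty} \ln \left| S(j\omega) \right| \, d\omega = 0
    \]
    % Attenuation in one frequency band (|S| < 1, the robust regime) must be
    % paid for by amplification in another (|S| > 1): robust, yet fragile.

Suppressing disturbances where they are common concentrates sensitivity where they are rare; this conservation of fragility is the signature that HOT generalizes beyond feedback loops.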

Developing models of varying resolution, ranging from the tractable models on which the basic HOT framework is built to complex, domain-specific application models for technological, economic, ecological, and biological systems, will connect our new tools with real-world applications and inspire new questions for theoretical consideration. The real test of a general framework for understanding complex systems is the extent to which it can provide new insights that generate future technologies, policies, medicines, etc. This work has begun on several fronts and involves a spectrum of interdisciplinary collaborations. For the case of forest fires, with collaborators in geography, I have developed a sophisticated new fire regime simulation environment to investigate fundamental properties of fire dynamics and the long-term effects of evolution and suppression on terrestrial landscapes. This work provides a fundamentally new perspective on the dynamics of disturbance-prone ecosystems and how humans interact with them. The findings will be of significance to science and management, as human disruption of natural disturbance regimes is widespread and increasing.