Uncertain Times
By Jessica Flack, Melanie Mitchell
The pandemic is an unprecedented opportunity—seeing human society as a complex system opens a better future for us all.
We’re at a unique moment in the 200,000 years or so that Homo sapiens have walked the Earth. For the first time in that long history, humans are capable of coordinating on a global scale, using fine-grained data on individual behavior, to design robust and adaptable social systems. The pandemic of 2019–2020 has brought home this potential.
Never before has there been a collective, empirically informed response of the magnitude that COVID-19 has demanded. Yes, the response has been ambivalent, uneven, and chaotic—we are fumbling in low light, but it’s the low light of dawn.
At this historical juncture, we might acknowledge and exploit the fact that we live in a complex system—a system with many interacting agents, whose collective behavior is usually hard to predict. Understanding the key properties of complex systems can help us clarify and deal with many new and existing global challenges, from pandemics to poverty and ecological collapse.
In complex systems, the last thing that happened is almost never informative about what’s coming next. The world is always changing—partly because of factors outside our control and partly because of our own interventions. The linear thinking of simple cause-effect reasoning, to which the human mind can default, is not a good policy tool. Instead, living in a complex system requires us to embrace and even harness uncertainty. Instead of attempting to narrowly forecast and control outcomes, we need to design systems that are robust and adaptable enough to weather a wide range of possible futures.
Think of hundreds of fireflies flashing together on a summer’s evening. How does that happen? A firefly’s decision to flash is thought to depend on the flashing of its neighbors. In her book Patterns of Culture (1934), anthropologist Ruth Benedict argues that each part of a social system depends on its other parts in circuitous ways. Not only are such systems nonlinear—the whole is more than the sum of the parts—but the behavior of the parts themselves depends on the behavior of the whole.
Like swarms of fireflies, all human societies are collective and coupled. Collective, meaning it is our combined behavior that gives rise to society-wide effects. Coupled, in that our perceptions and behavior depend on the perceptions and behavior of others, and on the social and economic structures we collectively build. As consumers, we note a shortage of toilet paper at the supermarket, so we hoard it, and then milk, eggs, and flour, too. We see our neighbors wearing masks, so we put on masks as well. Traders in markets panic upon perceiving a downward trend, follow the herd, and end up causing the precipitous drop they fear.
These examples capture how the collective results of our actions feed back, in both virtuous and vicious circles, to affect the system in its entirety—reinforcing or changing the patterns we initially perceived, often in nonobvious ways. For instance, some coronavirus contact-tracing apps can inform users of the locations of infected persons so they can be avoided. This kind of coupling between local behavior and society-wide information is appealing because it seems to simplify decision-making for busy individuals. Yet we know from many years of work, from researchers such as Cornell mathematician Steven Strogatz, on swarming and synchronicity—think of the flashing fireflies—that the dynamics of coupled systems can be surprising.
A recent study by Jitesh Jhawar of the Indian Institute of Science and his colleagues, published in Nature Physics, found that transitions to orderly states such as schooling in fish (all fish swimming in the same direction) can be caused, paradoxically, by randomness, or “noise,” feeding back on itself. That is, a misalignment among the fish causes further misalignment, eventually inducing a transition to schooling. Most of us wouldn’t guess that noise can produce predictable behavior. The result invites us to consider how technology such as contact-tracing apps, although informing us locally, might negatively affect our collective movement. If each of us changes our behavior to avoid the infected, we might generate a collective pattern we had aimed to avoid: higher levels of interaction between the infected and the susceptible, or high levels of interaction among the asymptomatic.
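To get a feel for how noise can feed back on itself and produce order, here is a toy sketch in the spirit of, but far simpler than, the model in the Jhawar study: each fish either copies the heading of a randomly chosen schoolmate or, occasionally, switches heading at random. The code, its parameters, and the alignment threshold are our own illustrative choices, not the published model.

```python
import random

def polarization(headings):
    """Group polarization: |mean heading| for headings coded as +1 or -1."""
    return abs(sum(headings)) / len(headings)

def time_highly_aligned(n_fish, p_copy=0.99, steps=60_000, burn_in=10_000, seed=1):
    """Fraction of time the toy school is strongly polarized (> 0.8).
    Each step, one fish either copies a random schoolmate's heading
    (with probability p_copy) or picks a heading at random (the 'noise')."""
    rng = random.Random(seed)
    headings = [rng.choice([-1, 1]) for _ in range(n_fish)]
    aligned = 0
    for step in range(steps):
        i = rng.randrange(n_fish)
        if rng.random() < p_copy:
            j = rng.randrange(n_fish)
            while j == i:
                j = rng.randrange(n_fish)
            headings[i] = headings[j]          # copy a schoolmate
        else:
            headings[i] = rng.choice([-1, 1])  # spontaneous, noisy switch
        if step >= burn_in and polarization(headings) > 0.8:
            aligned += 1
    return aligned / (steps - burn_in)

if __name__ == "__main__":
    # In this toy model, smaller groups spend more of their time highly aligned.
    for n in (10, 30, 100):
        print(n, round(time_highly_aligned(n), 2))
```

In this caricature, smaller groups spend more of their time strongly aligned, echoing the finding that order can emerge from noisy copying rather than despite it.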
Complex systems also suffer from a special vulnerability to events that don’t follow a normal distribution or “bell curve.” When events are distributed normally, most outcomes are familiar and don’t seem particularly striking. Height is a good example: It’s pretty unusual for a man to be more than 7 feet tall; most adults are between 5 and 6 feet, and there is no known person more than 9 feet tall. But in collective settings where contagion shapes behavior—a run on the banks, a scramble to buy toilet paper—the probability distributions for possible events are often heavy-tailed: There is a much higher probability of extreme events, such as a stock market crash or a massive surge in infections. These events are still unlikely, but they occur more frequently and are larger than would be expected under normal distributions.
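The contrast can be made concrete with a quick numerical sketch. Using parameters chosen only for illustration, the snippet below compares how often a bell-curve sample exceeds ten times its typical size with how often a heavy-tailed (Pareto) sample does.

```python
import random

def tail_fraction(samples, threshold):
    """Fraction of samples that exceed the threshold."""
    return sum(s > threshold for s in samples) / len(samples)

rng = random.Random(0)
n = 1_000_000

# Bell-curve samples (mean 1, standard deviation 1) versus heavy-tailed
# Pareto samples rescaled so that both have a mean of roughly 1.
normal_samples = [rng.gauss(1.0, 1.0) for _ in range(n)]
pareto_samples = [rng.paretovariate(2.0) / 2.0 for _ in range(n)]

threshold = 10.0  # an event ten times the typical size
print("bell curve tail:", tail_fraction(normal_samples, threshold))  # essentially zero
print("heavy tail:     ", tail_fraction(pareto_samples, threshold))  # small but real
```

Under the bell curve, an event ten times the typical size essentially never happens; under the heavy-tailed distribution it is rare but shows up thousands of times in a million draws.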
What’s more, once a rare but hugely significant “tail” event takes place, it raises the probability of further tail events. We might call them second-order tail events; they include stock market gyrations after a big fall, and earthquake aftershocks. The initial probability of second-order tail events is so tiny it’s almost impossible to calculate—but once a first-order tail event occurs, the rules change, and the probability of a second-order tail event increases.
An inability to predict the future doesn’t preclude the possibility of security and quality of life.
The dynamics of tail events are complicated by the fact that they result from cascades of other unlikely events. When COVID-19 first struck, the stock market suffered stunning losses followed by an equally stunning recovery. Some of these dynamics are potentially attributable to former sports bettors, with no sports to bet on, entering the market as speculators rather than investors, as reported in the Wall Street Journal on June 12. The arrival of these new players might have increased inefficiencies, and allowed savvy long-term investors to gain an edge over bettors with different goals. In a different context, we might eventually see the explosive growth of Black Lives Matter protests in 2020 as an example of a third-order tail event, precipitated by the killing of George Floyd, but primed by a virus that disproportionately affected the Black community in the United States, a recession, a lockdown, and widespread frustration with a void of political leadership. Statistician and former financier Nassim Nicholas Taleb has argued that such third-order tail events, called black swans, can have a disproportionate role in how history plays out—perhaps in part because of their magnitude, and in part because their improbability means we are rarely prepared to handle them.
One reason a first-order tail event can induce further tail events is that it changes the perceived costs of our actions, which changes the rules that we play by. This game-change is an example of another key complex systems concept: nonstationarity. A canonical example of nonstationarity is adaptation, as illustrated by the “arms race” involved in the coevolution of hosts and parasites. Like the Red Queen and Alice in Through the Looking-Glass, parasite and host each have to “run” faster, just to keep up with the novel solutions the other one presents as they battle it out in evolutionary time. (See “The Evolutionary Potential of Pathogens.”) Learning changes an agent’s behavior, which in turn changes the behavior of the system.
Another type of nonstationarity relates to a concept we call information flux. The system might not be changing, but the amount of information we have about it is. Although learning concerns the way we use the information available, information flux relates to the quality of the data we use to learn. At the beginning of the pandemic, for example, there was a dramatic range of estimates of the asymptomatic transmission rate. This variation partly came from learning how to make a good model of the COVID-19 contagion, but it also reflected information flux: because a virus takes time to spread, only a small number of people are infected early on. This situation makes for sparse data on the numbers of asymptomatic and symptomatic individuals, not to mention the number of people exposed. Early on, noise in the data tends to overwhelm the signal, making learning very difficult indeed.
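A small simulation shows why sparse early data is so treacherous. The true asymptomatic fraction and sample sizes below are invented purely for illustration; the point is how widely estimates scatter when only a handful of infections have been observed.

```python
import random

def estimate_spread(true_rate=0.4, sample_size=50, trials=1000, seed=0):
    """Spread of estimates of an asymptomatic fraction from limited samples:
    each trial draws sample_size infected people and records the observed
    fraction who are asymptomatic. All numbers are illustrative only."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        asymptomatic = sum(rng.random() < true_rate for _ in range(sample_size))
        estimates.append(asymptomatic / sample_size)
    return min(estimates), max(estimates)

if __name__ == "__main__":
    for n in (50, 5000):
        lo, hi = estimate_spread(sample_size=n)
        print(f"n={n}: estimates range from {lo:.2f} to {hi:.2f}")
```

With only 50 observed cases, the estimated fraction can swing widely around the true value; with thousands of cases, it settles into a narrow band.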
These forms of nonstationarity mean biological and social systems will be “out of equilibrium,” as it’s called in the physics and complex systems literature. One of the biggest hazards of living in an out-of-equilibrium system is that even interventions informed by data and modeling can have unintended consequences. Consider government efforts to enforce social distancing to flatten the COVID-19 infection curve. Although social distancing has been crucial in slowing the infection rate and helping to avoid overwhelming hospitals, the strategy has created a slew of second- and third-order biological, sociological, and economic effects.
Do the properties of complex systems mean prediction and control are hopeless enterprises? They certainly make prediction hard, and favor scenario planning for multiple eventualities instead of forecasting the most likely ones. But an inability to predict the future doesn’t preclude the possibility of security and quality of life. Nature, after all, is full of collective, coupled systems with the same properties of nonlinearity and nonstationarity. We might therefore look to the way biological systems cope, adapt, and even thrive under such conditions.
Before we turn to nature, it is worth remembering that our species has been attempting to engineer social and ecological outcomes since the onset of cultural history. That can work well when the engineering is iterative, “bottom up,” and takes place over a long time. But many such interventions have been impotent or, worse, disastrous. Fiascoes tend to happen when predictions are “backward looking”: A narrow focus on the last “bad” event leaves us vulnerable to perceptual blindness. Take how the United States responded to the terrorist attacks of September 11, 2001, by investing heavily in terrorism prevention, at the expense of other problems such as health care, education, and global poverty. Likewise, during the COVID-19 crisis, a deluge of commentators has stressed investment in health care as the key issue. Health care is neglected and important, as the pandemic has made clear—but to put it at the center of our efforts is to again be controlled by the past.
Instead of prioritizing outcomes based on the last bad thing that happened—applying laser focus to terrorism, or putting vast resources into health care—we might take inspiration from complex systems in nature and design processes that foster adaptability and robustness for a range of scenarios that could come to pass.
This approach has been called emergent engineering. It’s profoundly different from traditional engineering, which is dominated by forecasting, trying to control the behavior of a system, and designing it to achieve specific outcomes. By contrast, emergent engineering embraces uncertainty as a fact of life that’s potentially constructive.
When applied to society-wide challenges, emergent engineering yields a different kind of problem-solving. Under a policy of constructive uncertainty, for example, individuals might be guaranteed a high minimum quality of life, but wouldn’t be guaranteed social structures or institutions in any particular form. Instead, economic, social, and other systems would be designed so that they can switch states fluidly, as context demands. This setup would require a careful balancing act between questions of what’s good and right on the one hand—fairness, equality, equal opportunity—and a commitment to robustness and adaptability on the other. It is a provocative proposal, and experimenting with it, even on a relatively small scale as in health care or financial market design, will require wading through a quagmire of philosophical, ethical, and technical issues. Yet nature’s success suggests it has potential.
Nature keeps things working with two broad classes of strategy. The first ensures that a system will continue to function in the face of disturbances or “perturbations”; the second enables a system to reduce uncertainty but allow for change, by letting processes proceed at different timescales.
The first strategy relies on what are known as robustness mechanisms. They allow systems to continue to operate smoothly even when perturbations damage key components. For example, gene expression patterns are said to be robust if they do not vary in the face of environmental or genetic perturbations such as mutations. There are many mechanisms that make this invariance possible, and much debate about how they work, but we can simplify here to give the basic idea. One example, which has been researched by Andreas Wagner at the University of Zurich and his colleagues, is shadow enhancers: partially redundant DNA sequences that regulate genes and work together to keep gene expression stable when a mutation occurs. Another example is gene duplication in which genes have a backup copy with partial functional overlap. Robustness mechanisms can be challenging to build in both natural and engineered systems, because their utility isn’t obvious until something goes wrong. They require anticipating the character of rare but damaging perturbations. Nature nonetheless has discovered a rich repertoire of robustness mechanisms.
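The design principle behind such mechanisms, partially redundant components that keep an output stable when one of them fails, is easy to state in engineering terms. The sketch below is a deliberately cartoonish illustration of that principle, not a model of gene regulation; every name and number in it is our own.

```python
def expression_level(primary_active=True, shadow_active=True):
    """Two partially redundant regulators drive one output.
    Each alone covers most, but not all, of the target level."""
    level = 0.0
    if primary_active:
        level += 0.7
    if shadow_active:
        level += 0.7
    return min(level, 1.0)  # output saturates at the target level

if __name__ == "__main__":
    print("both intact:        ", expression_level(True, True))    # 1.0
    print("primary knocked out:", expression_level(False, True))   # 0.7, degraded but working
    print("both knocked out:   ", expression_level(False, False))  # 0.0, system fails
```

The redundancy buys nothing in normal operation, which is exactly why such mechanisms are easy to overlook until a perturbation arrives.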
Nature has another set of tricks up her sleeve. The timescales on which a system’s processes run have critical consequences for its ability to predict and adapt to the future. Prediction is easier when things change slowly—but if things change too slowly, it becomes hard to innovate and respond to change. To solve this paradox, nature builds systems that operate on multiple timescales. Genes change relatively slowly but gene expression is fast. The outcomes of fights in a monkey group change daily but their power structure takes months or years to change. Fast timescales—monkey fights—have more uncertainty, and consequently provide a mechanism for social mobility. Meanwhile, slow timescales—power structures—provide consistency and predictability, allowing individuals to figure out the regularities and develop appropriate strategies.
The degree of timescale separation between fast and slow dynamics matters, too. If there’s a big separation and the power structure changes very slowly, no amount of fight-winning will get a young monkey to the top—even if that monkey, as it gained experience, became a really gifted fighter. A big separation means it will take a long time for “real” information at the individual level—for example, that the young monkey has become a good fighter—to be reflected in the power structure. Hence, if the power structure changes too slowly, although it might guard against meaningless changes at the individual level, it won’t be informative about regularities—about who can actually successfully use force when things, such as the ability of our young monkey, really do change.
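The monkey example can be caricatured in a few lines of code. In the sketch below, noisy daily fight outcomes are the fast process, a slowly updated power score is the slow one, and a single parameter, alpha, stands in for the degree of timescale separation. All of the specifics are our own illustrative assumptions.

```python
import random

def run(alpha, steps=400, change_at=200, seed=0):
    """Fast process: noisy daily fight outcomes. Slow process: a power score
    updated as an exponential moving average; alpha sets the timescale
    separation (small alpha = slow, well-separated power structure)."""
    rng = random.Random(seed)
    ability, power = 0.2, 0.2
    trajectory = []
    for t in range(steps):
        if t == change_at:
            ability = 0.8                                # the young monkey becomes a good fighter
        win = 1.0 if rng.random() < ability else 0.0     # fast, noisy outcome
        power += alpha * (win - power)                   # slow aggregate
        trajectory.append(power)
    return trajectory

if __name__ == "__main__":
    for alpha in (0.2, 0.01):
        traj = run(alpha)
        print(f"alpha={alpha}: power 50 steps after the change = {traj[250]:.2f}")
```

With a large alpha the power score tracks the young monkey's improvement almost immediately but jitters with every fight; with a small alpha it filters out the noise of individual fights but takes a long time to register a real change in ability.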
Furthermore, sometimes the environment requires the system as a whole to innovate, but sometimes it demands quiescence. That means there’s a benefit to being able to adjust the degree of timescale separation between the fast and slow processes, depending on whether it’s useful for a change at the “bottom” to be felt at the “top.”
The detailed mechanisms by which nature accomplishes timescale separation are still largely unknown and an active area of scientific investigation. However, humans can still take inspiration from the timescale separation idea. When we design systems of the future, we could build in mechanisms that enable users—such as market engineers and policy makers—to tune the degree of timescale separation or coupling between individual behavior on the one hand, and institutions or aggregate variables such as stock returns or time in elected office on the other. We have crude versions of this already. Financial markets are vulnerable to crashes because of an inherent lack of timescale separation between trading and stock market indices, such that it’s possible in periods of panic-selling for an index to lose substantial value in a matter of hours. In recognition of this property, market engineers introduced what’s called a “circuit breaker”—a rule for pausing trading when signs of a massive drop are detected. The circuit breaker doesn’t really tune the separation between trades and index performance, though. It simply halts trading when a crash seems likely. A more explicit tuning approach would be to slow down trading during dangerous periods by limiting the magnitude or frequency of trades in a given window, and to allow trading to proceed at will when the environment is more predictable.
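As a caricature of that idea, the policy sketched below trades at full speed in calm conditions, throttles order flow when an index is falling fast, and halts trading outright in a crash. The thresholds and order limits are invented for illustration and are not actual exchange rules.

```python
def trade_throttle(index_drop_pct: float,
                   halt_threshold: float = 7.0,
                   slow_threshold: float = 3.0) -> dict:
    """Toy trading-speed policy. Thresholds and limits are illustrative,
    not real exchange rules. Returns how aggressively trading is slowed."""
    if index_drop_pct >= halt_threshold:
        # classic circuit breaker: pause trading outright
        return {"mode": "halt", "max_orders_per_min": 0}
    if index_drop_pct >= slow_threshold:
        # explicit timescale tuning: keep trading, but slow it down
        return {"mode": "throttled", "max_orders_per_min": 100}
    return {"mode": "normal", "max_orders_per_min": 10_000}

if __name__ == "__main__":
    for drop in (0.5, 4.0, 8.0):
        print(f"index down {drop}%:", trade_throttle(drop))
```

The middle branch is what distinguishes tuning from a simple halt: trading continues, but the coupling between individual trades and the index is deliberately weakened during the dangerous period.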
Stock market crashes are a bridge to another of nature’s fascinating properties: the presence of tipping points or critical points, as they’re called in physics. When a system “sits” near a critical point, a small shock can cause a big shift. Sometimes, this means a shift into a new state—a group of fish shoaling (weakly aligned) detects a predator (the shock) and switches to a school formation (highly aligned), which is good for speedy swimming and confusing the predator. These tipping points are often presented in popular articles as something to avoid, for example, when it comes to climate change. But, in fact, as the predator example illustrates, sitting near a critical point can allow a system to adapt appropriately if the environment changes.
The COVID-19 pandemic provides an unprecedented opportunity to begin to think through how we might harness collective behavior and uncertainty to shape a better future for us all.
As with timescale separation, tipping points can be useful design features—if distance from them can be modulated. For example, a study of a large, captive monkey society coauthored by one of us (Flack) found that the social system sat near a critical point: A small rise in agitation—perhaps caused by a hot afternoon—could set off a cascade of aggression that would nudge the group from a peaceful state into one in which everyone is fighting. In this group there happened to be powerful individuals who policed conflict, breaking up fights impartially. By increasing or decreasing their frequency of intervention, these individuals could be tuning the group’s sensitivity to perturbations—how far the aggression cascades travel—and thereby tuning distance from the critical point.
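A toy branching-process model captures how an intervention like policing can tune distance from a critical point. In the sketch below, which is our own illustration rather than the model from that study, each fight can provoke up to two more, and policing interrupts a provocation with some probability; raising that probability pulls the system away from the critical point and shrinks the average cascade.

```python
import random

def aggression_cascade(n=100, spread_prob=0.5, policing_prob=0.0, seed=None):
    """Toy branching cascade: one agitated monkey may provoke others, and
    'policing' interrupts a provocation with some probability. With
    spread_prob = 0.5 and up to two provocations per fight, the unpoliced
    group sits at the critical point (one new fight per fight, on average)."""
    rng = random.Random(seed)
    active, total = 1, 1
    while active and total < n:
        new_active = 0
        for _ in range(active):
            for _ in range(2):  # each fight can provoke up to two new ones
                if rng.random() < spread_prob and rng.random() >= policing_prob:
                    new_active += 1
        active = new_active
        total += new_active
    return min(total, n)

if __name__ == "__main__":
    for policing in (0.0, 0.2, 0.4):
        sizes = [aggression_cascade(policing_prob=policing, seed=s) for s in range(2000)]
        print(f"policing={policing}: mean cascade size = {sum(sizes)/len(sizes):.1f}")
```

Even a modest policing probability pushes the cascade dynamics subcritical, so most flare-ups stay small rather than sweeping through the whole group.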
We still don’t know how widespread this sort of tuning is in biological systems. But like degree of timescale separation, it’s something we can build into human systems to make them more fluid and adaptive—and therefore better able to respond to volatility and shocks. In the case of health care, that might mean having the financial and technological capacity to build and dismantle temporary treatment facilities at a moment’s notice, perhaps using 3D-printed equipment and biodegradable or reusable materials. In the economy, market corrections that burst bubbles before they get too large serve this function to some extent—they dissipate energy that has built up within the system, but keep the cascade small enough that the market isn’t forced into a crash.
Climate-change activists warning about tipping points are right to worry. Problems arise when the distance from the critical point can’t be tuned, when individuals make errors (such as incorrectly thinking a predator is present), and when there’s no resilience in the system—that is, no way back to an adaptive state after a system has been disturbed. Irreversible perturbations can lead to complete reconfigurations or total system failure. Reconfiguration might be necessary if the environment has changed, but it will likely involve a costly transition: The system will need time and resources to find solutions to the new environment. When the world is moderately or very noisy—filled with random, uninformative events—sensitivity to perturbations is dangerous. But it’s useful when a strategic shift is warranted (such as when a predator appears) or when the environment is fundamentally changing and the old tactics simply won’t do.
One of the many challenges in designing systems that flourish under uncertainty is how to improve the quality of information available in the system. We are not perfect information processors. We make mistakes and have a partial, incomplete view of the world. The same is true of markets, as the American investor Bill Miller has pointed out. This lack of individual omniscience can have positive and negative effects. From the system’s point of view, having many windows on the world affords multiple independent (or semi-independent) assessments of the environment, providing a form of “collective intelligence.” However, each individual would also like a complete view, and so is motivated to copy, share, and steal information from others. Copying and observation can facilitate individual learning, but at the same time tend to reduce the independence and diversity that are valuable for the group as a whole. A commonly cited example is the so-called herd mentality of traders who, in seeing others sell, panic and sell their own shares.
For emergent engineering to succeed, we need to develop a better understanding of what makes a group intelligent. What we do know is there seem to be two phases or parts of the process—the accumulation phase, in which individuals collect information about how the world works, and the aggregation phase, in which that information is pooled. We also know that if individuals are bad at collecting good information—if they misinterpret data because of their own biases or are overconfident in their assessments—an aggregation mechanism can compensate.
One example of an aggregation mechanism is Google’s original PageRank algorithm. PageRank gave more weight in search results to those pages that had many incoming connections from other web pages (a minimal sketch appears below). Another kind of aggregation mechanism might discount the votes of individuals who are prone to reach the same conclusion because they use the same reasoning process and thereby undermine diversity. Or take the U.S. electoral college, which was originally conceived to “correct” the popular vote so that population-dense areas didn’t entirely control election outcomes. If, on the other hand, implementing or identifying good aggregation mechanisms is hard—there are, for example, many good arguments against the electoral college—it might be possible to compensate by investing in improving the information-accumulation capacity of individuals. That way, common cognitive biases such as overconfidence, anchoring, and loss aversion are less likely to distort what gets pooled. That said, in thinking through how to design aggregation algorithms that optimize for collective intelligence, ethical issues concerning privacy and fairness also present themselves.
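For the curious, here is a minimal version of the PageRank idea, written by us for illustration rather than taken from Google: a page's rank is repeatedly redistributed along its outgoing links, so pages with many incoming links accumulate more weight.

```python
def pagerank(links, damping=0.85, iters=50):
    """Minimal PageRank by power iteration over a dict that maps each page
    to the list of pages it links to. Illustrative, not production code."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for p, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its rank evenly
                for q in pages:
                    new_rank[q] += damping * rank[p] / n
            else:
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:
                    new_rank[q] += share
        rank = new_rank
    return rank

if __name__ == "__main__":
    # Page "c" has the most incoming links, so it ends up with the highest rank.
    toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    print(sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]))
```

The aggregation step is the whole algorithm: no page's owner decides its own importance; the weight each page carries is pooled from the independent linking decisions of many others.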
Rather than attempt to precisely predict the future, we have tried to make the case for designing systems that favor robustness and adaptability—systems that can be creative and responsive when faced with an array of possible scenarios. The COVID-19 pandemic provides an unprecedented opportunity to begin to think through how we might harness collective behavior and uncertainty to shape a better future for us all. The most important term in this essay is not “chaotic,” “complex,” “black swan,” “nonequilibrium,” or “second-order effect.” It’s “dawn.”
This article has been adapted from one that appeared in Aeon, aeon.co.
Click "American Scientist" to access home page