What happens when the systems we rely on go haywire?

Can we learn to predict and control the systems essential to our survival?

Henny Ray Abrams/AP/File
Mathias Roberts, right, of Bank of America Merrill Lynch, and William Bott of Barclays Capital work on the floor of the New York Stock Exchange on the one-year anniversary of the May 6, 2010, "flash crash," in New York.

Army ants can form a 60-foot-wide, 3-foot-deep front that moves through the forest like a bulldozer blade composed of millions of ants seeking prey. This behavior, essential to the colony's survival, is not directed by some central authority. The coordination emerges out of each ant’s simple programmed responses to chemical signals.

At times, though, the colony's behavior can go awry. If, by chance, the marching ants happen to circle back upon themselves, they will follow one another in a circular mill, each ant dutifully obeying signals as they collectively march themselves to death.

The science of complex systems is the study of how local interactions can lead to global consequences. Army ants illustrate both the great promise and peril of systems exhibiting emergent behavior.

Well-functioning complex systems are essential to our collective survival. Consider the markets that provide us with everything from food to energy to entertainment to foreign trade to insurance and more. Markets are driven by the actions of countless individuals, each reacting to his or her whims, the weather, and the news of the day. Remarkably, just as with army ants, out of all our individual actions emerges a higher order, a set of prices that allows us to buy and sell whatever we may desire.

Each price contains a vast amount of information. The price of a gallon of gasoline, for example, incorporates everything from the weather in a far-off port to the stability of a foreign government. It tracks refinery availability in the Gulf of Mexico and emissions regulations in California, while simultaneously taking into account the increased demand related to our summer vacation plans. 

But where do prices come from?

Prices arise from the countless acts of many individuals attempting to trade in various goods. They are an emergent phenomenon.

Economists have recognized the magic of the market for hundreds of years. Even self-interested traders can, together, bring about the efficient allocation of the world's goods and services, and the use of markets has been a remarkable engine of our survival and success as a cooperative species. It’s one of the best examples of the great promise of complex systems to improve our world.

There are well-known circumstances where markets fail, of course. Markets can be monopolized, leading to artificially high prices. They can produce byproducts, such as the exploitation of finite natural resources, that harm individuals not involved in the immediate trade. Markets also tend to emphasize efficiency rather than equity. These failures can usually be corrected with appropriate public policy. 

But there is also the potential for systemic failure, akin to the circular mill of army ants, tied to the complex nature of markets. What happens when emergence goes bad?  

On May 6, 2010, around 2:30 p.m. Eastern time, key US equity indices plummeted for no apparent reason. The chaos in the indices initiated a tsunami that quickly began to wash over other markets. Stocks of formerly robust, mainstream companies began to trade at absurd prices. A share of Accenture, just minutes before trading at $40, could be had for a penny. A share of Apple shot from $250 to $100,000.

The event, which lasted for about a half hour, is now known as the flash crash.

The proximate cause of the flash crash appears to have been a set of trades initiated by a money-management firm with an address in Shawnee Mission, Kansas. On that day, the firm wanted to sell a large block of shares.

Key to selling a large block of shares is to slowly mete them out to the market so that other investors can’t detect the glut and take advantage of the situation. Rather than assign a person to this task, the Kansas firm took a more modern approach and delegated the task to a computer program.

The devil is in the details of such programs, of course, and there was a devil indeed. The program, rather than relying on the current market price of the shares, kept selling as long as the firm’s shares constituted only a small part of the overall market volume. Under usual conditions, such a strategy would ensure that the market was liquid and prices were reasonable.

But markets have become highly connected and computerized, so what happens in one market doesn't stay in one market. Information now flows quickly through the system, and computerized trading algorithms can be programmed to execute trades in the blink of an eye. (Actually, eye blinks take a poky 350 milliseconds, which is slow compared to today’s algorithmic stock trades.)

If prices across markets are not perfectly aligned, an investor can buy the cheaper variant of a security in one market while simultaneously selling the more expensive version in another market, making a guaranteed profit.
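This cross-market arbitrage can be sketched in a few lines of code. Everything here is illustrative: the venue names, quotes, and the helper function are invented for the example, not drawn from any real trading system.

```python
# Toy illustration of cross-market arbitrage: the same security quoted
# on two hypothetical venues. If one venue's best ask (price to buy at)
# is below another venue's best bid (price to sell at), buying at the
# low ask and selling at the high bid locks in a riskless spread.

def arbitrage_profit(quotes, size=100):
    """quotes: {venue: (best_bid, best_ask)}.
    Return (profit, (buy_venue, sell_venue)) if an arbitrage exists,
    else None."""
    buy_venue = min(quotes, key=lambda v: quotes[v][1])   # cheapest ask
    sell_venue = max(quotes, key=lambda v: quotes[v][0])  # richest bid
    ask = quotes[buy_venue][1]
    bid = quotes[sell_venue][0]
    if bid > ask:
        return (bid - ask) * size, (buy_venue, sell_venue)
    return None

# Made-up quotes: best ask 40.05 on one venue, best bid 40.10 on the
# other, so 100 shares yield about $5 of riskless profit.
quotes = {"NYSE": (40.02, 40.05), "BATS": (40.10, 40.12)}
```

In practice this opportunity vanishes in microseconds, which is exactly why algorithms race to exploit it and why prices across connected markets usually stay aligned.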

These changes in trading technology have created a new kind of complex system, unforeseen even a decade ago.

At the start of the flash crash, the Kansas firm's shares began to enter the market. Algorithms monitoring the various markets started rapid cycles of buying and selling to each other, resulting in a sudden increase in market volume.

This is where the fatal flaw in the Kansas firm's program became apparent: this increase in market volume caused the firm’s algorithm to dump more shares onto the market, increasing the market volume even more, which in turn caused the algorithm to sell more. A positive feedback loop emerged and the original block of shares sold in less than twenty minutes. In the past, sales of similarly sized lots took at least six hours.
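A minimal simulation makes the feedback loop concrete. All of the numbers below are hypothetical, chosen only to show the shape of the dynamic, not taken from the actual event: a seller whose per-interval quota is a fixed fraction of the previous interval's market volume, in a market where fast intermediaries re-trade ("churn") the newly sold shares.

```python
# Illustrative sketch (all parameters hypothetical) of the feedback
# loop described above. Each interval, the seller unloads a fixed
# fraction of the prior interval's observed volume. If intermediaries
# pass the new shares around several more times, observed volume
# inflates, which raises the next quota, which inflates volume further.

def simulate(block=75_000, base_volume=10_000, quota=0.09,
             churn=12.0, steps=200):
    """Return the number of intervals needed to unload `block` shares,
    or None if the block is not exhausted within `steps` intervals."""
    remaining = block
    volume = base_volume          # prior interval's observed volume
    for t in range(1, steps + 1):
        sell = min(remaining, quota * volume)
        remaining -= sell
        # Observed volume = background trading plus the new shares,
        # each re-traded `churn` additional times by fast algorithms.
        volume = base_volume + churn * sell
        if remaining <= 0:
            return t
    return None
```

With these made-up parameters, churned re-trading empties the block in 11 intervals, while the same rule with no churn (`churn=0.0`) takes 84: the quota compounds on its own output, compressing the sale just as the real algorithm's six-hour job collapsed into twenty minutes.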

The complexity inherent in the market induced a human version of the circle of army ants.

The rapid influx of new shares into the market caused their prices to fall dramatically. This resulted in a misalignment of prices in other markets, and the chaos began to spread. The huge volumes of resulting transactions started to overwhelm the news feeds, which further exacerbated the crisis as both the machines and the humans in the loop could not make sense of the deluge of events.

The stock exchanges have built-in “circuit breakers” designed to halt trading when market conditions are such that the execution of additional trades would result in unnaturally large price swings. A five-second trading pause was imposed by one such mechanism 13 minutes into the flash crash.

In a world where nanoseconds rule, five seconds is an eternity, and it appears that this action was enough to nudge the system toward more normal behavior 30 minutes into the event.

The various stock exchanges involved in the flash crash recognized that the market conditions that prevailed during the event were not “fair and orderly” and that some of the prices that arose were “clearly erroneous.” The trades that took place during the event were reversed over the next few days. The Securities and Exchange Commission implemented new circuit breakers into the markets, though their design was driven far more by intuition than by scientific objectivity.

Crisis averted. Lessons learned. And yet we might not have learned enough. 

The 2010 flash crash was driven by ignorance and greed, but not malice. It is easy to imagine what could happen if malice and a bit more forethought were directed at disrupting our markets in similar ways.

We have created a complex, adaptive, and emergent financial system that we do not fully understand or know how to control. Each piece of this system makes sense: interconnecting markets allows arbitrage to keep prices in line, algorithmic trading ensures that there is always a willing trading partner, derivatives provide new means to hedge risk. 

While each piece makes sense, the collective often does not.

The study of complex systems suggests that knowing the behavior of each individual piece of a system does not give us insight into the behavior of the system as a whole. Implementing a circuit breaker in one market, for example, may resolve the immediate issue that market faces, yet shunt the problem to other markets. 

We may well be at a stage where we cannot fully grasp the implications of the financial system we have built. Most of the time, out of this complex system emerges an order that allows us to thrive in a complicated world. Yet, relying on such a system also entails the potential of something going horribly wrong. 

Our well-being now relies on the complex systems that bind our food supplies to our energy networks to our global climate to every institution in our society. The urbanization of our planet, the sustainability of our ecosystems, and the stability of our political systems are all entwined in complex, adaptive, interacting systems.

We get better as we go. We learn from every setback. But the feedback loops that will drive tomorrow’s crises are already embedded in our systems, concealed by their complexities, and we don’t know where they are or what might trigger them, much less how to control them. 

Complexity is an aspect of our world that is amenable to scientific analysis, understanding, and perhaps even control, though we are only beginning to make progress on these fronts. We find ourselves in a race for knowledge and control of the complex world around us, a race that we must win if we are to thrive, and perhaps even survive, as a species.

John H. Miller is a professor of economics and social science at Carnegie Mellon University and an external professor of the Santa Fe Institute. He is the author of the newly released book on complex systems: A Crude Look at the Whole (Basic Books).
