Old economic models couldn't predict the recession. Time for new ones.

The US uses ‘Big Computing’ to analyze climate, healthcare, and even traffic – why not the economy?

The exterior of the Federal Reserve building. (Reuters/File)

Part of a continuing series about complexity science by the Santa Fe Institute and The Christian Science Monitor, generously supported by Arizona State University.

The 2008 financial crisis – which cost the United States economy between $6 trillion and $14 trillion, and the world economy a great deal more – shook the world of finance to its foundations.

It hit the most vulnerable particularly hard: unemployment in the US doubled from 5 percent to 10 percent, and in several countries in southern Europe, 1 in 4 people who want a job still cannot find one – roughly the unemployment rate the US experienced during the Great Depression.

It’s now old hat to point out that very few experts saw it coming. We shouldn’t be too hard on them, though. Surprisingly, the US investment in developing a better theoretical understanding of the economy is very small – around $50 million in annual funding from the National Science Foundation, or just 0.0005 percent of a $10 trillion crisis.

With the Eurozone Crisis still unfolding and financial panics now a regular occurrence on Wall Street, the next trillion-dollar meltdown might not be that far away. We justifiably spend billions trying to understand the weather, the climate, the oceans, and the polar regions. Why is the budget for basic research in economics, something that touches us all directly and daily, so paltry?

We think this has something to do with the way economics research is traditionally done, and we have a better way: loading millions of artificial households, firms, and people into a computer and watching what happens when they are allowed to interact.

Toy economies

Typical research in economics works with modest amounts of data, using laptop computers to solve ‘toy’ models of economies – simple models that are so abstract they bear little resemblance to reality.

These models are nothing like those used to predict global climate, to study galaxies, or to simulate brain activity – which crunch petabytes of data and require the largest computers available today.

Indeed, we are living through an era of computational exploration unparalleled in scientific history. Surely, we can use Big Data and Big Computing to better understand our troubled economy.

We need to respect the problem by treating the economy as a complex adaptive system – an evolving system composed of many interacting components, or agents, whose collective behavior is difficult or impossible to predict by simply examining the behaviors of the individuals.

Simple caricatures

Currently, the Federal Reserve uses two kinds of models to study and build forecasts about the US economy. The first, statistical models, fit the past history of aggregate data such as gross domestic product, interest rates, and unemployment, then extrapolate those patterns to suggest what the near future holds.
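
To make that concrete, here is a minimal sketch of the statistical approach: fit a simple autoregression to a history of quarterly GDP growth and extrapolate it forward. It illustrates the general idea only – it is not the Fed's actual machinery, and every number in it is invented.

```python
import numpy as np

# Hypothetical quarterly GDP growth rates (percent). Real forecasting
# models fit decades of data across many aggregate series at once.
growth = np.array([0.8, 0.6, 0.9, 1.1, 0.7, 0.5, 0.9, 1.0, 0.6, 0.8])

# Fit an AR(1) by least squares: growth[t] = a + b * growth[t-1].
b, a = np.polyfit(growth[:-1], growth[1:], 1)

# Forecast four quarters ahead by iterating the fitted relationship.
forecasts, g = [], growth[-1]
for _ in range(4):
    g = a + b * g
    forecasts.append(g)

print("fitted: a=%.3f, b=%.3f" % (a, b))
print("next four quarters:", np.round(forecasts, 2))
```

Everything such a model "knows" is the fitted history, which is why this family of models breaks down the moment the economy enters territory unlike anything in the sample.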

The second type is known as “Dynamic Stochastic General Equilibrium” models. DSGE models are built from the best economic theory we have. They postulate that the economy would be at rest (in static equilibrium) if it weren’t being randomly perturbed by events from outside the economy.
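
That equilibrium-plus-shocks worldview can be caricatured in a few lines. The sketch below is emphatically not a real DSGE model (those are derived from optimizing households and firms), but once such a model is linearized around its steady state it behaves roughly like this: a stable system that settles back to rest unless random outside shocks keep kicking it. The matrix and shock sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# x holds deviations of (output, inflation) from their steady-state values.
# A is stable (eigenvalues 0.9 and 0.8, inside the unit circle), so with
# the shocks turned off, x decays to zero: the economy returns to rest.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])

x = np.array([1.0, 0.5])                  # start away from equilibrium
for t in range(40):
    shock = rng.normal(0.0, 0.1, size=2)  # perturbations from "outside"
    x = A @ x + shock

print("deviation from steady state after 40 periods:", np.round(x, 3))
```

Notice what has no place in this structure: there is no line in which a bank can default or credit can vanish, which is one way of seeing why such models missed 2008.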

Both methods failed us in 2008. Statistical models fell apart in the Great Recession, as they always do when the economy ventures into unknown territory. DSGE models performed poorly before and during the 2008 crisis because they made unrealistic assumptions, such as infinite credit, and didn’t allow for surprises like banks defaulting.

These models are incredibly simple caricatures of the real world. Why do they fall short? Because the economy is never in equilibrium; it is in a continual state of adaptive change.

In addition, the United States has some 115 million households, 150 million workers, and 30 million firms. These households are diverse, ranging from the poorest families to the people on the Forbes 400. The firms range from self-employed carpenters to Walmart, with some 4,000 stores and 1.5 million employees.

Current models can’t hope to capture the turbulence that arises when hundreds of millions of very different people interact with and react to one another in the economy.

We need an approach to economic modeling that embraces the complex interactions that take place in real economies. This can be done through a relatively new computational technology called agent-based modeling.

In an agent-based model, the key players in the economy are each explicitly represented in computer code. Each of these artificial people, or agents, follows a set of behavioral rules to make economic decisions: how much to work, how much to save each month, whether to buy a new car, whether to invest in mutual funds.

In these artificial worlds, all the essential activities of the agents in an economy are simulated, and their interactions accumulate from the bottom up rather than being imposed – à la equilibrium theory – from the top down. Equilibrium may emerge or not. People may be well served by the economy or not. The important thing is that none of these outcomes is presupposed. The use of Big Data helps ensure that the agents are accurately represented and the results realistic.
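
Here is a deliberately tiny sketch of what such code looks like, using rules and numbers we made up for illustration: households with heterogeneous saving habits decide how much to spend, and the sum of their spending (routed through implicit firms) becomes next period's income. Nothing about the aggregate outcome is imposed from the top.

```python
import random

random.seed(1)

class Household:
    """An agent with its own rule-of-thumb saving behavior."""
    def __init__(self):
        self.save_rate = random.uniform(0.0, 0.4)  # heterogeneous rule
        self.wealth = random.uniform(0.0, 10.0)

    def decide_spending(self, income):
        # Save a personal fraction of income, but also spend a little
        # out of accumulated wealth.
        saved = self.save_rate * income
        dissaved = 0.05 * self.wealth
        self.wealth += saved - dissaved
        return income - saved + dissaved

households = [Household() for _ in range(10_000)]
income = 1.0
for period in range(200):
    # Bottom-up aggregation: demand is just the sum of individual choices,
    # and firms (implicit here) pay it back out as next period's income.
    demand = sum(h.decide_spending(income) for h in households)
    income = demand / len(households)

print("per-household income after 200 periods: %.3f" % income)
```

In this particular run a steady income level happens to emerge from the interactions, but nothing in the code presupposed that; change the saving rules and it may not.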

Because all this is done on computers, there are no restrictions on how complicated the model can be or how many agents can be simulated. With enough computing power, in fact, we can potentially run such models at the level of one software agent for each firm and household in America – altogether hundreds of millions of agents.

Extreme events

We’ve already made considerable progress with agent-based models in economics. They provide the best explanations, for example, for the volatility of prices in financial markets, explaining why there are periods in which markets heat up and others when they cool down.

They also provide the best model for the prevalence of extreme events: why crashes occur more frequently than one would expect. We’ve built agent-based models of firm dynamics and labor movements between firms that explain dozens of features of the US economy.
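
A stripped-down version of such a market model, with rules and parameters of our own invention: fundamentalist traders bet that price will revert to value, chartists chase the recent trend, and traders herd toward whichever strategy has just been paying off. Mechanisms of this type are what generate the bursts of volatility and fat-tailed returns described above.

```python
import numpy as np

rng = np.random.default_rng(42)

value = 100.0            # fundamental value, held fixed for simplicity
price = 100.0
trend = 0.0              # chartists' running estimate of recent returns
chartist_share = 0.5     # fraction of traders following the trend rule
returns = []

for t in range(20_000):
    # Fundamentalists bet on reversion to value; chartists chase momentum.
    fundamental_demand = 2.0 * np.log(value / price)
    chartist_demand = np.tanh(10.0 * trend)
    demand = ((1 - chartist_share) * fundamental_demand
              + chartist_share * chartist_demand)
    r = 0.01 * demand + rng.normal(0.0, 0.005)  # log return this period
    price *= np.exp(r)
    returns.append(r)
    trend = 0.9 * trend + 0.1 * r
    # Herding: traders drift toward trend-following when it has just paid
    # off (trend and realized return agree), and away when it has not.
    chartist_share = min(0.95, max(0.05,
                                   chartist_share + 0.02 * np.sign(trend * r)))

returns = np.array(returns) - np.mean(returns)
excess_kurtosis = (returns ** 4).mean() / returns.var() ** 2 - 3.0
print("std of returns: %.4f" % returns.std())
print("excess kurtosis (0 for a Gaussian): %.1f" % excess_kurtosis)
```

When chartists dominate, small moves feed on themselves; when fundamentalists dominate, the market calms down. The alternation between these regimes is the volatility clustering seen in real markets.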

In other fields, such as traffic analysis and public health, agent-based modeling has become the standard approach when policy-relevant solutions are needed. Agent-based traffic models offer planners detailed representations of the causes of congestion in cities. In Portland, Ore., for example, such modeling simulated traffic on every street in the city.

In economics, however, agent-based modeling is still the new kid on the block. We think it is time to give the complex systems approach a serious try. We need to build detailed agent-based models of economies.

Diversifying our theory portfolio

The type of model we envision can represent the behavior of the full variety of American households, from farmers to executives, retirees to recent college graduates. It can simulate the activities of a wide variety of firms, from large manufacturers to small businesses. It can include data on how the largest banks operate and, as a matter of course, alert regulators when the risks banks are taking rise to the systemic level, jeopardizing the operation of the whole financial system.

The goal of large-scale agent-based models is not point prediction: such models do not say, for example, exactly where the economy will be in six months. Nor could they be used to trade on financial markets.

Rather, after a simulation is run a thousand or a million times, such models show policy makers the range of possible futures and their relative probabilities if policies remain unchanged. Then, by altering policies in the model and running it another million times, we might begin to understand paths to better futures.
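
In pseudocode, that workflow looks something like the sketch below, where `run_economy` is a placeholder for a full agent-based model (ours is a two-line stand-in so the example runs): simulate each candidate policy thousands of times with different random shocks, then compare the resulting distributions of outcomes rather than single numbers.

```python
import random
import statistics

def run_economy(policy_rate, seed):
    """Placeholder for a full agent-based simulation. Returns one outcome
    of interest (say, peak unemployment) for one draw of random shocks;
    the formula below is invented purely so the sketch runs end to end."""
    rng = random.Random(seed)
    shock = rng.gauss(0.0, 2.0)
    return max(0.0, 5.0 + shock - 0.5 * policy_rate + rng.gauss(0.0, 0.5))

N_RUNS = 10_000
for policy_rate in (0.0, 1.0, 2.0):
    outcomes = sorted(run_economy(policy_rate, seed) for seed in range(N_RUNS))
    print("policy=%.1f  median=%.1f  95th percentile=%.1f"
          % (policy_rate, statistics.median(outcomes),
             outcomes[int(0.95 * N_RUNS)]))
```

The deliverable is the whole distribution: a policy that slightly worsens the median outcome but sharply trims the 95th percentile may still be the prudent choice.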

Today we have outdated, 20th-century models for managing an unpredictable 21st-century economy. Only through quantitative models will economic policy disagreements have a chance to be settled via the scientific method, turning philosophical arguments into discussions of model parameters.

In the meantime, we remain vulnerable to trillion-dollar meltdowns. A reasonable strategy for avoiding bad economic outcomes is broad investment in a diverse portfolio of economic models, in the hope that our policy makers get better guidance than they had the last time.

Rob Axtell is a professor at George Mason University’s Krasnow Institute for Advanced Study, where he is chair of the Department of Computational Social Science. His research involves computational and mathematical modeling of social and economic processes. His book, “Dynamics of Firms: Data, Theories, and Models,” is due out from MIT Press next year.

J. Doyne Farmer is an external professor at the Santa Fe Institute and Director of Complexity Economics at the Institute for New Economic Thinking at the Oxford Martin School, as well as a professor of mathematics at the University of Oxford. His book on the need for complex systems models of the economy is due out from Random House next year.


Complexity, a partnership between The Christian Science Monitor and the Santa Fe Institute, generously supported by Arizona State University’s Global Security Initiative, seeks to illuminate the rules governing dynamic systems, from electrons to ecosystems to economies and beyond. An intensely multidisciplinary approach, complexity science draws from mathematics, physics, biology, information theory, the social sciences, and even the humanities to seek out the common processes that pervade seemingly disparate phenomena, always with an eye toward solving humanity's most intractable problems.
