Five new technologies that will change the world (and win at Jeopardy!)
Five new technologies that can change the world: from the computer that beats humans on "Jeopardy!" to cellphone apps for African pick-and-hoe farmers to satellites that spy on human rights abusers.
| San Francisco
Watson was an idiosyncratic "Jeopardy!" player. He wagered odd amounts of money – $2,127 – and sometimes his guesses were odder still, like when he named Toronto as an American city. Or when, asked for the title of a famous writing manual, he answered "Dorothy Parker" (Ms. Parker was, in fact, the person who reviewed the book in Esquire in 1959).
And yet Watson, an IBM computer, won his nationally televised game of "Jeopardy!" in February. He steadily overpowered his opponents, Ken Jennings and Brad Rutter – two of the game's all-time human champions.
In an age when computers have multiplied the productivity of workers, it is tempting for millions of people with monotonous office jobs to wonder whether Watson could outright replace them. It won't happen yet.
Between the lines of Watson's story and the half century of history that made him possible is a parable of innovation and economics with much to say about which technologies will have a broad impact on society. The most glamorous advances often didn't have that impact (supersonic air travel, for example), whereas pedestrian inventions like the Haber-Bosch process to produce nitrogen fertilizer fundamentally altered the economics of basic human need – and changed the face of the planet.
In this installment of the Future Focus series, the Monitor examines five technologies changing the world now, or well positioned to do so in the future: from low-tech gadgets remaking livelihoods in remote villages of Niger to electronics that roll out of printing presses to spacecraft hurtling around Earth at 5 miles per second. For the most part, these technologies promise one thing: to stretch the dollar – or yen, or euro – into accomplishing new things. It is this litmus test that will determine how pervasively Watson touches the lives of ordinary people.
Few people realize it, but behind Watson's cool veneer of digital competence were 2,880 computer processors filling 10 racks. They devoured an estimated 100,000 watts of electricity – 80 times what an average American home used in 2008. Watson's employer would pay $100,000 a year to power him – plus $30,000 in cooling to keep him from burning the building down.
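As a rough sanity check, the $100,000 power bill pencils out from the article's own wattage figure. The electricity rate below is an assumption (a commercial rate of roughly $0.11 per kilowatt-hour), not a number from the article:

```python
# Back-of-the-envelope check of Watson's annual power bill.
# The electricity rate is an assumed commercial rate, not from the article.
power_kw = 100            # Watson's draw: ~100,000 watts
hours_per_year = 24 * 365
rate_usd_per_kwh = 0.11   # assumption

annual_kwh = power_kw * hours_per_year      # 876,000 kWh
annual_cost = annual_kwh * rate_usd_per_kwh
print(f"~${annual_cost:,.0f} per year")     # → ~$96,360, close to the $100,000 cited
```

Run continuously, 100 kilowatts adds up to nearly a million kilowatt-hours a year, which is why cooling adds another $30,000 on top.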
This problem illustrates a little-appreciated fact, explains Rahul Sarpeshkar, an electrical engineer at the Massachusetts Institute of Technology in Cambridge: "Fundamentally, energy and information are deeply linked. You cannot process information without expending energy."
The knack for corralling electrons into an orderly dance of information has improved dramatically since transistors first appeared around 1950. The number of transistors on a chip has doubled roughly every two years – a trend called Moore's law. Those shrinking transistors cost less to manufacture and consumed less energy, making them cheaper to use. Today's transistors are 150 times smaller than a red blood cell and, per calculation, consume 50 billion times less energy than the vacuum tubes in World War II-era computers.
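The compounding is easy to underestimate. A minimal sketch (the function name is illustrative, and it assumes an idealized two-year doubling period that real chips only approximate):

```python
# Idealized Moore's law: transistor counts double every two years.
def transistor_growth_factor(start_year, end_year, doubling_years=2):
    """Return the multiplicative growth in transistors per chip."""
    doublings = (end_year - start_year) / doubling_years
    return 2 ** doublings

# From the first microprocessors (1971) to this article's era (2011):
factor = transistor_growth_factor(1971, 2011)
print(f"{factor:,.0f}x more transistors per chip")  # → 1,048,576x
```

Forty years of doubling every two years is 20 doublings – roughly a millionfold increase, which is how a chip went from thousands of transistors to billions.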
Shrinking energy dissipation made it possible to use electronics in ever-expanding ways: hearing aids that wouldn't cook Grandma's ear; radios that could run for weeks on batteries; and the 1971 Busicom desktop calculator that let computers compete cost-effectively with paper and pencil.
"All of this consumer electronics has only become possible because we've made these spectacular strides in energy efficiency," says Stephen Furber, a chip designer at the University of Manchester in England. Professor Furber's own contribution, the ARM processor chip that he helped design at Acorn Computers in the 1980s, reduced power consumption by a factor of 10, allowing millions of people to carry cellphones that don't incinerate their pocket lint or require hourly charging. In this day of on-the-run tweets, status updates, and cellphone photos, it could be argued that these phones and their low-power chips are responsible for making Facebook and Twitter the forces that they are today in pop culture, politics, and Arab revolutions.
But an impasse is near. Transistors are now crowded so tightly onto chips that their heat threatens to cook the computer: They give off up to 300 watts of heat per square inch – five times more than an electric burner on high.
"For a long time we've been very fortunate," says John Maltabes, a visiting scholar at Hewlett-Packard with 30 years of experience in chip manufacturing. "What's happening now is economics are catching up."
Moore's law is slowing down, and it affects every computer, from the smallest to the largest. Scientists have relied, for example, on supercomputers to study particle physics and the brain – not to mention ensuring the safety of aging nuclear weapons. But grander scientific questions bring bigger price tags. Tianhe-1A, the world's fastest supercomputer at China's National Supercomputer Center in Tianjin, wolfs 4 million watts of electricity. The next-generation machine planned by the US Department of Energy will guzzle 25 million watts (the original blueprint called for 130 million watts – some $130 million of electricity per year – before the design was scaled back to keep its utility bills affordable).
"Computers are pretty much energy-limited now," concludes Furber.
The almost-intelligent software that allows Watson to win at "Jeopardy!" remains an impressive feat, and it will be used where the cost can be justified – like sifting through reams of financial information to suggest stock purchases at investment banks; searching stacks of legal documents to find the pearl of evidence that will win a court case; or tweaking the coming and going of planes, trains, and buses so that travelers don't miss connections. But before something as smart as Watson comes to ordinary people's laptops and smart phones, engineers must build more-efficient computers that circumvent current energy limits. Whichever technology succeeds will push the world to a different place.
The central insight from the past 45 years is that Moore's law was never a law of science, but a law of economics. It was the economic benefit of squeezing ever more transistors onto a chip that drove Moore's law forward. Likewise, the technologies that have changed the world often weren't the ones that allowed new and exciting things – but rather those that reduced the costs of doing things already possible – whether those costs are counted in dollars, time, or other finite resources.
Supersonic air travel – once a source of French national pride, which promised to bring the far corners of the globe closer together – failed to pass this test. Even before the last Concorde retired in 2003, supersonic travel had been defeated by the basic economics of air drag. "Just going from Mach 0.85 to Mach 1.1 doubles your fuel consumption," says Erik Conway, a technology historian at NASA's Jet Propulsion Laboratory in Pasadena, Calif.
In addition to killing fuel economy, air drag threatened to rip the plane apart – necessitating an expensive reinforced airframe. Air friction also heated the Concorde's exterior to 200 degrees F. – and was expected to heat the exterior of the US supersonic airliner under development to 1,000 degrees F. – complicating maintenance between flights. Concorde's narrow fuselage fitted just 100 passengers; a round trip from New York to Paris could cost $8,000. Even as conventional, subsonic jets became three times more efficient, supersonic travel was never more than marginally profitable.
Human spaceflight suffered a similar fate: Homo sapiens reached the moon in 1969, but 42 years later, that remains a high-water mark rather than the start of a growth curve. Changing political priorities undermined space travel, but so did engineers' inability to reduce the fixed amount of food, water, and oxygen that humans need to survive.
"The mass required to support a human in space is still very large," says David Whalen, a technology historian at the University of North Dakota in Grand Forks. That mass, transported by decades-old rocket technology, still costs $2,000 to $10,000 per pound to push into orbit. Business executives ultimately found better, if less-celebrated, ways to use that expensive payload space. "You can put a million times more computer into orbit now" compared with 1969, points out Mr. Whalen. That, with improving solar cells and batteries, fueled a boom in unmanned satellites that has shrunk the world in ways that Concorde never could have.
But the advance with the biggest impact over the past 100 years may be the least glamorous. In a recent online debate, Vaclav Smil, a technology historian at the University of Manitoba in Canada, named the Haber-Bosch process, a chemical reaction almost unheard-of by the public, as the greatest advance of the 20th century. It combines hydrogen and nitrogen to make ammonia. It's used to produce 100 million tons of fertilizer per year – needed for feeding a third of Earth's 7 billion people.
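For reference, the reaction at the heart of the Haber-Bosch process can be written as:

```latex
% Nitrogen from the air and hydrogen (today mostly from natural gas)
% combine, under high pressure and temperature over an iron catalyst,
% to yield ammonia:
N_2 + 3\,H_2 \;\longrightarrow\; 2\,NH_3
```

The ammonia is then converted into nitrogen fertilizers – which is how an obscure industrial reaction came to underwrite the diets of billions.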
It was the finitude of farmland and fresh water that made Haber-Bosch matter. It can be argued that it is scarcities in general that set the stage for many world-changing technologies.
"Technologies that shift the fundamental resource base for the commodities of the 21st century are going to be important," agrees Erik Straser, general partner at the technology venture capital firm Mohr Davidow in Menlo Park, Calif. This almost certainly includes technologies that will alleviate bottlenecks in energy – especially oil and gas. It probably extends to ones that ease other emerging shortages, such as lithium and rare earth metals (important in batteries and electric motors).
Identifying all of the technologies destined to change the world would fill the pages of many dissertations. Many of those predictions would still be wrong. But, in the days that follow, the Monitor explores five that have a fighting chance.