Empathy is the ability to imagine yourself as another person. Specialists say most children express empathy by age 4 but probably feel it much earlier. Behavioral research shows that many animals – dolphins, primates, rodents – show signs of empathy when a family member is in pain.
When empathy is absent, cruelty reigns, selfishness is celebrated, majorities oppress minorities. When present, empathy recognizes that other people’s concerns are valid, encourages mercy in victory, and tempers extremism. Empathy is the crucial factor in a healthy society; it undergirds moderation – but only if it goes beyond the surge of pity that most people feel when someone else is in distress. Empathy requires practical implementation.
You can see the need for empathy all over the world – from politically polarized Washington, D.C., where President Obama has talked of an “empathy deficit,” to austerity-constrained Southern Europe, where the brilliant promise of young people is clouded by debt and stagnation. But the need for empathy is perhaps most vividly on display in the Middle East.
The unprecedented protests, revolutions, and uprisings of the past two years began, in fact, with a massive upswelling of empathy. Mohamed Bouazizi, a street vendor in Tunisia, felt so desperate that he set himself on fire. Millions of people in the Arab world saw their own voiceless, oppressed condition in Mr. Bouazizi’s plight. The tumult that followed began with a we’re-all-in-it-together spirit. Anything but the status quo would be better, most felt. Why not freedom? Why not democracy?
But two years on, the spirit of the Arab Spring has been overtaken by anger, violence, and suspicion. Opinion polls continue to show that large majorities throughout the Middle East say they want democracy. But that is where consensus ends, especially in Egypt. Some people want Western-style pluralism; some crave order; some support democracy only if it gives a central place to Islam; and some are no longer sure they want democracy at all.
The long-persecuted Muslim Brotherhood now holds power, but as Dan Murphy’s cover story explains, even people who voted for it in the three democratic elections it has now won are beginning to sour on it for using state power to tighten its grip on the country. Technically, democracy has come to Egypt. People have voted, and power has transferred out of the hands of the military – at least on the surface – and into the hands of the Muslim Brotherhood. But democracy is failing in reality because the newly empowered majority has been silencing dissent and disregarding the interests of minorities.
Nations coalesce around many things – history, language, religious heritage. They can be held together by force, as Egypt was from the time of the Pharaohs until the ouster of Hosni Mubarak. As one authoritarian ruler weakened, another rose up. Although that pattern can persist, it is ultimately unsustainable. Democracies are sustainable – but only if parties that win elections actively understand that they, too, can become unpopular, resented, and vulnerable to overthrow; that they may one day be the minority. Self-interest, in other words, is a very good reason to empathize.
The paradox is that Egyptians, more than any other people, have vivid reminders of the fleeting nature of power all around them – a surfeit of pyramids, temples, and statues marking Pharaonic immortality. Every monument to the great Ozymandiases of their day is now a colossal wreck. What has endured is the Egyptian people. It is out of their lives and experiences – not just the fortunate few, not just the newly empowered, but all Egyptians – that a functioning nation must be built.
John Yemma is editor of the Monitor. He can be reached at email@example.com.
The liberating promise of technology was neatly captured in the old IBM slogan “Machines should work; people should think.” Back in the day of gas station attendants, stenographers, and telephone operators, automation seemed like a no-brainer. Who wouldn’t want to see menial, repetitious, uncreative work replaced by a machine? Who wouldn’t want to swap a plow horse for an air-conditioned tractor?
Then brains were added to the automation agenda, first as simple mechanical programs, then computers, then complex algorithms – accelerated by advances in microchips, networks, robotics, hardware, and software. Now automation is displacing even people who think. A machine can do almost any task that falls into an observable pattern. If your work feels routine, you are a candidate for obsolescence by automation.
There will always be a place for artisanal craftsmanship. At least, I think so. At this point. Probably. But the steady, middle-class jobs that powered industrial countries through most of the 20th century are disappearing. The center of the workforce is being hollowed out. Most jobs of the future will be either at the high-skilled upper end or the low-skilled lower end.
When politicians lament the loss of middle-class jobs, they are right about the difficulties workers face and the need for retraining and buffering during this continuous and often painful wave of dislocation. But while outsourcing and a tepid economy are often blamed, the real story is inexorable automation: ever-smarter machines working their way through the workforce.
What does that mean if you’re a graduate looking for your first job, a mid-career professional worried about job security, or just someone who wants to do work that has real value? In a Monitor cover story, Laurent Belsie, who heads up our business and economy team, presents 10 ways to prepare for tomorrow’s job market.
At the Massachusetts Institute of Technology, Erik Brynjolfsson and Andrew McAfee have been studying the relative strengths of humans and machines. In 1997, they note, the world’s top chess master, Garry Kasparov, lost to IBM’s Deep Blue supercomputer. But that wasn’t the end of the line for human chess players. It turns out that human-machine partnerships are better than either chess masters or supercomputers. As Mr. Kasparov himself described it: “Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process.”
Machines are good and are getting better, but humans excel at overview and improvisation. Low-skill jobs such as home health-care aides and high-skill jobs such as biomedical engineers are improved by information and tools. And human judgment improves tools and information. As Professors Brynjolfsson and McAfee write, “the key to winning the race is not to compete against machines but to compete with machines.”
A minor case in point: A vacuum-cleaning robot recently joined our household. It is amazing how smart it is. And isn’t. It diligently works the kitchen floor but needs human intervention in a tight spot. Happy to help. And, yes, little robot, message received about not leaving so many shoes and books lying around.
If there’s a future for machines and people, it’s working together.
At least, I think so.
John Yemma is editor of the Monitor. He can be reached at firstname.lastname@example.org.
This year marks the 20th anniversary of the launch of the Mosaic 1.0 Web browser. For all the revolutionary disruption caused by the Internet, it was Mosaic that turned the underlying technology into the world’s instantly available library, social crossroads, and e-commerce marketplace.
Mosaic was the work of a group of computer science students at the University of Illinois at Urbana-Champaign led by Marc Andreessen, who later went on to cofound Netscape and had a hand in Ning, Twitter, and Facebook. Riding on the shoulders of Tim Berners-Lee’s invention of the World Wide Web two years earlier – which, in turn, rode on the shoulders of the thousands of scientists, engineers, government planners, and businesspeople who built the underlying Internet – Mosaic democratized the digital life. It was simple and obvious. Two decades on, it doesn’t look very different from Chrome, Safari, or other modern browsers. Mosaic made the browser a household appliance.
The browser changed the Internet and is not going away anytime soon. But mobile applications are the interface of the moment. Apps exploded in use because of smart phones and tablets. As Chris Gaylord explains in a Monitor cover story, apps go browsers one better: They connect the digital and tactile worlds.
You can navigate with them – find restaurants, snap photos and record videos, share with friends. Everyone who uses a smart phone or tablet has a few go-to apps. My current favorites include Flipboard, Zite, and, of course, the Monitor Weekly app (admittedly, the first-generation app we have is buggy and clunky; a new and better version is on the way). I was briefly a passable talent with the Doodle Jump app. I’ve played Words With Friends. I was once household champion at Angry Birds.
My app interest has settled down as the novelty of my iPhone and iPad has worn off. I now find myself relying on apps for practical things – the alarm clock app every morning, the oven timer app most evenings. I used the compass app recently to orient a weather vane. But day in and day out, the app I use most (don’t yawn) is the flashlight. It has just enough light to help me spot the tennis ball my dogs lose every night under the kitchen counter.
The flashlight app is what any good interface should be – simple, obvious, and helpful in the real world.
* * *
While we’re on the subject of things digital: The inner workings of the Internet era are more complex than those of the Gutenberg era. Take, for instance, advertising, which is a key way we support the continuation of Monitor journalism.
Online advertising comes to CSMonitor.com in two distinct ways. One is through a direct sale by our advertising department. The other arrives automatically from providers such as Google Ad Exchange. Our advertising staff checks off in advance the categories we don’t want – alcohol, tobacco, pharmaceuticals, and so on. Occasionally, an inappropriate ad slips through. We appreciate hearing from readers who spot one (you can click the “About these ads” link below any ad on our site), and we quickly adjust our filters.
Some readers have complained about political ads. We hope you’ll be tolerant when these don’t fit your point of view. We monitor them to ensure they don’t engage in ad hominem attacks or fan the flames of intolerance.
By taking ads from both sides, we think we are providing balance. And we trust our readers to decide for themselves.
Nothing beats a first. Firsts go down in the history books. We honor them with gold medals, bouquets, and anthems. But firsts need seconds.
On Dec. 14, 1903, Wilbur Wright went airborne. For three seconds. Something promising happened, but was it really flying? The next attempt, on Dec. 17, piloted by his brother Orville, was only nine seconds longer. But it was proof that what happened on Dec. 14 was real. A photograph of that second try shows Wilbur running alongside the flying machine. And it was definitely flying. Dec. 17 is the date remembered for the first flight.
F. Scott Fitzgerald may have morosely declared that there are “no second acts in American lives,” but second acts are crucial. They let us correct mistakes and try again. Winning all the marbles the first time might be beginner’s luck. Doing it a second time proves you are an ace. Besides, science requires that results be reproducible to be valid.
This is not a knock on first-place finishers. A winner deserves a victory lap and a shower of ticker tape. But second-placers are no slouches. With that in mind, here are a few people to consider first-place second-placers:
•Tenzing Norgay, the second person to summit Mt. Everest. The Sherpa climber was only a few feet behind Edmund Hillary in 1953, a triviality after ascending 29,000 feet. The two men supported each other; neither would have made it alone.
•Buzz Aldrin, the second man on the moon. He spent nearly as much time on the lunar surface as Neil Armstrong and seemed to have more fun.
•Alfred Russel Wallace, who developed the theory of evolution at the same time as Charles Darwin. Darwin published first but acknowledged that Wallace had the same ideas, calling the theory “your own and my child.”
•Susan Boyle (this isn’t my specialty, but trust me, I looked it up). The British singer was runner-up on “Britain’s Got Talent” in 2009 but has sold more than 10 million copies of her debut album. Anybody remember who won first place in that contest?
•McKayla Maroney, the American gymnast who won a silver medal in the London Olympics. Her fall cost her the gold. Her “McKayla is not impressed” expression became an Internet sensation. She embraced it with good humor and even got to make the face with President Obama.
In a Monitor cover story, Robert A. Lehrman looks at presidential second terms. The common notion is that Fitzgerald was right, that there’s a “second-term curse.” But consider that along with Watergate, Iran-contra, and the Lewinsky scandal came the US-China breakthrough, the thaw in the cold war, and the late-’90s economic boom. Besides, the judgment of history is constantly being revised. Thomas Jefferson was considered a failure at the end of his second term. Harry S. Truman ended his presidency with a historically low approval rating. Both are now remembered as great presidents.
Second-termers can be lame ducks. But freed from the need to campaign for reelection, they have the opportunity to rise above partisanship. That alone would prove that the trust voters twice placed in a president was fully merited.
Second acts are crucial in American life. Seconds allow for improvement. And they prove that firsts weren’t flukes.
John Yemma is editor of the Monitor. He can be reached at email@example.com.
There are many moons in our solar system, but we think of ours as “the moon.” It waxes and wanes on schedule. It’s as reliable as the sun. The planets are different. They trace mysterious paths through the night sky, lining up, separating, shining brightly, disappearing.
And no planet has intrigued us like Mars – far enough away to be an enigma, but sometimes tantalizingly close. Mars has always straddled fiction and fact. Named for a Roman god, about half Earth’s size, possibly habitable (friend? foe?), it has, until recently, been just beyond the range of exploration.
As technology has advanced, so has our knowledge of Mars, but not always in myth-busting ways. In 1877, for instance, astronomer Giovanni Schiaparelli observed what he thought were channels on Mars – “canali” – which, mistranslated as “canals,” gave rise to theories about a Martian civilization capable of engineering and agriculture. That set off a boom in books and movies about journeys to Mars and close encounters with Martians. If nothing else, Mars fiction whetted our appetites for exploration. Science, after all, starts with speculation, though it can’t end there. As the philosopher of science Roger Bacon put it, the “strongest arguments prove nothing so long as the conclusions are not verified by experience.”
If, like me, you were a kid at the start of the space race, you’ll recall the Soviet Union throwing probe after probe at the planet in the early 1960s. All failed. NASA had its flubs as well. So many missions to Mars broke down that scientists talked of a “Mars curse.” There wasn’t one, of course, but sending a vessel to the red planet has long tested the limits of engineering.
It wasn’t until 1965 that Mariner 4 got close enough to take black-and-white photos of a crater-pocked surface and transmit a smattering of data about the planet’s atmosphere and surface temperature. Those first images were more a proof that we’d tagged the planet than a trove of new knowledge.
In a Monitor cover story, Pete Spotts assesses where we are in our short but increasingly productive encounter with Mars. A great stride occurred last year when the Mars Curiosity rover safely landed in Gale Crater and started its work. Curiosity promises unprecedented information about Martian chemistry and perhaps Martian life. More than that, however, Curiosity is giving us a National Geographic-type image bank, linking this other world to ours in high definition.
Our ancestors drew maps of the heavens and spun fables about what was out there. Soothsayers, astrologers, and philosophers named celestial bodies for mythological gods, attributed powers to stars and constellations, and tried to reconcile the cosmos with theology. Mystery still exists beyond the range of radio telescopes. But step by step, Mars is shedding its cloak of science fiction.
As our atlases and databases of the planet grow richer, Mars becomes part of our world. We’re increasingly there with our robotic extensions. Eventually, we’ll go there ourselves. And as the great science-fiction author Ray Bradbury observed, when we are on the planet, looking closely, we will inevitably see Martians.
“The Martians were there – in the canal – reflected in the water,” he wrote in “The Martian Chronicles” (1950). “The Martians stared back up at them for a long, long silent time from the rippling water....”
John Yemma is editor of the Monitor.
There's much more good news than bad news. But bad news travels fast and commands attention. Good news is like water carving a valley or a tree gradually extending its branches. Good news is a child learning a little more each day or a business quietly prospering. We hardly notice it.
Examine the data over time, and you’ll find irrefutable evidence of progress: the decline of war and violent crime; the increase in life spans; the spread of literacy, democracy, and equal rights; the waning of privilege based on race, gender, heredity, and beliefs. (Jina Moore and a team of Monitor writers document this in detail in our cover story, “Progress Watch 2012.”)
Every so often there are vivid scenes of good news – Neil Armstrong bouncing onto the moon, revelers atop the Berlin Wall, Nelson Mandela walking free after 27 years in prison. But most of the time good news is incremental, which causes it to be taken for granted.
Not bad news. When we hear it, we sit up and ask, “What just happened?” Bad news can make us beat our fists on the table and ask where God was and how such a terrible thing could happen. Bad news is mesmerizing. We can’t look away from a collapsing high-rise or an inundated coastal town. We know the meaning of a sidewalk filled with flowers and teddy bears.
Bad news is insistent. In fairness, bad news isn’t all bad. It can alert us to problems that need to be addressed. But in the grand scheme of things, there’s actually not that much of it. Oh, there’s always enough for a front page or a Web bulletin or a nightly newscast, although sometimes reporters have to travel to the ends of the earth to find it. Bad news has a natural advantage, however. It pulses through humanity’s central nervous system – word of mouth, the media, the Internet. Its images are riveting and its stories are dramatic. It floods the zone.
And when there's a shortage of bad news in the present, we can always turn to the future. Welcome to worry, dread, and pessimism. Sure, things seem OK now, but just over the horizon a disaster is brewing. Don't be a sap. Bad things are on the way.
They probably are. And they'll shock us and again make us wonder if life is out of control. But in this last issue of our news magazine for 2012, we're looking in the rearview mirror to see how things are going, and we're finding plenty of reason for hope.
Hope helps. It keeps us going in bleak times and amid disheartening news. But hope has much more credibility when we can point to the reason for it. Asserting that we should all cheer up is sweet. Knowing why is powerful.
Here are some reasons for hope: Extreme poverty is declining. HIV is no longer a death sentence. Technology is transforming everything from African agriculture to urban transportation. Drug violence is decreasing in Mexico. Travel is safer almost everywhere. Crime rates are falling. Somalia is emerging from a long night of anarchy. Myanmar (Burma) is coming out of its dictatorial shell. And while it’s true that China and Russia are only semi-free and Egypt and other post-dictator nations may be going down ill-considered paths, water is still carving the valley. Freedom lives in 7 billion hearts.
Bad news will make headlines in 2013. But good news will quietly rule.
John Yemma is editor of the Monitor. He can be reached at firstname.lastname@example.org.
Churches come in all shapes and sizes – a white-steepled Colonial nestled among Vermont maples; a cherub-packed basilica commanding a Roman boulevard; a megacampus hard by the interstate. A church can even spring up in a defunct Pizza Hut.
How and where people worship is constantly changing. Denominations may begin with a fervent few, rise to prominence, decline. Others reinvent themselves. And always there are new ones springing up. Churchgoing mirrors shifting populations and cultures. It’s like that old finger-play game of “Here is the church; here is the steeple.” Open the doors: Some churches are empty, some full.
Nowhere is denominational churn as pronounced as in New England. Four hundred years ago, religious refugees fled there, only to establish virtual theocracies. Later came Unitarians and deists, gospel skeptics whose open-mindedness helped frame the US Constitution. Next up were the personal-savior preachers of the first and second “great awakenings,” who fostered a populist Christianity. Then it was on to the transcendentalists with their celebration of nature and community.
Today’s New England is still a religious incubator. While it’s in the forefront of the “unchurched” trend – the growing numbers who see themselves as spiritually minded but not denominational – New England is also seeing a mushrooming of nonmainstream churches. Jeff MacDonald’s cover story in the Monitor Weekly locates that creative burst in a desire for hands-on, make-a-difference faith.
I’m sure it hasn’t escaped your notice that this publication is sponsored by a New England-born denomination that, like many, has both thriving branches and shrinking ones. I asked Margaret Rogers, one of five members of the board of directors of the First Church of Christ, Scientist, about the ebb and flow of religiousness over the decades. She observed that children often push away from the familiar and traditional as they grow up but embrace those same qualities as they mature.
Through it all, she said, there remains “a natural yearning that never goes away,” a hunger for communion with something beyond ourselves and for community with others seeking the same thing. Perhaps that’s why the founder of the Christian Science church, Mary Baker Eddy, saw church in deeper terms than just a building or congregation, describing it as the “structure of Truth and Love.”
In every era, churches change, but not their essential purpose. That’s important, because when we least expect it, we can suddenly wonder why we’re here and where we’re going. Church can give us a way to work that out. The great novelist of faith and family, Marilynne Robinson, in her nonfiction book “Absence of Mind,” describes that sudden startling thought about our purpose in life as originating in the “haunting I who wakes us in the night.” That “I,” she notes, is surprisingly close to the biblical name for God: “I AM.”
That “I” can shake us awake in a soaring cathedral or call quietly in an abandoned Pizza Hut. We can hear it when mowing the lawn on a summer’s day or feel it in a crowd of shoppers on a sparkly Christmas Eve.
And after we wake up?
Maybe church – not the building, but the essence of church – helps us understand what to do next.
John Yemma is editor of The Christian Science Monitor. He can be reached at email@example.com.
All of us go to court at some point, if only to fight a traffic ticket or do jury duty. In court, you can see an attorney eloquently pressing a point, a judge wisely guiding the judicial process, and “the people” rendering carefully considered justice.
That’s in the ideal world, the world of “Law & Order” and “Perry Mason.” Courthouses can also be dispiriting sinkholes where day after day lawyers, clerks, police, judges, parole officers, and social workers try to keep their heads up amid the wreckage wrought by violence, drugs, selfishness, and chronically bad choices. In a courthouse, humanity can too easily be reduced to perpetrators and victims. You can feel guilty just being there.
That’s why it is all the more tragic when someone innocent is sucked into “the system” and becomes its victim.
In a Monitor cover story, Katy Reckdahl examines one particular problem with the judicial system: pretrial detention. An arrest can occur for any number of reasons – from suspicion of involvement in a major crime to an infraction that edges just over the line from misdemeanor to felony. If an individual cannot raise the bail money, he or she can get stuck in jail.
That wait can stretch for weeks or months as the case inches through a congested courthouse. In effect, imprisonment is taking place before the accused gets a fair trial.
There are half a million people in the United States in this predicament. Many of them, if eventually convicted, will already have served so much time in jail before their trials that they will not have to serve prison time. The tragedy lies with the innocent, and with those accused of small-time crimes, who get snarled in the system.
It is tempting to believe that an arrest is tantamount to a conviction, or to see bail as a form of punishment. Where there’s smoke, we often think, there’s fire. And there indeed are dangerous people who need to be detained to keep them from potentially harming others. But to keep someone locked up for an extended period simply because he or she cannot raise bail undermines the constitutional guarantee of a “speedy and public trial.”
Even when jail time is warranted, it seldom improves lives (see our May 21 cover story on the struggle inmates face after being released). At best, jail is an opportunity to change a life for the better. More often, the opposite occurs. That compounds the problem of pretrial detention: Marginally bad behavior may become worse behind bars.
In the great prison movie “The Shawshank Redemption” (1994), the protagonist, Andy Dufresne, memorably remarks: “The funny thing is – on the outside, I was an honest man, straight as an arrow. I had to come to prison to be a crook.”
There are sincere people in and around the US criminal-justice system working on this. As you’ll see in our cover story, one promising approach is the supervised release of an arrested person after proper screening. As in most issues involving criminal justice, solutions aren’t always clear-cut. But the essence of the pretrial detention problem comes down to this comment from a specialist in Washington, D.C.: “Dangerous people get out of jail, and people who are not dangerous but don’t have the money stay in jail.”
That doesn’t seem right. And it doesn’t seem beyond us to figure out how to reserve jail for the convicted.
John Yemma is editor of the Monitor.
The “great man” theory of history was appealing in its simplicity. It promised that you would understand the world if you focused on several dozen individuals – a few kings, generals, and warlords; a scrum of statesmen and scholars; a handful of rebels and scientists. Napoleon and Martin Luther were two examples historian Thomas Carlyle had in mind when he developed the theory.
There’s no doubt that a small number of remarkable individuals ride the flood tide of history to fame and fortune. But that’s not enough to understand the world. The society into which one of these larger-than-life characters is born and the culture and history he or she inherits are of huge importance. Leaders don’t spring from the soil, sociologist Herbert Spencer argued. They have to be understood along with the people they lead and the world in which they operate.
So when you read Scott Peterson’s profile of Ayatollah Ali Khamenei, the man designated as “God’s deputy on earth” by Iran’s political/religious establishment, you also need to know about Iranians; the Shiite faith; the cultural provenance of Persia; and the grievances, violence, and revolutionary experimentation of his nation’s past three decades.
That’s what Scott brings to the table. He has made 30 visits to Iran since 1996, seen hopes for a freer society rise and fall, made dozens of friends, and worked tirelessly to try to understand this rich, turbulent 2,500-year-old culture. His 2010 book, “Let the Swords Encircle Me: Iran – A Journey Behind the Headlines,” provides a multilayered perspective of a complex population and a culture that is both repressive and accommodating.
“For three decades, powerful forces have stood in tension with each other,” Scott writes in his book, “the religious hard-liners against the secular moderates; those who demand isolation against those who yearn for contact with the West. The result has been a destructive imbalance in Iran’s ‘sacred’ political system.... What for some Iranians is a dated, irrelevant governing philosophy holding the country back in political, economic and cultural seclusion is for True Believers still the only one that counts.”
Iranians, like Americans, can be inconsistent. They can love their country, dislike their leaders, and categorically reject foreign criticism of either. Even those deeply opposed to theocracy have their own vision of change, Scott says: “They want to grasp freedom for themselves and wage with their own hands the internal battle that will define what that freedom means.”
So what should we make of Mr. Khamenei, the once-timid cleric who is Iran’s supreme power broker? As you’ll see in Scott’s cover story, Khamenei has his reasons for distrusting the United States, Israel, and secular society. But he is not a madman. He is educated and well read, loves poetry and music, and lives modestly. He is not as schooled or as revered as his famed predecessor, Ayatollah Khomeini. He may believe the Islamic republic is destined for epic conflict with nonbelievers. Or he may just be trying to hold the current system together. In short, he is Iran’s spiritual and political leader and a quintessential product of a complex and contradictory society.
Understanding Khamenei is necessary, but not sufficient, to understand Iran. There’s no better guide to the man or the country than Scott Peterson.
John Yemma is editor of the Monitor.
Well before Columbus or Magellan or Lewis and Clark; before Asian hunter-gatherers crossed the Bering Strait; for as long as people have explored, the world has pulled back its curtain and revealed its bounty.
Expecting another untouched valley or unfished river over the horizon had a profound effect. The American historian Frederick Jackson Turner described the frontier mentality as creating confidence in “a new field of opportunity, a gate of escape from the bondage of the past.”
That confidence that there were more earthly paradises to discover liberated millions from poverty and repression, but often at the price of environmental carelessness. A few pioneers left a small imprint, but they were followed by settlers and developers whose practices were often more cavalier. If there was always a newer New World to find and enjoy, why bother protecting the old one?
By the 20th century, the frontier had faded. Except for a few remote jungles and ocean depths, most of the globe is known, claimed, and cataloged. The frontier mentality remains a source of optimistic expectation, but it is evolving in interesting ways.
Over the past half century, exploration has shifted to the scientific realm and to places beyond our home planet. It’s not that we discover fewer things; it’s that our discoveries have less to do with geography. The pace of discovery has actually accelerated in the age of robotics and knowledge networks. The real action is in smarter ways of managing what we already know about.
You can see that in everything from energy to waste to water. New extraction techniques have vastly increased the productivity of oil and gas deposits. Load management is making the electric grid more efficient. Better transportation and marketing send food produced on one side of the world to the other. And, as William Wheeler notes in our cover story, innovative ways of managing water resources will be crucial in slaking the thirst of Earth’s population of 6.9 billion and counting.
The magic of water is also what makes it a problem: It falls from the sky. Because it is free, it seems to have no value. In wet climates, water is an afterthought and can be a nuisance (ask the people of coastal New York and New Jersey after superstorm Sandy). But if you live in an arid region like inland Australia, northern Africa, Central Asia, or the Great Basin in the United States, you know that water conservation is increasingly important.
Valuing water the way we value oil, say some resource specialists, may help stop its waste and spoilage.
Back in the 1970s, the French marine explorer Jacques Cousteau used to send out fundraising letters with this salutation: “Dear Citizen of the Water Planet.” It was a corny opening line, but it got the point across. We’ve sent probes into space and landed on other planets. So far, nothing compares with our watery, blue-green marble floating in the void. So far, there are no new worlds more attractive than our old one.
That is not to say that we’ve got Earth’s many resource challenges licked. But rather than the age-old pattern of discovering, exploiting, abusing, and discarding Earth’s bounty, people everywhere are learning how to value and protect it.
The water planet is home. Managing its resources in smarter ways is our new frontier.