Upfront Blog

A young boy plays with a toy laser gun near his family’s Nevada campsite at the Burning Man festival. (Jim Urquhart/Reuters/File)

Guns and freedom: the American paradox

By Editor / 03.12.12

Guns were designed with one purpose. They don’t cut rope, drive nails, or propel baseballs. They are built to kill or maim. Much of what you read in the news revolves around guns. Firearms are central to almost every conflict, crime, uprising, or peacekeeping mission. Governments use guns to maintain order. Rebels use guns to challenge authority.

There are more guns per capita in the United States than in any other nation. Some people like guns for hunting, target practice, and skeet shooting. If you have a gentle nature, though, you probably abhor firearms. Every time a confused young person opens fire at a school, a worker assaults co-workers, a stray bullet enters an inner-city home, or a spouse, passerby, political figure, or store clerk is attacked with a deadly weapon – every one of those violent acts is a powerful argument for locking guns away.

 If you believe in the right to self-defense and liberty, however, you probably see gun ownership as natural and just. Every assault, home invasion, robbery, or hostage taking is a powerful argument that a lawfully armed citizen might have been able to stop the crime in progress.

The balance between public safety and individual freedom is not easy. From the Revolution through the settling of the frontier to the 21st century, guns have been embedded in the American experience. The Second Amendment to the Constitution appears to enshrine the right of individuals to own and carry firearms. In 2008 and 2010, the US Supreme Court issued landmark rulings affirming an individual right to possess firearms for lawful purposes. Since then, a slew of state laws have expanded access to firearms and the freedom to carry them in public.

 In the years to come, it appears, there will be even more guns in more hands in more places than ever before. As a Monitor special report notes, no one can say for sure if that will make society more, or less, safe. Scholars such as economist John R. Lott Jr. and criminologist Gary Kleck cite evidence that people who carry concealed weapons have stopped thousands of crimes from happening. But it’s hard to be sure about cause and effect. 

 The Brady Campaign to Prevent Gun Violence, meanwhile, argues that easy access to firearms is the reason that almost 100,000 people are shot or killed in the US each year. But do we blame the gun or the person?

 Another paradox: The spread of guns in the US is happening amid an overall decline in violent crime. Yet violent crime (more than 12,000 murders and 44,000 shooting attacks in 2008) is still far higher in the US than in other industrial countries.

 Guns and gun lore flow through our culture. Many of us grew up with toy revolvers and plastic rifles. We may have learned to shoot at summer camp (I did). We honor the embattled farmers who fired the shot heard round the world; the Old West sheriff whose quick draw dispatched the bad guys; the heroic soldiers, police officers, and law-abiding citizens who band together to defend themselves in a thousand movies. We cringe at gangsters, assassins, and bullies who brandish firearms to intimidate the innocent.

 But if you take your knowledge of guns from pop culture, you have an unrealistic view of what they can and can’t do. Guns are precise only in the hands of trained marksmen. Gun wounds are rarely something a good guy can shrug off. Silent silencers, impregnable bulletproof vests, and bottomless magazines for blazing away are Hollywood nonsense.

It is a shame that we are still a species that feels comfortable with, even celebrates, an instrument built solely to maim or kill. We are, after all, the same species that believes in persuasion and reason and has seen the efficacy of nonviolent movements. Yet ending tyranny and oppression and defending life and liberty still seem to require firearms.

 Martin Luther King Jr. once said that the power of nonviolence rests in the idea that “you not only refuse to shoot a man, but you refuse to hate him.” One day, let’s hope, we’ll see that that radical concept, which runs through all religions and cultures, is far more powerful than black powder and lead.

 John Yemma is the editor of The Christian Science Monitor.


Marine One, with President Obama aboard, takes off from the White House lawn. (Evan Vucci/AP)

Oval Office or Starship: What a leader needs

By Editor / 02.22.12

Who was the best captain of the Starship Enterprise: the swashbuckling James T. Kirk, who relied on charm, instinct, and, frequently, his fists; the cool-headed Jean-Luc Picard, who could outthink half the known universe; or the compassionate, collaborative Kathryn Janeway, who enlisted her crew’s teamwork to solve intergalactic puzzles?

Pardon the Nerd 101 quiz. I’m trying to make a point about management styles: There will never be universal agreement on what type is the most effective. Circumstances always differ. Almost any manager – even bizarre ones like Muammar Qaddafi or Dilbert’s pointy-haired boss – succeeds somewhere for some amount of time. 

Being a great chief executive officer may or may not make you a good president. Governors and generals have been presidents. So have lawyers, gentleman farmers, a teacher, a college president, a movie actor, a tailor, a haberdasher, and a community organizer. Why not a CEO?


No matter what kind of résumé a leader brings to the White House, governing requires firmness, compromise, compassion, cunning, and an ability to sort out the fast-breaking complexities of global and domestic politics, economics, culture, sociology, science, and several dozen other disciplines at lightning speed. It also helps to be a good communicator. And likable. And have a nice family. And a cute dog.

At 1600 Pennsylvania Avenue, management skills matter, but so does everything else. Nobody takes that job with a portfolio sufficient to cope with all the possibilities that may crop up. Good management is about knowing what you know and what you don’t know and working with your team, your allies, and sometimes your opponents to try to influence events so that they (1) do the least harm and (2) ideally, do the most good for the most people.

Since the pioneering days of Frederick Taylor, management has tried to be a science. The part that Mr. Taylor concentrated on – how workers perform their tasks most efficiently – can be scientifically analyzed and replicated. But management also includes the more holistic approaches advocated by Peter Drucker and a cast of thousands of gurus who tout their advice in the Harvard Business Review and on the racks of airport bookstores. 

Management is art as well as science. The art is to make it seem as though one’s charges are not being managed. There’s nothing so chortle-worthy as a tin-eared boss spouting jargon about Six Sigma, delayering, empowering, reengineering, or rightsizing. There’s nothing so transparent as when el jefe awkwardly pulls a “one-minute manager” or decides to “manage by walking around.” (Let’s see, it says on page 36 ...)

Not that management ideas don’t have value. It’s just that a good manager internalizes the lessons learned from management experts and makes them her or his own. Like everything else in life, good management starts and ends with integrity. Another way to put it is that “the strongest force in the universe is a human being living consistently with his identity.” (That would be management guru Tony Robbins.)

Speaking of the universe, here's a pop quiz on leadership: Which starship captain would be best in an alien encounter? 

For my part, I’d go with this lineup: Janeway to motivate the crew on the long journey through space; Picard to devise an enlightened protocol for first contact; and, of course, Kirk in case the local bad guys want to rumble.

John Yemma is the editor of The Christian Science Monitor. 


A girl arranged origami cranes in Valparaiso city, Chile, on Aug. 6, 2010, the 65th anniversary of the atomic bombing of Hiroshima, Japan. (Eliseo Fernandez/Reuters)

Is Iran rational enough for MAD?

By Editor / 02.17.12

The acronym said it all: MAD. From the late 1940s until the late 1980s, the unthinkable idea of “mutual assured destruction” was the centerpiece of the cold war. You may recall how it worked: Automatic retaliation in a nuclear war would be so destructive that both sides would lose everything. 

Nuclear weapons became a kind of “pagan god,” a Monitor editorial observed on the 40th anniversary of the bomb blast at Hiroshima. And “to appease this insatiable nuclear deity, more and deadlier nuclear weapons are made and deployed – as if it must be fed and placated to keep it from unleashing nuclear wrath upon the people.”

Was MAD crazy? If everything the human race ever does is rational, then the threat of Armageddon was off the charts wacko. But MAD arose in a century that had already produced two world wars, unprecedented genocide, saturation bombing, and any number of other shameful chapters. MAD was not especially odd in that lineup. Perverse as it seems, the assurance of mutual destruction in the second half of that century may have been the reason that Aug. 9, 1945, was the last time a nuclear weapon was detonated in an international conflict. 

MAD, in other words, may have been so crazy that it worked. Could it still? That question is at the heart of Scott Peterson’s examination of the threat posed by a possible Iranian A-bomb.

A South Asian diplomat who has had firsthand dealings with Iranian leaders recently observed that Iranians have a fairly normal desire to assert themselves as a nation. They see India, Pakistan, and Israel with nuclear weapons and don’t accept that they should be blocked. Their isolation, which is growing under a tough international sanctions regime, intensifies their sense of both defensiveness and entitlement.

We don’t know if Iran will go nuclear, but it appears to be gaining the scientific and industrial ability to do so. Attempts to stop Iran’s nuclear program through stealth or outright attack would, at best, only delay it. So let’s think the unthinkable for a minute. Let’s think about Iran getting the Bomb.

If Iran’s leaders are rational, self-interest should keep them from doing anything rash, knowing they would face massive retaliation. But there is a troubling undercurrent in Iranian thinking. President Mahmoud Ahmadinejad and other Iranian leaders love to make apocalyptic threats, especially about the destruction of Israel. These are often wrapped into an eschatological vision involving the return of the Mahdi, or 12th imam, of Shiite Islam, who is believed destined to wage an all-out holy war in which Islam will prevail. Iranian Shiism also glorifies martyrdom. Add the A-bomb and you can see why Israelis and many others are concerned that mutual assured destruction might not work with Iran.

No one can say whether Iran’s threats are a clear and present danger or just political theater. But one thing Iran has in common with all other countries is that it is made up of millions of people interested in living a good life, building businesses, and raising families. It would take a very mad leader to rain down destruction on all those lives in the hopes of proving a theological point. 

Similar concerns about irrational ideologies and dark intentions were present when the USSR and China got the Bomb. Nuclear war with Russia or China is not unthinkable today, but it is far-fetched, even though we still live in a world where the United States has 8,500 nuclear weapons, Russia has 11,000, and China and six other nations have hundreds more. While MAD was the first, crude effort to keep nuclear ambitions in check, diplomacy, cultural exchanges, and trade have worn away suspicions over time.

Although that pagan god is still being fed and placated in the Middle East today, the rest of the world has largely walked away from it. That Monitor editorial marking 40 years after Hiroshima said it best: “Mankind cannot for long be intimidated into peace.”

John Yemma is the editor of The Christian Science Monitor.


Some of collector Sue Wilson’s nearly 100 wedding-cake toppers, Norfolk, Va. (Stephen M. Katz/The Virginian-Pilot/AP)

Millennial generation: What's love got to do with it?

By Editor / 02.13.12

It is a truth universally acknowledged that a happily married man writing about courtship must be in need of a serious talking to. Nothing so lacks credibility, especially with younger readers. Oh, sure, he may on occasion text his wife a sweet nothing using “r” as a verb to show how with-it he is. He may throw caution to the wind and attempt to dance “the robot” at a nephew’s wedding. But he almost always will put air quotes around such not-really-modern-anymore fads to signal his actual distance from them.

For those reasons and more, it is probably best for me to send you directly to Eilene Zimmerman's fascinating and thorough exploration of modern dating. 

Depending on your vintage and values, your view of 2012 courtship may range from quiet approval to mild alarm. There is, for instance, a definite standoffishness about marriage in the Millennial Generation (also known as Generation Y, meaning those born in the 1980s and early ’90s). But even if young people are waiting longer before saying their vows than any previous generation, they aren’t necessarily anti-marriage. What they most want, it appears, is to get marriage right.

“They’ve seen a lot of divorce in their parents’ generation,” Eilene told me the other day. “They’ve been through difficult holidays. But they’re really into family and marriage. They actually want to move to the suburbs and raise a family one day – just not now.”

Many have struggled through a bleak job market while carrying big college debt, so they are naturally cautious. Even the dreaded subject of sex is not what you may think in an age of ubiquitous contraception and noncommittal “hookups”:  Despite a casual attitude toward intimacy, risky behavior is not something this well-warned generation embraces. In some ways, says Eilene, Gen-Y is like the famed GI generation that fought in World War II and built the postwar world. By midcentury, in other words, this may be a fairly conservative cohort.

Eilene is a Gen-Xer. I’m a baby boomer. Talking about generations is unavoidable in trying to understand society. The drawback, of course, is that as with every other way humans categorize themselves – race, gender, religion, class – there are tremendous variations within the categories. Music, dress, slang, and hairstyles may characterize an age group, but those are superficialities. Individuals within Gen-Y are carving out distinct paths when it comes to tricky issues like intimate relations. That was true even of us boomers, not all of whom went to San Francisco with flowers in our hair.

In the not too distant future, Millennials themselves may wonder about the dating scene for the next demographic cohort, Generation Z (aka, Digital Natives). They, too, will have to use air quotes when it comes to current mores.

Society is a large and interesting blend of ages, communities, families, and individuals. But when it comes to affairs of the heart, most people are looking for the same thing: deep and abiding commitment. The StoryCorps project has assembled a collection of long-running love stories in a book titled “All There Is.” The collection echoes the testimonials from elderly couples that were sprinkled throughout the 1989 date movie “When Harry Met Sally” – how we met, what we mean to each other, the relationship two people build throughout a lifetime. One of the “All There Is” couples mentioned six nice things they learned to say to each other throughout their marriage: You look great. Can I help? Let’s eat out. I was wrong. I am sorry. I love you.

From Neolithic times to the digital age, those have been words to live by.

So, Honey, if u r reading this, u look great. Happy Valentine’s Day. Let’s eat out tonight. I promise not to dance the robot.

John Yemma is the editor of The Christian Science Monitor. 


An employee checks halogen inserts for low-energy-consumption light bulbs at a factory in France. (Vincent Kessler/Reuters)

Reinvention: The rewards of trying again

By Editor / 02.05.12

Who doesn’t love an invention? Think of the light bulb, which had never shone in all of history until Thomas Edison switched it on on Dec. 31, 1879. Think of lasers, helicopters, microchips, elegant equations like E = mc², and even modest wonders like batteries, Velcro, and air conditioning. We honor inventors, enrich them, ask them about the meaning of life.

Reinvention isn’t in that league. The tip-off is the “re,” meaning it’s been done before. If invention is the dazzling hit, reinvention usually begins as a miss. But a miss that is taken back into the workshop, rethought, reworked, and brought out for a second, third, or fourth try can change the world. The light bulb, for instance, only stayed lit after it hadn’t at least 6,000 times. Each time, Edison and his co-workers took stock of what went wrong, made improvements, and tried again. His invention was a serial reinvention.

For many workers in the wake of the 2007-09 recession, reinvention has not been by choice. Nor has it been easy. But here’s what reinvention always is: necessary.

For one thing, economic survival depends on reinvention. Job security has all but vanished in most industries. Owning a home or shoveling money into a 401(k) is not the path to financial security it once was believed to be. That’s bad news. The good news is that so many people have experienced financial setbacks, layoffs, or job shifts in recent years that it is clear that failure is not about character flaws.

Failure can be an opportunity. Saying that, of course, is easier than living through it, especially if you’re out of work, in debt, or clinging to a job you don’t like because you need the money. But reinvented careers can be the ones in which people learn more about who they really are and make a go of something they really love. (You can see evidence of that in this special report in The Christian Science Monitor Weekly news magazine.)

The key seems to be to stop mourning the past, honestly assess your skills, envision where you want to be, gain new skills, and then go for it. You may not end up where you think you should, but you will be learning with every step. 

As Abraham Maslow, who developed the famed “hierarchy of needs” – a pyramid that everyone climbs from basic requirements like food and water to an apex of creativity and achievement – put it: “One’s only failure is failing to live up to one’s own possibilities.”

Besides, people who have seemed touched by greatness still have lives to live. Like athletes and musicians, most inventors are highly productive in their youth. At 25, Albert Einstein experienced a “year of wonders.” Working as a patent clerk, he achieved conceptual breakthroughs that revolutionized modern physics. Einstein led a long and useful life, and some of his most profound discoveries happened long after he was 25. But the sum of his brilliance during that one year was never repeated.

I once asked an octogenarian physicist about his year of wonders. Among other things, Martin Deutsch, who taught at the Massachusetts Institute of Technology, had discovered positronium, a short-lived atom made of an electron and its antimatter counterpart, the positron. “In 1951 and ’52, I had a great creative outburst that I’ve never had again,” he told me. “It was a virtuoso performance.”

His colleagues thought he would win the Nobel Prize. He was not bitter that he did not. Instead he reinvented himself as a teacher and was satisfied that “there are people who have become what they have become because of what I have taught them.” 

At the end of our conversation, he pointed out a spruce tree in the corner of his garden. He had rescued it when his neighbor threw out a bunch of potted plants. Day after day it had sat there, refusing to turn brown. “It wanted to live,” he said.

Martin Deutsch died in 2002. His students are pushing physics forward in the 21st century. And that spruce, which didn’t make it as a potted plant, flourished.

John Yemma is the editor of The Christian Science Monitor. To comment on this column or anything else in the Monitor, please e-mail editor@csmonitor.com. 


A man transports bags of animal feed via horse cart on a foggy day in Lahore, Pakistan. (Mohsin Raza/Reuters)

Green energy isn't always good energy

By Editor / 01.27.12

Solar, wind, hydro, and geothermal are widely considered benign energy sources. For the most part, they are. They harness nature without producing noxious emissions or significant waste streams. They don’t require strip mining, punching a hole in the seabed, fracturing bedrock, or splitting atoms. From sailboats to south-facing gardens, hot springs to millstreams, green energy’s friendly reputation predates hydrocarbons and fission by centuries.

But in their modern application, even these ancient energy sources have downsides. The manufacture of most photovoltaic cells, for instance, involves nitrogen trifluoride, which the Scripps Institution of Oceanography says is a potent greenhouse gas when it escapes into the air. Solar cells also block sunlight from grass and flowers that otherwise would bask in it. When you dam a river, you constrict fish migrations and deprive alluvial plains of nutrients. Geothermal often means power plants atop scenic areas. And wind, the subject of this week’s cover story, needs enormous wind turbines. Birds and bats fly into them. Noise and visual pollution can be annoying.

Wait. I know what you are thinking: Green energy drawbacks are tiny compared with Chernobyl, Fukushima, the Exxon Valdez, the BP oil spill, and global warming. Absolutely right. But part of the reason the drawbacks are minor is that green energy is still a fraction of overall energy production. A few windmills on the Zuider Zee are as charming as tulips and wooden shoes. But when you erect acres of wind turbines, you’ve got a scale problem.

Consider the world before the internal combustion engine. Today, most people consider the automobile a Faustian bargain, a huge convenience that is nevertheless blamed for altering our landscape and atmosphere. In its first years, however, the horseless carriage was not just a technological marvel but an answer to a significant crisis. As urban populations exploded in the 19th century, so did the urban horse population. The effect on the environment, public safety, and public health was awful and heading for catastrophic, writes Eric Morris in an excellent 2007 article (you can read it here) in the University of California Transportation Center’s Access magazine: “One New York prognosticator of the 1890s concluded that by 1930 the horse droppings would rise to Manhattan’s third-story windows. A public health and sanitation crisis of almost unimaginable dimensions loomed.” And horses, which could rear up or bolt for no apparent reason, were even more dangerous per thousand people than automobiles. They were also exploited mercilessly in the grim economics of 19th-century cities.

Black Beauty is a magnificent creature with an insignificant waste stream. Ten thousand are an environmental nightmare. Henry Ford helped solve that problem. But now we burn through so many hydrocarbons that we have a new environmental crisis.

Green energy is good energy. But it is not perfect. Think about wind turbines. Over the Christmas holidays, a 26-story-tall one popped up by the highway I take to work. It’s big – “War of the Worlds” big. That single turbine is a novelty. An army of them can become a huge controversy, as has happened off the picturesque south coast of Cape Cod, where a massive wind farm on Nantucket Sound is inching forward amid intense local opposition.

Now wind energy is exploding across the globe. In areas such as Mexico’s Isthmus of Tehuantepec, questions of exploitative development have accompanied the boom. That is likely to be the case in many parts of the developing world, which has all too frequently been despoiled to feed the energy and raw-materials needs of industrial nations. NIMBY issues and indigenous resistance are bound to multiply as fast as windmills, solar farms, and other green energy installations as the world races to diversify away from hydrocarbons to protect the climate and at the same time accommodate both the 7 billion people now on the planet and the 3 billion more that are likely to arrive by 2070.

We solved the horse problem with horsepower. Now we have a horsepower problem. Solving it presents a new set of problems. There’s always a job out there for a new problem solver.

John Yemma is the editor of The Christian Science Monitor. 


Alexandra Ventura, age 13, and her sister Susie, 8 (r.), play on a backyard trampoline in Santa Clara, Calif. (Tony Avelar/Special to The Christian Science Monitor)

Why play's the thing

By Editor / 01.23.12

Only a Scrooge would frown on child’s play. Or to be more modern: only a Severus Snape. Sure, children can goof off for a while. But the age at which they are required to put away childish things, straighten up and fly right, and master the Hogwarts curriculum keeps getting younger and younger.

Standardized testing, helicopter parenting, a society obsessed with good colleges and successful careers – there are plenty of reasons why time for make-believe and play-acting has been shrinking. In a new Monitor special report, Stephanie Hanes looks at overprogrammed childhood and the educators, parents, psychologists, and others who are trying to reverse the tide.

Stephanie is a veteran correspondent who has had demanding assignments for the Monitor in sub-Saharan Africa and other parts of the world. Her interest in child’s play – like her interest in the “Disney princess effect” (see our Sept. 26, 2011, cover) – was sharpened by her introduction less than a year ago to an important new journalistic source: Madeline Thuli Hanes Wilson.

Through the eyes of her daughter, Maddy, Stephanie says she has been seeing how modern childhood is too often torqued by commercialism and parental anxiety. “I was looking at the books that I’m reading to her and realized that, wow, so many of them are selling products,” she says. That’s not unlike the rampant merchandise tie-ins to girlhood that Stephanie reported on earlier. 

At 11 months, Maddy already has an extraordinary number of organized activities she can take part in. Her favorites? “When people are on the ground interacting with her,” her mom says.

Now, plenty of parents in developing countries would like the opportunity to expose their children to organized activities. And toys and games aren’t evil. They can make a kid feel enriched, boost skills, and familiarize youngsters with the technological world they are entering. But relentless scripting of child’s play has its drawbacks.

Free time and make-believe boost physical development, socialization, and – most important – the imagination. A huge amount of what we value as a civilization comes from the what-if side of us. While we must follow rules and recipes, train ourselves and test our skills, our artistic side needs time to wonder, improvise, and dream. Productive writers from Shakespeare to Charles Dickens, Dr. Seuss to J.K. Rowling, have coupled imagination with discipline. Wolfgang Amadeus Mozart talked of musical ideas emerging when he was alone, sometimes when he was sleepless or taking a walk after a good meal. To muse and mull and eventually hear a symphony in his head, he said, “is perhaps the best gift I have my Divine Maker to thank for.”

Mozart might have been the most overprogrammed child of the 18th century. Under his father’s tutelage, he was adept at violin and keyboard by the age of 5, and was composing and performing for European royalty.

 By today’s standards, he would have been locked and loaded for the Juilliard School since he was in diapers. 

It takes both imagination and discipline to produce works as original as “The Magic Flute.” That winning combination is true not just of literature, music, and painting but of science as well. The scientific method is meant to prove or disprove a hypothesis. But the hypothesis – the hunch, the what-if – has to come from somewhere. Angels must be entertained. 

The great thing is that play needs little in the way of investment or accessories. It just needs freedom to happen. Even at 11 months, Maddy has all sorts of play options, says Stephanie. One of them is going out and looking at trees: “We hold the leaves. This one is green. This one is brown. It doesn’t cost anything.”

John Yemma is the editor of The Christian Science Monitor. 


A converted school bus serves as a mobile classroom in the slums of Hyderabad, southern India. (Krishnendu Halder/Reuters)

Beyond education: How do you build geniuses?

By Editor / 01.17.12

Benjamin Franklin enjoyed no familial boost of wealth, fame, or education. He was one of 17 kids. His father was a candle-maker. In today’s terms, he started life well within the 99 percent. But through systematic study, practice, thrift, and most of all an optimistic embrace of possibility, he rose to become the famed intellect, scientist, statesman, diplomat, and writer honored on currency, in statues, and by namesake institutes dedicated to higher education. 

Everybody should be a Franklin. India needs up to 100 million Franklins.

The world’s most populous democracy has been growing at a healthy clip, but a moment of truth is looming. In a mere eight years, 100 million more people – equivalent to the population of Mexico – will enter the Indian workforce. Vanishingly few of them will enjoy a familial boost of wealth, fame, or education.

In the old India, those 100 million would be a nightmare, a tsunami of humanity racing toward an overburdened infrastructure and social system. In the new India, they are possibly something else. It all depends on where they sit on the scale that runs from basic need to productivity to creativity. It all depends on what the magic of education can do for them.

Let me take you on a detour to explain why this is so important. In the late 1980s, I was on a reporting assignment in Japan, which at the time was considered an economic superpower in the way that China is today and India hopes to be. There was no doubt about the productivity of Japanese workers. The watchword at the time was kaizen, meaning “continuous improvement,” and Japanese workers seemed to make everything better – TVs, cars, Walkmans, fax machines. 

But Naohiro Amaya, a government adviser on education, was worried. From the mid-19th century until the late 20th century, he said, Japan had succeeded in producing a high number of moderately educated people. That helped raise the overall capabilities of Japan as it transitioned from feudalism to industrialization. “Standardized people of pretty high quality worked pretty well,” he said, “but now they won’t meet the demand of future Japanese society.” It was not enough to be better and better at what already existed. Japan needed Franklins – innovators who would ask basic questions, experiment, see the unseen. Without breakthrough thinkers, the Japanese miracle would stall. 

Japan has been stalled for two decades. Now think of the stakes in India: Japan’s population is decreasing; India’s is growing faster than that of any other nation on the planet. 

Some nations are blessed with natural resources. Some have a legacy of wealth by virtue of past conquests or economic achievements. But the ones that prosper generation after generation have a culture of Franklins – smart people eager to get smarter. Economist Gary Becker of the University of Chicago, a pioneer in the study of human capital, notes that “large increases in education and training have accompanied major advances in technological knowledge in all countries that have achieved significant economic growth.”

India’s challenge is epic in scale. The Monitor’s Ben Arnoldy, who has just completed a three-year assignment in India, produced a special report on what India faces. Ben notes, “Aside from the eye-popping number of colleges India hopes to build – some 50,000 in a decade – the country faces a challenge of reforming how it teaches to produce knowledge workers. The good news on this front is that many Indians love to debate and argue. The challenge is to get teachers who will allow that energy into the classroom.”

Education of any kind helps. But as Naohiro Amaya knew, the best kind of education doesn’t just produce productive workers. It frees thought. That is what India – and every other nation – needs most.

John Yemma is the editor of The Christian Science Monitor. To comment on this column, please e-mail editor@csmonitor.com. 


A Mayan statue stands in Playa del Carmen, on the Caribbean coast. (Israel Leal/AP)

At the dawn of 2012: future imperfect

By Editor / 01.03.12

We can laugh about it now, but in 1999 many serious people were concerned about the Y2K computer bug bringing the modern world to its knees. Combined with the millennial turn of the calendar, Y2K prompted all sorts of predictions about what the new year would bring. Some were optimistic. Some were ominous.

 Jan. 1, 2000, dawned. Nothing bad happened. Perhaps all the worry caused responsible people to fix their computers. Or maybe it was just hype to begin with. The main point is that no one got the future right. 

We seldom do. In his 2011 book, "Future Babble: Why Expert Predictions Are Next to Worthless, and You Can Do Better," Dan Gardner explores famous forecasts and why they failed. There are many reasons we get the future wrong, he says. For one thing, we predict based on the present, so the future is an extrapolation, a pumped-up version of today. That's why in the 1930s prognosticators saw flying cars but not the Internet, enormous battle tanks but not stealth drones.

We also generally stay with conventional wisdom. When everybody in the late 1980s said the Japanese economy was unstoppable, no one broke from the pack to predict the lost decades of the '90s and '00s. What does that say about China now? "Take a coin from your pocket," Mr. Gardner writes. "Flip it. You'll have a 50 percent chance of being right, which is as good as that of the experts."

 That wisdom can be applied to the stock market, to politics, technology, consumer tastes, and most complex events. So why bother predicting? Because we have no choice. We try to make sense of the unfolding future as we travel toward it. Gardner's best advice: Bring a large suitcase of humility on your journey. "Stride confidently forward in the dark and you're likely to feel quite pleased with yourself right up until the moment you walk into a wall."

At the turn of the year, Monitor correspondents took stock of the big stories of 2011: post-occupation Iraq, Afghanistan, the Arab uprisings, Europe's financial crisis, affluent China's newfound soul-searching, the economic rise of Latin America, Africa's surprising self-help boom. Each article stretches forward as far as possible. But by mid-January, we admit, the world may look different in ways we can't predict. 

We know a few things for sure about 2012: Madonna is set to perform at the halftime of the Super Bowl; the Summer Olympics kick off in London in July; NASA's "Curiosity" rover is due to land on Mars in August; the US presidential election takes place on Nov. 6.  Oh, and pencil in the end of the world for December. 

Not really. I'm only mentioning the end of the world because in 2012 you are going to hear a fair amount of buzz about it. Remember how the Gregorian-calendar millennium occasioned all sorts of anxiety? A similar subcultural stir surrounds 2012 and the Mayan calendar.

Now, the Mayans had a fascinating civilization and left behind stunning ruins in the jungles of Central America. They were pretty good astronomers. But most civilizations have had pretty good astronomers, the better to predict when to plant and harvest. Even if the Mayan calendar is due to click over in 2012, there's no evidence they saw the world ending, despite what faddists say. And there's no evidence they were any better at predicting the future than we are.

Jan. 1, 2013, will dawn. Life will go on.

Why be so confident? Because a strong argument for life going on can be found in the lives and aspirations of people who are building the future. See, for instance, this report (click here and here and here) profiling 30 remarkable people under 30 years old. Each in her or his own way is casting a line into the mid-21st century – in agriculture, social media, green transportation, the arts, human rights, information technology, politics.

If you must extrapolate about the future, you couldn't do much better than to start with the clever, hopeful, humanity-embracing ideas that these 30 under 30 – and millions like them – are pursuing.

