Editor's Blog
Boston kindergarten teacher Kerrin Flanagan prepares her classroom for new students. (Joanne Ciccarello/The Christian Science Monitor)
Teachers who excel: A lesson from Miss Smoot
Where does intelligence come from? Biologists look to organic structures. Psychologists study influences and experiences. Theologians look to the spiritual.
All can agree that intelligence needs care and feeding. For that we have teachers to thank – parents, an aunt or uncle who takes an interest in us, a religious guide or workplace mentor, a thoughtful friend.
That’s the informal network. The formal one is the subject of this week’s cover story: schoolteachers. There are more than 7 million teachers in the United States, and millions more abroad. Teachers, writes Monitor staffer Amanda Paulson, are the most important factor in student learning – more important than textbooks, tests, computers, classrooms, peer groups, study guides, or any other aspect of the education industry.
Almost all of us have felt the embers of intelligence glow because of a teacher’s careful attention. For me – and for generations of 12th-graders at William B. Travis High School in Austin, Texas – Jane Smoot was that teacher. Her love of prose and poetry inspired students across four decades. Her enthusiasm when a young writer assembled a composition in a way that conveyed the essence of an idea made you want to do even better next time.
But what makes a good teacher? And how can we make more of them? Amanda seeks answers. In our age of metrics, it is tempting to impose a formula for teaching excellence. As we know from standardized testing of students, data can be useful in charting progress (or slippage) over time and ensuring that no student is left behind. But metrics cannot capture the quicksilver of excellent teaching. For that, more qualitative observation is needed. Amanda explores that complex and controversial process.
For extra credit in understanding where excellence comes from, I recently checked out a book called “Burned In: Fueling the Fire to Teach.” It is a series of short essays by teachers about what motivates them, with an emphasis on how to avoid the too-common problem of burnout. In the first three years of their careers, half of all teachers get discouraged and quit. That’s an alarmingly high attrition rate. They feel dispirited, disrespected, or out of their depth. They feel impoverished, frustrated, swamped by paperwork and meetings. They usually feel all of those burdens simultaneously.
And yet, as Michael Dunn, a veteran high school English teacher, marvels, think of what they do: Each fall “there’s a whole new crop of human beings to grow, who never knew they could write a sonnet, or how effectively they are ‘played’ each day by advertisers, or how blessed they are to be able to question their government, or how passionate Emily Dickinson really was.…”
Rosetta Marantz Cohen, a professor of American studies and education at Smith College in Northampton, Mass., has studied teachers who have stayed happy and committed. They are sustained, she says, not just by the mission of nurturing young minds, or by the high calling of safeguarding civilization. Most often, she finds, they love their subject – literature, language, chemistry, math. They pursue their subject inside and outside the classroom.
Students know when teachers teach what they love. We knew that about Miss Smoot. You probably have a Miss Smoot in your life, too. Thank that person for stirring the embers of your mind.
John Yemma is editor of the Monitor. You can reach him at editor@csmonitor.com.
Terry Willey lights a candle during an Aurora, Colo., church service in support of shooting victims and others. (Barry Gutierrez/AP)
After Aurora: the role of media violence
Some of the most celebrated TV shows of this era are also some of the most violent: “The Wire,” “Breaking Bad,” “The Sopranos.” It’s the same with movies – from “Pulp Fiction” to “Batman.” And video games. And books. Violence has a long and rich tradition in the arts as a plot accelerant and drama intensifier. Shakespeare employed it liberally. Homer, too.
But for as long as there have been arts, people have wondered what effect violent content has on behavior. Plato wanted to keep young people away from intense plays, which he felt skewed proper character development. His student Aristotle disagreed, arguing that drama had a cathartic effect, channeling off anger and aggression.
We’re a little closer today to settling the issue, which often arises after an act of mass violence such as the one in Aurora, Colo.
Common sense says make-believe images must have an effect. If the media were unable to sway thought, a hole the size of Gotham City would be blown in the advertising industry. There would be no reason to saturate the airwaves with political ads, sponsor sports spectacles, or launch big marketing campaigns.
A mountain of behavioral studies has been amassed in recent years showing that media messages are, indeed, persuasive. That goes not just for consumer choice. A 2006 review by L. Rowell Huesmann and Laramie D. Taylor for the University of Michigan’s Institute for Social Research pulled together dozens of careful studies. Conclusion: “Media violence poses a threat to public health inasmuch as it leads to an increase in real-world violence and aggression.” Every category of violent imagery had an effect: fictional TV and film, TV news, video games.
At the same time, recent neuroscience experiments appear to indicate that prolonged exposure to violent and aggressive imagery decreases inhibitions about actual violence and aggression. While all such research remains provisional, the preponderance of evidence does seem to side with Plato, not Aristotle.
Prof. James Klagge, who teaches philosophy at Virginia Tech, has thought a lot about this age-old debate. Even Plato, in his desire to edit and censor, he notes, couldn’t come up with a mild alternative for young people to experience. He endorsed “The Iliad” and “The Odyssey,” which are every bit as bloody and intense as anything Christopher Nolan or Quentin Tarantino would produce today (e.g., the grim story of Polyphemus, the Cyclops).
There’s little, it seems, we can do to rid our lives of violent themes. Professor Klagge points out that anyone who has tried to raise boys in an environment free from war toys or playtime shootouts almost always sees these influences assert themselves anyway. And if you recall the Bible story of Cain and Abel, the first two boys on earth, you know that Cain didn’t go to the movies or play Mortal Kombat to learn how to slay his brother.
So while a constant diet of violent images can’t be good for you, censorship doesn’t seem to work either. Which leaves – what?
How about moral education? The study and practice of doing what is right is not a silver bullet in a bullet-riddled age of entertainment. It is a long-term defense project – as old as religion and philosophy – designed to counter dark thoughts of violence and aggression and build a culture of character, honor, and respect.
John Yemma is editor of the Monitor.
Sunrise over the J Bar L ranch and Red Rock Creek in Centennial Valley, Montana. (Ann Hermes/Staff)
Tracing America's green roots
The two rival strands of American environmentalism – nature untouched versus nature managed – can be traced back to John Muir and Gifford Pinchot.
Muir, founder of the Sierra Club, was a purist. Brought up in a strict religious household, he found spiritual uplift in wilderness, especially in the American West. The mountains and streams of the Sierra Nevada were his church; the forest was sacred. He wanted nature reserves left alone and believed the only resource humans should harvest from them was the restoration of the soul.
Pinchot, the first head of the US Forest Service, was pragmatic. The son of a wealthy developer of land and lumber, he saw forests and wild lands as assets to be exploited – albeit carefully and with consideration of the needs of future generations. Conservation, to him, was not about sequestration and prohibition. It was husbandry on a grand scale.
Let’s be honest. It is impossible to choose either philosophy exclusively. A cathedral of pines is at least as magnificent as Notre Dame. No skyscraper can compare to a mountainside bathed in sunrise. An alpine lake happened upon after a long hike; a sea of undulating prairie grasses; a waterfall – almost any waterfall – these are psalms for the human heart.
And how do humans get to experience them? Probably by burning nature’s hydrocarbons, drinking its water, and somewhere along the way employing its minerals and timber in support of life and livelihood. We may drive a hybrid, choose organic vegetables, and scrupulously recycle, but even the greenest of us has to admit that natural resources feed the superstructure of the civilization in which we live.
We are John Muir when we take a weekend walk and are awestruck by an encounter with a fawn. We are Gifford Pinchot when the alarm goes off on Monday morning. Every one of us balances purist aspirations with practical needs.
In a Monitor cover story, Todd Wilkinson explores that balance, focusing on ranching in the West. He introduces us to a new generation of ranchers who are concentrating on sustainable practices. Cattle lands that once would have been trampled and depleted are being managed in smart new ways that decrease the environmental impact and allow the region’s native flora and fauna to thrive.
Unlike the “sagebrush rebels” of a generation ago who saw environmentalism as silly and intrusive, these green ranchers consider healthy land and water crucial to current and future generations. As one rancher tells Todd: “Lots of different people talk about ways that agriculture needs to be sustainable, but we are living it.”
Todd knows the West. In a cover story last summer, he examined the complexities and tensions that have accompanied the return of wolf populations in the region. Attacks on cattle or sheep have been of particular concern. His green-ranching report is, in a sense, a follow-up. Sustainable ranching, it turns out, may offer a solution: When wildlife such as deer can find clean water, rich forage, and adequate cover on ranches, they flourish – and wolves have a shot at traditional prey rather than going after livestock.
Green ranching is where Muir and Pinchot blend. Humanity and nature can’t be separated. But nature can be handled with care.
John Yemma is editor of the Monitor.
Sharolyn Scott, a 400-meter hurdler from Costa Rica, trains for the London Games. (Juan Carlos Ulate/Reuters)
The end of 'faster, higher, stronger?'
"Faster, Higher, Stronger.” The Olympic motto captures the pure athleticism we will witness after the great caldron is lit in London July 27. Individuals and teams from around the world will push themselves past the limit of human endurance to win a gold medal and, if possible, set a world record.
Olympians – and we lesser specimens watching from our sofas – start with roughly the same factory-supplied bodies, breathe the same air, navigate the same terrestrial conditions. Their physical feats, however, put them on the far end of the human bell curve.
Still, while dramatic contests will take place and heartbreakingly close races will occur during the Games, scientists increasingly believe humans are hitting the limits of unassisted athletic performance. There’s only so much, it seems, that Homo sapiens’ muscles and lungs can accomplish.
Specialists at France’s National Institute of Sport, Expertise, and Performance have examined the top 10 performances in track and field and swimming during the modern Olympic era (1896 to 2008) and found that most records have stagnated since 1988. “Our physiological evolution will remain limited in a majority of Olympic events,” they concluded in their 2010 report.
Records may still be broken. In rare cases, extremely exceptional individuals may still emerge. High-tech footwear and no-drag swimsuits may provide a temporary edge – until everyone gets them or they are banned. The latest performance-enhancing drugs and futuristic procedures such as gene therapy may assist a few unscrupulous athletes. But as Peter Weyand, a biomechanics and physiology professor at Southern Methodist University in Dallas, wrote in a 2009 Monitor commentary, record-setting may “no longer lie within the traditional limits of human biology.”
If humanity’s unassisted quest for statistical glory is all but over, that may not be a bad thing. At the world-record level, microscopic differences in finishing times are increasingly trivial. Anything from a bad night’s sleep to a change in the weather can swing a performance by a fraction of a second. The American swimming phenomenon Michael Phelps had an amazing comeback in the 100-meter butterfly at the 2008 Games. But the comeback was the story. His 1/100th-of-a-second victory over Serbian swimmer Milorad Cavic was so close that the race was essentially a tie.
The gracious Mr. Cavic epitomized the Olympic spirit when he said he was fine with the result because “there’s nothing wrong with losing to the greatest swimmer there has ever been.” So maybe the old ideal of “how you play the game” is beginning to supplant the relentless drive to win.
In a Monitor cover story (click here to read it), we introduce you to eight remarkable Olympians from around the world – among them, a 71-year-old Japanese dressage competitor, a Somali runner who braved the mean streets of Mogadishu to train, a Massachusetts judo athlete who overcame abuse by a coach, a gentle-giant Iranian weightlifter, a female wrestler fighting taboos in India.
It is a cliché to say that these and others who make it to London win just by showing up. They have trained and are hungry. They want a medal. But their individual achievements are only part of who they are. Their back stories, not their superhuman skills, help the rest of us understand what the Olympics are about.
Sunday:
Gladys Tejeda: Getting to the Olympics on borrowed shoes
Monday:
Hiroshi Hoketsu: A Japanese Olympian defies the age barrier
Kayla Harrison: An American Olympian rebuilds a life through judo and friends
Tuesday:
Mohamed Hassan Mohamed: Training for the Olympics in the shadow of war
Behdad Salimi: An Iranian Olympian carries the weight of a nation
Wednesday:
Yamilé Aldama: A British track star jumps through a tough decade
Geeta Phogat: How an Indian wrestler defied gender taboos
Thursday:
Tahmina Kohistani: Afghan sprinter tries to beat the clock – and pollution
To each his own niche
We live in a world that prizes e pluribus more than unum. Most of us bristle when told “if we make an exception for you, we’d have to make one for everybody.” Judge us by where we live, what we drive, or how we dress, walk, talk, or look? We may not be flinty pioneers, but we are definitely not part of the vast, like-minded herd.
Marketing experts know we cherish our individualism. They have developed ever more sophisticated ways to track our interests and tailor messages to our wants and needs. When you think about it, though, what they are doing still amounts to lumping us in with a crowd. It’s just that there are thousands of boutique crowds now rather than a mass market. You might be an extreme commuter, one of the working retired, an urban locavore, a young knitter who posts on Pinterest. And there might be a million of you.
Ten thousand niche markets have replaced the old mass market. These niches need not be large to pack a punch. A few thousand people can jam city hall plaza, spread the word about a hot restaurant, or spark an online rush toward a social media site only the cool kids now visit.
“The world may be getting flatter, in terms of globalization, but it is occupied by 6 billion little bumps who do not have to follow the herd to be heard,” pollster and political strategist Mark Penn wrote in his 2007 book, “Microtrends: The Small Forces Behind Tomorrow’s Big Changes.”
Mr. Penn coined the term “soccer moms” in the mid-1990s when he helped secure that niche for President Clinton’s reelection. That was before the Internet birthed a hyper-niche America. So who are the soccer moms of 2012? Some of the demographics being mentioned this time around include “Medicare grandmas” concerned about health-care costs, underemployed Millennials digging out from college debt while living with their parents, Latino entrepreneurs far from border states, lapsed church-goers, even pet lovers (who may be dismayed or just amused that a certain candidate once made the family dog ride on the roof of the family car).
How will these and other niches vote in the US presidential election? In a Monitor cover story, Jennifer Skalka Tulumello goes behind the scenes with Gallup pollsters who are trying to determine what Americans are thinking. Large percentages have already decided how they will vote. The battleground is over undecideds in swing states such as Virginia, Colorado, and Wisconsin. Jennifer’s report examines the techniques, trends, and reliability of polling.
The 2012 electorate is vastly different from when George Gallup’s troops first went door to door. Campaigns now need exotic digital tools to find and motivate voters. But on the first Tuesday of November, a curious event will occur. Dot-moms and DIY dads, war veterans, public-sector workers, ex-urban retirees, small-town landlords, and millions more niche Americans will enter voting booths. For a single day, they will become a single mass market: a democracy. Out of many, one.
Soldiers at Fort Bragg, N.C., listened to President Obama speak during a 2011 visit. (Kevin Lamarque/Reuters/File)
Women warriors: How close to combat?
When you think about the growing role of women in combat in the US military, it’s probably best not to envision either “G.I. Jane” or “Private Benjamin.” Women in the military are neither celluloid heroes nor sad sacks. The warriors Anna Mulrine interviewed in a Monitor cover story simply want equality: the same rules, regulations, and duties as men, even if it means the same danger, tedium, and privation that men have long endured on the front lines.
You’ll meet Lt. Col. Tammy Duckworth, who lost both legs when her Black Hawk helicopter was hit by a rocket-propelled grenade in 2004 in Iraq. To her, the question is not whether combat is too risky for women or whether mixed genders might hurt unit cohesion. “Trust me,” she says, “to have the intelligence to assess the risks and decide to take them in order to have the amazing privilege of serving my country.”
Combat, after all, is risky for everyone. Which is why all enlistees, regardless of gender, deserve to be treated with respect by their comrades in arms, the officers they report to, and the society they serve. Too often, that has not been the case, as seen in “The Invisible War,” a documentary on rape and sexual harassment in the military (see this review by the Monitor's Peter Rainer).
As a Pentagon correspondent, Anna Mulrine has lived on the front lines. In 2006, she bivouacked in the embattled Iraqi city of Fallujah. In 2011, she operated out of a NATO outpost in Afghanistan’s Helmand Province. There were uncomfortable moments, she notes, but not always in ways you might think. For instance, matters like hygiene and privacy were nonissues. “When it came to bathrooms, it turned out that the guys wanted privacy for themselves even before they had to accommodate women. So in Fallujah they put up a plywood wall, and in Helmand there was a separate tent.”
Sure, gender distinctions remained. Young men were reluctant to share vivid details of combat with her. But they often opened up about their families and fears. And when it came time to face danger, most differences faded. With everyone suited up in body armor and helmets, lugging their gear, Anna says, “it is also almost impossible to tell Frank from Steve or Steve from Jennifer.”
•••
You might also enjoy Amy Black’s commentary, which explodes the belief that everybody in the US electorate has become politically polarized. The less appreciated development, she writes, is the rise of independent voters. And guess what they want? Not shouting, gotchas, or tallying slights. They want moderation. While loud voices and clever put-downs get a lot of air time, voters consistently keep their own counsel and choose wisely. That’s worth remembering as Republicans and Democrats – and the political-action committees that love them – spend millions on campaign ads.
With that in mind, this is probably a good time to remind readers that the Monitor sides with no political party and endorses no candidates. While it is impossible to balance every sentence of every news story so that left and right get exactly equal ink, you’ll see that we maintain balance over time. Like the voters that Professor Black writes about, we are biased toward independence – independence of thought. If you believe we are falling short in some regard, please let me know.
John Yemma is editor of the Monitor. He can be reached at editor@CSMonitor.com.
An Afghan family breaks the fast for Ramadan with an evening meal in Kabul, Afghanistan. (Melanie Stetson Freeman/Staff/File)
Dinner is not just for dining
How mixed-up is our dining history? To start with, the word “dinner” comes from the Latin (via the French) for breakfast. In some centuries and cultures, fasts were broken at dawn. In others, stomachs growled until midday, which is why petit déjeuner was invented.
The evening meal has always varied depending on work hours, preparation time, and, most important, available light. In his excellent domestic history, “At Home,” Bill Bryson notes that if you open your refrigerator door “you summon forth more light than the total amount enjoyed by most households in the eighteenth century.” Darkness and dining were not happy partners. So it wasn’t until the middle of the 20th century, with the eight-hour workday and the spread of indoor lighting, that we settled on breakfast, lunch, and dinner at morning, noon, and evening.
You know the drill, of course: Breakfast is the eye-opener, the fresh start on the day with juice, muffin ... and, whoa, look at the time! Lunch is the pit stop – sandwich, salad, or soup wolfed down, often while doing business, far too often at a desk. What did you have for lunch, dear? I really can’t remember.
Then there’s dinner. If the other meals are more or less forgettable, dinner is the real deal. It’s where the fast, for all intents and purposes, is finally broken, the meal that most people make an effort with, even if that means just deciding what toppings to put on a pizza. Dinner has possibilities. It isn’t strictly time-limited. It can relax into the evening.
Dinner is a time to talk about subjects deeper than the daily to-do list or office politics. More than any other meal, it preserves the essential aspects of communion. There might be candlelight to invite intimacy, as in Virginia Woolf’s touching description of a memorable dinner party in “To the Lighthouse.” But any mood lighting will do. The menu need not be as fantastic as in “Babette’s Feast,” but chopping and sautéing are nice moves. Martha Stewart isn’t required to bless the place settings; but sitting down, using a plate, and wielding utensils properly is commendable.
The main thing is a decision by all who dine together to draw close and share something of themselves over food – something not overly contentious: a school project, a challenge at work, a discussion of values or relationships. Well, maybe think twice about discussing relationships. You also might want to tiptoe through topics like money. And definitely be careful around politics. Vietnam disrupted many a spaghetti dinner when I was a kid.
Done right, dinner isn’t just a good time. Mary Beth McCauley’s Monitor cover story explores new research showing that children from families that dine together have lower rates of substance abuse, fewer eating disorders, and better grades in school. Dinner is both good for you and crucial to bonding with loved ones. See Mary Beth’s portrait of the 15-member family that Barbara and Bill Walsh nourished over the decades, the older siblings pitching in to help the younger ones. To this day, everyone has remained close. (I especially liked reading that Barbara and Bill made a point of going to dinner once a week – just the two of them – ensuring time for their essential bond.)
A proper dinner doesn’t have to happen every night. Nor does it require a big family. It can be with a friend. Or even alone. But you do have to slow down and be intentional about it. On any given day, dinner may be the last best hope that heart will speak to heart – or at least that you’ll taste your food – before the lights go out.
John Yemma is editor of the Monitor.
A view of the lawn at the University of Virginia as seen from The Rotunda. (Andy Nelson/The Christian Science Monitor/File)
College: more than a credential
College is like the mythical Scottish village of Brigadoon. It comes alive in a fleeting, magical way for entering freshmen and vanishes into the mists roughly four years later when the caps and gowns are returned and only memories and debt remain.
Oh sure, faculty and staff work at colleges year in and year out. Perennial students can be found there, too, along with buskers, landlords, and shopkeepers. But college is mostly about young people coming of age, grappling with new ideas, learning useful skills, and networking with contemporaries who may always be friends (and may also end up knowing something they can hold over you for the rest of your life).
Colleges are the membrane through which the accumulated knowledge of humanity is transmitted from one generation to the next, along with hacky sack, foosball, and frisbee. The process works best via a professor, a teaching assistant, a set of books, and a series of lab experiments. But some of the transfer inevitably occurs via CliffsNotes, last-minute cramming, and late-night talkathons. When a bachelor’s degree is awarded, the transaction is more or less complete – which is good but may not be enough anymore to make it in the job market.
In a Monitor cover story, Lee Lawrence looks into the worth of a bachelor’s degree. Where once a bachelor’s could open doors, it has become so commonplace that it might not be enough to land a job. On the one hand, graduate-degree holders may have a leg up; on the other hand, vocational skills alone may be a surer way to a paycheck. But while a bachelor’s may have become devalued, it is a minimal requirement in most jobs, a steppingstone to graduate credentials, and crucial for that little matter of civilization.
Columbia University professor Andrew Delbanco, in a new book titled “College: What It Was, Is, and Should Be,” points out that students “have always been searching for purpose. They have always been unsure of their gifts and goals, and susceptible to the demands ... of their parents and of the abstraction we call ‘the market.’ ” He cites Harriet Beecher Stowe’s 1871 description of a man entering college when everything was “distant, golden, indefinite, and I was sure I was good for almost anything that could be named.” But he soon began to wonder about “all the pains and money” expended on his education.
And yet almost everyone who emerges from college is equipped with the modicum of critical thinking necessary to participate in a democracy and to appreciate life more fully. “Anyone who earns a BA from a reputable college,” Professor Delbanco says, “ought to understand something about the genealogy of ... ideas and practices, about the historical processes from which they have emerged, the tragic cost when societies fail to defend them, and about alternative ideas both within the Western tradition and outside it.”
I’ve found myself on campuses in Cairo, Moscow, and Baghdad. I’ve seen the western sun paint gold the university buildings on Jerusalem’s Mount Scopus, walked along the scholar-scuffed halls of Magdalen College at England’s Oxford University, and felt the same ephemeral magic in Lubbock, Texas; Amherst, Mass.; and midtown Manhattan. A degree is only part of what a student takes from these places. The rest – the appreciation of the past, the enrichment of literature, the windows opened in a thousand minds – that is what a BA means, too.
John Yemma is editor of The Christian Science Monitor. Email: editor@csmonitor.com.
Small shops sell food and snacks on a hillside in Kabul, Afghanistan, where squatters have settled. (Melanie Stetson Freeman/Staff)
Does nation-building work?
Nation-building has a can-do ring to it. You can build a highway, a skyscraper, a Fortune 500 company. Why not a nation?
It isn’t a new idea. Throughout the 20th century – in places as different as Germany, the Philippines, Iraq, Japan, and Kosovo – world powers have worked to turn broken states into healthy ones through a combination of outside force, inside management, and the cultivation of civil society, education, rule of law, and democratic institutions. Soldiers and civil servants have sacrificed their lives. Billions of dollars have been spent.
The outcomes have been mixed, as James L. Payne noted in a 2006 study published in the Independent Review. Some nations (Somalia) reject the effort. Others make it (Austria, Germany, Japan), though we can’t be sure whether that was due to intervention or to popular will. Nation-building works best when insiders take the lead. Some states fail and re-fail and then pull it together (the Dominican Republic, Panama – and possibly Haiti; even Somalia is improving).
It’s easy to criticize nation-building as western hubris. When he ran for president in 2000, George W. Bush argued that the United States shouldn’t be imposing its values on the rest of the world. That changed after 9/11.
“Afghanistan was the ultimate nation-building mission,” Mr. Bush wrote in his memoir. “We had liberated the country from a primitive dictatorship, and we had a moral obligation to leave behind something better.” Moral obligation, especially after a war, outweighs hubris.
A 2007 RAND Corporation guide to nation-building notes that US-led military interventions are running at about one every two years, and new United Nations peacekeeping missions occur every six months.
In little more than a decade, nation-building efforts were launched in Kuwait, Somalia, Haiti, Bosnia, Kosovo, Afghanistan, and Iraq. Note the ascending size of the nations involved. There’s a warning for the future in that, especially at a time of war-weariness and constrained budgets. As the RAND study observed, “the effort needed to stabilize Bosnia and Kosovo has proved difficult to replicate in Afghanistan or Iraq, nations that are eight to 12 times more populous. It would be even more difficult to mount a peace enforcement mission in Iran, which is three times more populous than Iraq, and nearly impossible to do so in Pakistan, which is three times again more populous than Iran.”
In a Monitor cover story, Scott Baldauf takes us to Afghanistan to assess whether the nearly 11-year nation-building process has “taken” well enough that the South Asian country can survive as a tolerant, viable society when NATO scales back in 2014. The pitfalls are plentiful: ethnic animosity, warlords, the Taliban, corruption, opium, external meddling. Most Afghans Scott talked with want foreign troops out. Fewer Afghans want foreign aid to decrease. And almost everybody is worried about what comes next.
As Scott’s reporting and Melanie Stetson Freeman’s photography show, Afghanistan has changed markedly since 2001. If the yearning for peace and normalcy alone could determine a country’s future, Afghanistan would make it. Afghanistan won’t be totally on its own. NATO contingents will remain, civilian assistance will continue, and $4 billion a year is being promised to bankroll Afghan security forces. Have we left behind something better? We are about to find out.
John Yemma is editor of The Christian Science Monitor.
Volunteers place wood floor planks for a house under construction a year after a tornado tore through Joplin, Mo. (Eric Thayer/Reuters)
Doing well by doing good
Is altruism good, or just a good strategy? The biologist E.O. Wilson has described it as both. The purest form of altruism involves self-sacrifice for others – a family, tribe, or cause – with no expectation of reward. The other altruism, which seems less noble, is doing a favor to get a favor.
Poems and statues are dedicated to uncompromising heroes. Most people are turned off by back-scratching deal-makers. But wait. Pure altruism, Dr. Wilson pointed out in his book “On Human Nature,” advances the goals only of a narrow set of people. The hive benefits when a honeybee gives up its life to defend the group. By contrast, dealmaking altruism may look cheesy and hypocritical, but it is essential in a complex, diverse society.
Hard altruism is the stuff of legends. Soft altruism is the infrastructure of civilization.
A family, company, cause, or ideology needs hard altruism to cohere. One for all and all for one. But at some point, even the most tightly disciplined island needs to connect to the mainland. Deals must be done, favors returned.
Whatever form it takes, altruism is good for others, for us, and for society, which is why versions of the golden rule and the good Samaritan parable exist in most cultures. In communities where social capital is high, hard times are not as hard.
Periodically, social analysts worry that the wheels have come off society, that economic pressure or new technology or fads or amusements are undermining our social capital, making us more selfish and less caring. In the mid-1990s, Robert Putnam penned a much-talked-about essay titled “Bowling Alone: America’s Declining Social Capital.” Americans were spending so much time in front of TVs and computer screens, Dr. Putnam wrote, that participatory democracy, volunteering in the community, even amiable activities like bowling leagues were withering. If people were not connecting, social capital would wither as well.
And yet one key measure of social capital – rates of volunteerism – has risen steadily since then. More than a quarter of Americans now give their time to charitable causes. Back when Putnam wrote his essay, the term “Generation X” – those people who were born in the 1960s and ’70s – was usually modified by the word “slacker.” But as that generation has matured, its members have been volunteering at rates higher than the national average.
That’s the antithesis of slacking.
Twenty-first century social capital isn’t built so much through bowling leagues and welcome wagons. Among other things, Internet-enabled social networks have become a powerful force for altruism, pinpointing needs and marshaling resources. Struggling individuals and companies, for instance, have benefited from “cash mobs.” Organized online (usually via Facebook), these bring to bear scattered do-gooders in ways that door-to-door solicitation never could.
Altruism may look different today. But whether it is mowing a neighbor’s lawn, giving a renter a break, or cash-mobbing a mom and pop store, altruism flourishes because it helps others and us and civilization. Even if we cherish the island where we were born – even if we would sacrifice everything for family, nation, or honor – we also know that we are part of something larger. We are a piece of the continent, a part of the main.
John Yemma is editor of the Monitor.