The two rival strands of American environmentalism – nature untouched versus nature managed – can be traced back to John Muir and Gifford Pinchot.
Muir, founder of the Sierra Club, was a purist. Brought up in a strict religious household, he found spiritual uplift in wilderness, especially in the American West. The mountains and streams of the Sierra Nevada were his church; the forest was sacred. He wanted nature reserves left alone and believed the only resource humans should harvest from them was the restoration of the soul.
Pinchot, the first head of the US Forest Service, was pragmatic. The son of a wealthy developer of land and lumber, he saw forests and wild lands as assets to be exploited – albeit carefully and with consideration of the needs of future generations. Conservation, to him, was not about sequestration and prohibition. It was husbandry on a grand scale.
Let’s be honest. It is impossible to choose either philosophy exclusively. A cathedral of pines is at least as magnificent as Notre Dame. No skyscraper can compare to a mountainside bathed in sunrise. An alpine lake happened upon after a long hike; a sea of undulating prairie grasses; a waterfall – almost any waterfall – these are psalms for the human heart.
And how do humans get to experience them? Probably by burning nature’s hydrocarbons, drinking its water, and somewhere along the way employing its minerals and timber in support of life and livelihood. We may drive a hybrid, choose organic vegetables, and scrupulously recycle, but even the greenest of us has to admit that natural resources feed the superstructure of the civilization in which we live.
We are John Muir when we take a weekend walk and are awestruck by an encounter with a fawn. We are Gifford Pinchot when the alarm goes off on Monday morning. Every one of us balances purist aspirations with practical needs.
In a Monitor cover story, Todd Wilkinson explores that balance, focusing on ranching in the West. He introduces us to a new generation of ranchers who are concentrating on sustainable practices. Cattle lands that once would have been trampled and depleted are being managed in smart new ways that decrease the environmental impact and allow the region’s native flora and fauna to thrive.
Unlike the “sagebrush rebels” of a generation ago who saw environmentalism as silly and intrusive, these green ranchers consider healthy land and water crucial to current and future generations. As one rancher tells Todd: “Lots of different people talk about ways that agriculture needs to be sustainable, but we are living it.”
Todd knows the West. In a cover story last summer, he examined the complexities and tensions that have accompanied the return of wolf populations in the region. Attacks on cattle or sheep have been of particular concern. His green-ranching report is, in a sense, a follow-up. Sustainable ranching, it turns out, may offer a solution: When wildlife such as deer can find clean water, rich forage, and adequate cover on ranches, they flourish – and wolves have a shot at traditional prey rather than going after livestock.
Green ranching is where Muir and Pinchot blend. Humanity and nature can’t be separated. But nature can be handled with care.
John Yemma is editor of the Monitor.
“Faster, Higher, Stronger.” The Olympic motto captures the pure athleticism we will witness after the great cauldron is lit in London July 27. Individuals and teams from around the world will push themselves past the limit of human endurance to win a gold medal and, if possible, set a world record.
Olympians – and we lesser specimens watching from our sofas – start with roughly the same factory-supplied bodies, breathe the same air, navigate the same terrestrial conditions. Their physical feats, however, put them on the far end of the human bell curve.
Still, while dramatic contests will take place and heartbreakingly close races will occur during the Games, scientists increasingly believe humans are hitting the limits of unassisted athletic performance. There’s only so much, it seems, that Homo sapiens’ muscles and lungs can accomplish.
Specialists at France’s National Institute of Sport, Expertise, and Performance have examined the top 10 performances in track and field and swimming during the modern Olympic era (1896 to 2008) and found that most records have stagnated since 1988. “Our physiological evolution will remain limited in a majority of Olympic events,” they concluded in their 2010 report.
Records may still be broken. In rare cases, extremely exceptional individuals may still emerge. High-tech footwear and no-drag swimsuits may provide a temporary edge – until everyone gets them or they are banned. The latest performance-enhancing drugs and futuristic procedures such as gene therapy may assist a few unscrupulous athletes. But as Peter Weyand, a biomechanics and physiology professor at Southern Methodist University in Dallas, wrote in a 2009 Monitor commentary, record-setting may “no longer lie within the traditional limits of human biology.”
If humanity’s unassisted quest for statistical glory is all but over, that may not be a bad thing. At the world-record level, microscopic differences in finishing times are increasingly trivial. Anything from a bad night’s sleep to a change in the weather can alter a performance by hundredths of a second. The American swimming phenomenon Michael Phelps had an amazing finish in the 100-meter butterfly at the 2008 Games. His 1/100th-of-a-second victory over Serbian swimmer Milorad Cavic was so close that the race was essentially a tie.
The gracious Mr. Cavic epitomized the Olympic spirit when he said he was fine with the result because “there’s nothing wrong with losing to the greatest swimmer there has ever been.” So maybe the old ideal of “how you play the game” is beginning to supplant the relentless drive to win.
In a Monitor cover story (click here to read it), we introduce you to eight remarkable Olympians from around the world – among them, a 71-year-old Japanese dressage competitor, a Somali runner who braved the mean streets of Mogadishu to train, a Massachusetts judo athlete who overcame abuse by a coach, a gentle-giant Iranian weightlifter, a female wrestler fighting taboos in India.
It is a cliché to say that these and others who make it to London win just by showing up. They have trained and are hungry. They want a medal. But their individual achievements are only part of what they are. Their back stories, not their superhuman skills, help the rest of us understand what the Olympics are about.
We live in a world that prizes e pluribus more than unum. Most of us bristle when told “if we make an exception for you, we’d have to make one for everybody.” Judge us by where we live, what we drive, or how we dress, walk, talk, or look? We may not be flinty pioneers, but we are definitely not part of the vast, like-minded herd.
Marketing experts know we cherish our individualism. They have developed ever more sophisticated ways to track our interests and tailor messages to our wants and needs. When you think about it, though, what they are doing still amounts to lumping us in with a crowd. It’s just that there are thousands of boutique crowds now rather than a mass market. You might be an extreme commuter, one of the working retired, an urban locavore, a young knitter who posts on Pinterest. And there might be a million of you.
Ten thousand niche markets have replaced the old mass market. These niches need not be large to pack a punch. A few thousand people can jam city hall plaza, spread the word about a hot restaurant, or spark an online rush toward a social media site only the cool kids now visit.
“The world may be getting flatter, in terms of globalization, but it is occupied by 6 billion little bumps who do not have to follow the herd to be heard,” pollster and political strategist Mark Penn wrote in his 2007 book, “Microtrends: The Small Forces Behind Tomorrow’s Big Changes.”
Mr. Penn coined the term “soccer moms” in the mid-1990s when he helped secure that niche for President Clinton’s reelection. That was before the Internet birthed a hyper-niche America. So who are the soccer moms of 2012? Some of the demographics being mentioned this time around include “Medicare grandmas” concerned about health-care costs, underemployed Millennials digging out from college debt while living with their parents, Latino entrepreneurs far from border states, lapsed church-goers, even pet lovers (who may be dismayed or just amused that a certain candidate once made the family dog ride on the roof of the family car).
How will these and other niches vote in the US presidential election? In a Monitor cover story, Jennifer Skalka Tulumello goes behind the scenes with Gallup pollsters who are trying to determine what Americans are thinking. Large percentages have already decided how they will vote. The battleground is over undecideds in swing states such as Virginia, Colorado, and Wisconsin. Jennifer’s report examines the techniques, trends, and reliability of polling.
The 2012 electorate is vastly different from when George Gallup’s troops first went door to door. Campaigns now need exotic digital tools to find and motivate voters. But on the first Tuesday of November, a curious event will occur. Dot-moms and DIY dads, war veterans, public-sector workers, ex-urban retirees, small-town landlords, and millions more niche Americans will enter voting booths. For a single day, they will become a single mass market: a democracy. Out of many, one.
When you think about the growing role of women in combat in the US military, it’s probably best not to envision either “G.I. Jane” or “Private Benjamin.” Women in the military are neither celluloid heroes nor sad sacks. The warriors Anna Mulrine interviewed for a Monitor cover story simply want equality: the same rules, regulations, and duties as men, even if it means the same danger, tedium, and privation that men have long endured on the front lines.
You’ll meet Lt. Col. Tammy Duckworth, who lost both legs when her Black Hawk helicopter was hit by a rocket-propelled grenade in 2004 in Iraq. To her, the question is not whether combat is too risky for women or whether mixed genders might hurt unit cohesion. “Trust me,” she says, “to have the intelligence to assess the risks and decide to take them in order to have the amazing privilege of serving my country.”
Combat, after all, is risky for everyone. Which is why all enlistees, regardless of gender, deserve to be treated with respect by their comrades in arms, the officers they report to, and the society they serve. Too often, that has not been the case, as seen in “The Invisible War,” a documentary on rape and sexual harassment in the military (see this review by the Monitor's Peter Rainer).
As a Pentagon correspondent, Anna Mulrine has lived on the front lines. In 2006, she bivouacked in the embattled Iraqi city of Fallujah. In 2011, she operated out of a NATO outpost in Afghanistan’s Helmand Province. There were uncomfortable moments, she notes, but not always in ways you might think. For instance, matters like hygiene and privacy were nonissues. “When it came to bathrooms, it turned out that the guys wanted privacy for themselves even before they had to accommodate women. So in Fallujah they put up a plywood wall, and in Helmand there was a separate tent.”
Sure, gender distinctions remained. Young men were reluctant to share vivid details of combat with her. But they often opened up about their families and fears. And when it came time to face danger, most differences faded. With everyone suited up in body armor and helmets, lugging their gear, Anna says, “it is also almost impossible to tell Frank from Steve or Steve from Jennifer.”
You might also enjoy Amy Black’s commentary, which explodes the belief that everybody in the US electorate has become politically polarized. The less appreciated development, she writes, is the rise of independent voters. And guess what they want? Not shouting, gotchas, or tallying slights. They want moderation. While loud voices and clever put-downs get a lot of air time, voters consistently keep their own counsel and choose wisely. That’s worth remembering as Republicans and Democrats – and the political-action committees that love them – spend millions on campaign ads.
With that in mind, this is probably a good time to remind readers that the Monitor sides with no political party and endorses no candidates. While it is impossible to balance every sentence of every news story so that left and right get exactly equal ink, you’ll see that we maintain balance over time. Like the voters that Professor Black writes about, we are biased toward independence – independence of thought. If you believe we are falling short in some regard, please let me know.
John Yemma is editor of the Monitor. He can be reached at editor@CSMonitor.com.
How mixed-up is our dining history? To start with, the word “dinner” comes from the Latin (via the French) for breakfast. In some centuries and cultures, fasts were broken at dawn. In others, stomachs growled until midday, which is why petit déjeuner was invented.
The evening meal has always varied depending on work hours, preparation time, and, most important, available light. In his excellent domestic history, “At Home,” Bill Bryson notes that if you open your refrigerator door “you summon forth more light than the total amount enjoyed by most households in the eighteenth century.” Darkness and dining were not happy partners. So it wasn’t until the middle of the 20th century, with the eight-hour workday and the spread of indoor lighting, that we settled on breakfast, lunch, and dinner at morning, noon, and evening.
You know the drill, of course: Breakfast is the eye-opener, the fresh start on the day with juice, muffin ... and, whoa, look at the time! Lunch is the pit stop – sandwich, salad, or soup wolfed down, often while doing business, far too often at a desk. What did you have for lunch, dear? I really can’t remember.
Then there’s dinner. If the other meals are more or less forgettable, dinner is the real deal. It’s where the fast, for all intents and purposes, is finally broken, the meal that most people make an effort with, even if that means just deciding what toppings to put on a pizza. Dinner has possibilities. It isn’t strictly time-limited. It can relax into the evening.
Dinner is a time to talk about subjects deeper than the daily to-do list or office politics. More than any other meal, it preserves the essential aspects of communion. There might be candlelight to invite intimacy, as in Virginia Woolf’s touching description of a memorable dinner party in “To the Lighthouse.” But any mood lighting will do. The menu need not be as fantastic as in “Babette’s Feast,” but chopping and sautéing are nice moves. Martha Stewart isn’t required to bless the place settings; but sitting down, using a plate, and wielding utensils properly is commendable.
The main thing is a decision by all who dine together to draw close and share something of themselves over food – something not overly contentious: a school project, a challenge at work, a discussion of values or relationships. Well, maybe think twice about discussing relationships. You also might want to tiptoe through topics like money. And definitely be careful around politics. Vietnam disrupted many a spaghetti dinner when I was a kid.
Done right, dinner isn’t just a good time. Mary Beth McCauley’s Monitor cover story explores new research showing that children from families that dine together have lower rates of substance abuse, fewer eating disorders, and better grades in school. Dinner is both good for you and crucial to bonding with loved ones. See Mary Beth’s portrait of the 15-member family that Barbara and Bill Walsh nourished over the decades, the older siblings pitching in to help the younger ones. To this day, everyone has remained close. (I especially liked reading that Barbara and Bill made a point of going to dinner once a week – just the two of them – ensuring time for their essential bond.)
A proper dinner doesn’t have to happen every night. Nor does it require a big family. It can be with a friend. Or even alone. But you do have to slow down and be intentional about it. On any given day, dinner may be the last best hope that heart will speak to heart – or at least that you’ll taste your food – before the lights go out.
John Yemma is editor of the Monitor.
College is like the mythical Scottish village of Brigadoon. It comes alive in a fleeting, magical way for entering freshmen and vanishes into the mists roughly four years later when the caps and gowns are returned and only memories and debt remain.
Oh sure, faculty and staff work at colleges year in and year out. Perennial students can be found there, too, along with buskers, landlords, and shopkeepers. But college is mostly about young people coming of age, grappling with new ideas, learning useful skills, and networking with contemporaries who may always be friends (and may also end up knowing something they can hold over you for the rest of your life).
Colleges are the membrane through which the accumulated knowledge of humanity is transmitted from one generation to the next, along with hacky sack, foosball, and frisbee. The process works best via a professor, a teaching assistant, a set of books, and a series of lab experiments. But some of the transfer inevitably occurs via CliffsNotes, last-minute cramming, and late-night talkathons. When a bachelor’s degree is awarded, the transaction is more or less complete – which is good but may not be enough anymore to make it in the job market.
In a Monitor cover story, Lee Lawrence looks into the worth of a bachelor’s degree. Where once a bachelor’s could open doors, it has become so commonplace that it might not be enough to land a job. On the one hand, graduate-degree holders may have a leg up; on the other hand, vocational skills alone may be a surer way to a paycheck. But while a bachelor’s may have become devalued, it is a minimal requirement in most jobs, a steppingstone to graduate credentials, and crucial for that little matter of civilization.
Columbia University professor Andrew Delbanco, in a new book titled “College: What It Was, Is, and Should Be,” points out that students “have always been searching for purpose. They have always been unsure of their gifts and goals, and susceptible to the demands ... of their parents and of the abstraction we call ‘the market.’ ” He cites Harriet Beecher Stowe’s 1871 description of a man entering college when everything was “distant, golden, indefinite, and I was sure I was good for almost anything that could be named.” But he soon began to wonder about “all the pains and money” expended on his education.
And yet almost everyone who emerges from college is equipped with the modicum of critical thinking necessary to participate in a democracy and to appreciate life more fully. “Anyone who earns a BA from a reputable college,” Professor Delbanco says, “ought to understand something about the genealogy of ... ideas and practices, about the historical processes from which they have emerged, the tragic cost when societies fail to defend them, and about alternative ideas both within the Western tradition and outside it.”
I’ve found myself on campuses in Cairo, Moscow, and Baghdad. I’ve seen the western sun paint gold the university buildings on Jerusalem’s Mount Scopus, walked along the scholar-scuffed halls of Magdalen College at England’s Oxford University, and felt the same ephemeral magic in Lubbock, Texas; Amherst, Mass.; and midtown Manhattan. A degree is only part of what a student takes from these places. The rest – the appreciation of the past, the enrichment of literature, the windows opened in a thousand minds – that is what a BA means, too.
John Yemma is editor of The Christian Science Monitor. Email: firstname.lastname@example.org.
Nation-building has a can-do ring to it. You can build a highway, a skyscraper, a Fortune 500 company. Why not a nation?
It isn’t a new idea. Throughout the 20th century – in places as different as Germany, the Philippines, Iraq, Japan, and Kosovo – world powers have worked to turn broken states into healthy ones through a combination of outside force, inside management, and the cultivation of civil society, education, rule of law, and democratic institutions. Soldiers and civil servants have sacrificed their lives. Billions of dollars have been spent.
The outcomes have been mixed, as James L. Payne noted in a 2006 study published in the Independent Review. Some nations (Somalia) reject the effort. Others make it (Austria, Germany, Japan), but we can’t be sure whether their success was due to intervention or to popular will. Nation-building works best when insiders take the lead. Some states fail and re-fail and then pull it together (Dominican Republic, Panama – and possibly Haiti, and even Somalia is improving).
It’s easy to criticize nation-building as western hubris. When he ran for president in 2000, George W. Bush argued that the United States shouldn’t be imposing its values on the rest of the world. That changed after 9/11.
“Afghanistan was the ultimate nation-building mission,” Mr. Bush wrote in his memoir. “We had liberated the country from a primitive dictatorship, and we had a moral obligation to leave behind something better.” Moral obligation, especially after a war, outweighs hubris.
A 2007 RAND Corporation guide to nation-building notes that US-led military interventions are running at about one every two years, and new United Nations peacekeeping missions occur every six months.
In little more than a decade, nation-building efforts were launched in Kuwait, Somalia, Haiti, Bosnia, Kosovo, Afghanistan, and Iraq. Note the ascending size of the nations involved. There’s a warning for the future in that, especially at a time of war-weariness and constrained budgets. As the RAND study observed, “the effort needed to stabilize Bosnia and Kosovo has proved difficult to replicate in Afghanistan or Iraq, nations that are eight to 12 times more populous. It would be even more difficult to mount a peace enforcement mission in Iran, which is three times more populous than Iraq, and nearly impossible to do so in Pakistan, which is three times again more populous than Iran.”
In a Monitor cover story, Scott Baldauf takes us to Afghanistan to assess whether the nearly 11-year nation-building process has “taken” well enough that the South Asian country can survive as a tolerant, viable society when NATO scales back in 2014. The pitfalls are plentiful: ethnic animosity, warlords, the Taliban, corruption, opium, external meddling. Most Afghans Scott talked with want foreign troops out. Fewer Afghans want foreign aid to decrease. And almost everybody is worried about what comes next.
As Scott’s reporting and Melanie Stetson Freeman’s photography show, Afghanistan has changed markedly since 2001. If the yearning for peace and normalcy alone could determine a country’s future, Afghanistan would make it. Afghanistan won’t be totally on its own. NATO contingents will remain, civilian assistance will continue, and $4 billion a year is being promised to bankroll Afghan security forces. Have we left behind something better? We are about to find out.
John Yemma is editor of The Christian Science Monitor.
Is altruism good, or just a good strategy? The biologist E.O. Wilson has described it as both. The purest form of altruism involves self-sacrifice for others – a family, tribe, or cause – with no expectation of reward. The other altruism, which seems less noble, is doing a favor to get a favor.
Poems and statues are dedicated to uncompromising heroes. Most people are turned off by back-scratching deal-makers. But wait. Pure altruism, Dr. Wilson pointed out in his book “On Human Nature,” advances the goals only of a narrow set of people. The hive benefits when a honeybee gives up its life to defend the group. By contrast, dealmaking altruism may look cheesy and hypocritical, but it is essential in a complex, diverse society.
Hard altruism is the stuff of legends. Soft altruism is the infrastructure of civilization.
A family, company, cause, or ideology needs hard altruism to cohere. One for all and all for one. But at some point, even the most tightly disciplined island needs to connect to the mainland. Deals must be done, favors returned.
Whatever form it takes, altruism is good for others, for us, and for society, which is why versions of the golden rule and the good Samaritan parable exist in most cultures. In communities where social capital is high, hard times are not as hard.
Periodically, social analysts worry that the wheels have come off society, that economic pressure or new technology or fads or amusements are undermining our social capital, making us more selfish and less caring. In the mid-1990s, Robert Putnam penned a much-talked-about essay titled “Bowling Alone: America’s Declining Social Capital.” Americans were spending so much time in front of TVs and computer screens, Dr. Putnam wrote, that participatory democracy, volunteering in the community, even amiable activities like bowling leagues were withering. If people were not connecting, social capital would wither as well.
And yet one key measure of social capital – rates of volunteerism – has risen steadily since then. More than a quarter of Americans now give their time to charitable causes. Back when Putnam wrote his essay, the term “Generation X” – those people who were born in the 1960s and ’70s – was usually modified by the word “slacker.” But as that generation has matured, its members have been volunteering at rates higher than the national average.
That’s the antithesis of slacking.
Twenty-first century social capital isn’t built so much through bowling leagues and welcome wagons. Among other things, Internet-enabled social networks have become a powerful force for altruism, pinpointing needs and marshaling resources. Struggling individuals and companies, for instance, have benefited from “cash mobs.” Organized online (usually via Facebook), these bring to bear scattered do-gooders in ways that door-to-door solicitation never could.
Altruism may look different today. But whether it is mowing a neighbor’s lawn, giving a renter a break, or cash-mobbing a mom and pop store, altruism flourishes because it helps others and us and civilization. Even if we cherish the island where we were born – even if we would sacrifice everything for family, nation, or honor – we also know that we are part of something larger. We are a piece of the continent, a part of the main.
John Yemma is editor of the Monitor.
A new generation of spring chickens, fuzzy and endearingly incompetent, arrived at our New England home six weeks ago. They are replacements. During the hungry winter months, a red-tailed hawk visited. Hawks are admirable in their own way. So despite the drama, there are no hard feelings. And because this is the time of year the biota of the Northern Hemisphere bust loose, predators now have other options.
With Memorial Day approaching, we are well beyond spring’s delicate, tentative sprouting – the mix of last year’s memory and this year’s desire, to paraphrase T.S. Eliot. The fiesta of summer is about to begin.
The new chicks have their pinfeathers and are experimenting with flying (which they’ll never win prizes for, but, hey, a hawk can’t hit Mach 1). During the warm-up of mid-May, we moved them to a mini-coop adjacent to the shed that houses the backyard flock we’ve had for the past three years. The class of 2012 (Rhode Island Reds, Wyandottes, and Sex Links) has bonded over water and mash and a popular warming light. Now their range is extending. Ahead lie the treacherous shoals of integration with those who arrived before them.
Precedence is a powerful force. No matter how gradually integration is engineered, there’s no way to avoid issues as older fowl assert their pecking order. No amount of human intervention – not helicopter parenting, not shaming, not a court order – can stop the hazing. Generation 1 did it to Gen 2. Both will do it to Gen 3.
But we intervene anyway. Nature doesn’t get to take its course in a backyard. Without intervention, without the human impulse to adjust the balance, try something different, build or spend or pay without a thought of return on investment, nature would prevail. We’d have no artwork, cultivar roses, sonatas, or soufflés. As seen in a thousand PBS TV shows, nature can be magnificent. But left to its own devices, it can also break your heart.
Which is not what chickens do, unless you are a worm. Chickens are an odd mix of adventurer and comedian. Even their physiques are amusing – plump, friendly bodies that look like miniature Spanish galleons, bracketed by bony talons and wary faces. Happen upon a few in the garden and they’ll shriek and run for their lives, certain you are eyeing them for dinner. But work the ground for a minute and they’re practically riding your spade to get a look at the freshly turned earth.
The best part about chickens (besides the eggs) is encountering them after a long day when they’ve been busy doing their own thing – scratching, pecking, and worrying every interesting inch of a garden. Suddenly, they see you – and you’re a rock star. With no thought for decorum, they race zanily in your direction, a madcap armada hoping for a handful of corn. They practically throw themselves in your path, crouching to be petted. Choose me! No, me!
The other best part is when they put themselves to bed at night, setting aside their political snits and cooing contentedly, as if they are talking about what a great day they had, because, well, they all got to be here at this time and this place, enjoying this sun, and dirt, and these interesting circumstances.
We can have deep and heartfelt differences with each other. We can be Sunni or Shiite, Republican or Democrat, chicken or hawk. Summer is a reminder that we all get to enjoy the same sun warming the same magnificent planet.
John Yemma is editor of the Monitor. You can reach him at email@example.com.
Prisons have been around since the dawn of civilization. For all that time, they have been a dilemma.
We lock up murderers, thugs, and thieves both to punish them and to keep them away from law-abiding citizens. Yet prisons are notorious hotbeds of crime, from which first-time offenders too often emerge as hardened criminals. We spend millions persuading prisoners to go straight, giving them occupational training, and coaching them on reentry into civilized society. Yet the mere mention of a criminal record can disqualify a felon from employment, wilt a budding friendship, and relegate an ex-convict to a shadow life of halfway houses, dependence on charity, and possible recidivism.
Well, we tell ourselves, they had it coming; their victims are the ones who really suffered. Lock ’em up and throw away the key. But we also believe in redemption and second chances, at least for ourselves and those we know and love. If anyone close to us spends time behind bars, we experience – and are appalled by – the inhumanity of the penal system, the institution that Nathaniel Hawthorne called the “black flower of civilized society.”
And here’s a practical fact: While there’s no denying that criminal behavior leads to dire consequences, there’s also no denying that the eventual outcome of prison for the vast majority of inmates will be their release back into society. Less than 3 percent of the 1.6 million people in US prisons are serving life sentences without the possibility of parole. The rest will at some point be living in our neighborhoods.
More than ever, that point is now. As Sean Miller reports in a Monitor Weekly cover story, record numbers of inmates are leaving prison because of tightening budgets for correctional programs, new thinking about how to handle nonviolent offenders, and the completion of sentences by a bulge of people convicted during the higher-crime, tougher-sentencing era of the 1970s and ’80s.
Sean takes us to California and tracks the difficult post-prison prospects of a handful of men convicted of major crimes – murder, drug trafficking, gang violence. All now say they are sorry for what they did, weary of prison, and ready to abide by the law. “Most of them just got tired of it,” says Sean. “And most of them acknowledge what they’ve done wrong.” They have served their time, sworn themselves to self-improvement, gained job skills, and are hoping for a second chance.
But life after incarceration, which has never been easy, is especially tough in today’s job market. With time on their hands and few options for earning a living, it is too easy for ex-cons to end up hanging out with old friends and returning to bad behavior – especially to drugs, which most abused before and even during prison.
What everyone is worried about, says Sean, is that some felon among the thousands being released will commit a shocking act that tars other ex-prisoners and prompts a backlash against de-incarceration. Fear of a new Willie Horton – whose crimes while on prison furlough became a factor in the 1988 presidential campaign – has police, ex-cons, social workers, and parole officers on edge.
We send men and women to prison when we have despaired of any other way of dealing with their abhorrent behavior. But prison is not a permanent solution. It is at best an opportunity to change a criminal mentality into a moral one. We owe it to the prisoner, the victim, and ourselves to make that the permanent solution.
John Yemma is editor of the Monitor.