Upfront Blog

Luxury homes are sold prior to completion in Oceanside, Calif., a sign of better economic times. (Mike Blake/Reuters)

They saw the crash coming

By Editor / 09.08.13

You can’t say we weren’t warned. In the months and years before the 2008 financial meltdown, economic prophets saw trouble ahead. At first they politely begged to differ with the conventional wisdom that good times would roll on and on. Eventually, they were shouting from the rooftops.

As early as 1998, Brooksley Born of the Commodity Futures Trading Commission cautioned that financial derivatives were out of control. Throughout the early 2000s, William White of the Bank for International Settlements repeatedly advised central bankers (most directly in 2003 with Federal Reserve Chairman Alan Greenspan in the audience) that lax monetary policies and poor oversight were generating dangerous asset bubbles.

In 2003, Robert Shiller cautioned that the housing market was irrationally exuberant. Two years later, Nouriel Roubini said a devastating real estate bust was coming. That same year, Raghuram Rajan of the International Monetary Fund raised his voice about risky financial innovations. By then even Mr. Greenspan, the architect of the easy-money policies widely associated with the crash, was telling Congress he was worried, but his worry was the massive mortgage-backed securities trade fostered by Fannie Mae and Freddie Mac. 

By 2007 alarms were going off everywhere. Nassim Taleb warned that tightly interconnected global finance was vulnerable to a “black swan” – a rare, unforeseen event that could trigger a once-in-a-lifetime meltdown. Sheila Bair of the Federal Deposit Insurance Corporation raised red flags about subprime lending. CNBC’s Jim Cramer ranted (he likes to rant) that the Fed needed to wake up.

Then came the crash. In a Monitor cover story, Mark Trumbull revisits September ’08, interviews key players, and extracts five lessons from that harrowing – and still reverberating – event. 

The pre-2008 warnings were clear. How could we have missed them? “Everybody missed it,” Greenspan observed in 2010, “academia, the Federal Reserve, all regulators.” Which was not quite accurate. Everybody who could have done something missed it.

The problem with prophets is that, by definition, they don’t have standing. They are marginal characters, voices crying in the wilderness. Most of the time they predict apocalypses that never take place and days of reckoning that never dawn. We chuckle and call them Chicken Little or Dr. Doom. But when we slam into a brick wall, as happened in ’08, we wonder why we didn’t listen.

We don’t listen because most of us are convinced that the bubbles we live in – bubbles of sunshine or gloom – will last forever. But they don’t. Change is always around the corner.

This is where prophets come in. They burst our mental bubbles before hard reality does. But because we can’t be sure who is worth heeding and who is just ranting, it is probably best to adopt the prudent policy of the biblical Joseph, who foresaw seven lean years following seven years of plenty and stored up grain accordingly: Realize you are always in a bubble. And prepare for what comes next.

John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com.

Then-Monitor Latin America correspondent Sara Miller Llana (now Europe-based) interviewed a farmer in Tamaula, Mexico, last year. (Melanie Stetson Freeman/Staff/File)

Why we should listen to the world

By Editor / 09.03.13

Is Hillary Rodham Clinton running? What about Ted Cruz, Chris Christie, Rand Paul? The 2016 presidential race is already generating buzz. Meanwhile, can you (without turning to Google) name the prime minister of Sweden? The president of Indonesia?

Readers outside the United States occasionally nudge us to remember that America is not the center of the world. Imagine how it seems to someone in Canada, Japan, Australia, Germany, the Philippines, or any of the 190 other countries who faces a steady stream of news about American controversies, concerns, Kardashians, and political races that are more than three years away.

We get it. The Monitor’s audience from the moment of its founding in 1908 was “all mankind.” Ideally, we would publish in many languages and give equal weight to news from all parts of the world. Practically, we’re a long way from that. Our American roots are real. We are US-based, and staffed largely by Americans. Most of our readers are in the US. And for better or worse, the US is by far the world’s biggest newsmaker. 

To compensate, we strive for international perspective – and have done so since our founding. That is why we maintain correspondents all over the globe. In a Monitor cover story, we take you around the world in search of the best ideas in education. If you’ve read the Monitor for a while, you’ll recall that we’ve applied this international perspective to issues ranging from the fight against terrorism to the fostering of innovation, from competing systems for national health care to global manufacturing and trade.

But I’ll be the first to admit that there’s a parochial angle to our international perspective. Education: The American school year is beginning. Terrorism: The Boston Marathon bombings. Health care: “Obamacare.” Supply chains: America’s voracious consumers. (As author Bill Bryson puts it: “The whole of the global economy is based on supplying the cravings of two percent of the world’s population.”)

Even if there is no such thing as pure international perspective – no place to stand on the planet that is not, at heart, local – striving for it is undoubtedly healthy. It broadens our thinking. The 50 states of the United States of America are sometimes called “laboratories of democracy.” Different ideas are tried in these local jurisdictions. Things that seem good go national. Things that don’t stay local. The world’s 195 nations are laboratories, too.

*           *           *

As summer winds down (OK, as Northern Hemisphere summer winds down), most of us – no matter how far removed from our school days we are – feel that mixture of dread and excitement. Leaves are still green, but the chlorophyll factories are shutting down. The new generation of ducks and geese is airborne. Winter is not far away.

That’s the dread. The excitement is fall – dramatic, mellow, bracing, gentle. Fall doesn’t have the promise of arrival that spring has. It has the bittersweetness of departure. The poet William Cullen Bryant called it “the year’s last, loveliest smile.”

A few years ago, we asked readers to send us their fall foliage photos. We saw gorgeous images from New England, of course, but because Monitor readers are far-flung, we saw photos from the Pacific Northwest, Europe, and China. The best perspective, though, came from New Zealand. Buds were opening, blades were sprouting. It was spring there. 

John Yemma is editor of the Monitor. You can reach him at editor@csmonitor.com.

A Virginia family hosts a young boy from Harlem, N.Y., as part of the Fresh Air Fund offered to urban youth. (Reza A. Marvashti/The Free Lance-Star/AP)

MLK's dream is the American dream

By Editor / 08.24.13

People change their minds for a million reasons. Sometimes a million people change their minds for one reason. Let’s go back to Aug. 28, 1963 – midway through one of the most tear-stained years in American history. It was the year fire hoses and police dogs were used against civil rights marchers, when Medgar Evers was murdered in Jackson, Miss., and four little girls were killed by a bomb blast at the 16th Street Baptist Church in Birmingham, Ala. Later that year, President Kennedy was assassinated.

Against that backdrop, hundreds of thousands of people descended on Washington, D.C. They were peaceful. They were polite. They were insistent about what had to happen. The centerpiece of the March on Washington was a speech like no other.

In 17 minutes, Martin Luther King Jr. swept through American history, recalling the broken promise of equality for all, “the fierce urgency of now” in gaining civil rights, and the unstoppable power of “meeting physical force with soul force.” His voice strengthened and his cadences built as he progressed through the refrain of “Let freedom ring!” to the now sacred peroration: “I have a dream” – of reconciliation, brotherhood, and colorblindness, but most of all of an America living up to the true meaning of its creed that “all men are created equal.”

That late August day 50 years ago was a tipping point in history. Any honest observer had to acknowledge the moral imperative of racial equality. King’s dream was an inarguable vision for what America should be. Millions changed their minds. Within a year, the Civil Rights Act was law. Public spaces and workplaces changed. Discrimination was outlawed.

In a Monitor cover story, Carmen Sisson measures where racial equality stands in 2013. Progress has been indisputable. But if the era of stark injustice is a distant memory, many civil rights workers say subtle racism persists. King’s dream has become reality in some ways but remains a dream in other ways.

That squares with the view of another longtime observer of race relations (and an old friend and colleague). Wil Haygood has written about racial issues throughout his journalism career. His mother is from Selma, Ala. As a young reporter in Pittsburgh, he paid his own Greyhound bus fare to Washington, D.C., in 1983 to witness a commemoration of the 20th anniversary of the “I Have a Dream” speech. 

Yes, racism persists, Wil says. But echoing America’s first black president in the wake of the verdict in the shooting death of Trayvon Martin, he also says there is no doubt that “the nation has moved further in front than retreated.” In a new book, “The Butler: A Witness to History,” Wil tells the life story of Eugene Allen, a black man of quiet dignity who joined the White House staff as a “pantry man” in 1952 and rose to White House butler, serving eight presidents. (A movie based on Mr. Allen’s life and starring Forest Whitaker and Oprah Winfrey has just been released.) Allen’s vantage provides a unique window on the history of the past 50 years.

One measure of how far we’ve traveled: “When Mr. Allen went to work at the White House,” Wil says, “he would go home to Virginia and have to use segregated facilities. Look at that – and then look at the astonishment of November 2008.”

If King’s dream is not fully realized, if it is still in part a dream, at least now it is the American dream.

 John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com 

Commuters (and data sources) moved through lower Manhattan last spring. (Melanie Stetson Freeman/Staff/File)

What we do, what they know

By Editor / 08.11.13

The footprints and arrowheads left by Stone Age ancestors are data from which archaeologists piece together the prehistoric world. That was little data. Digital Age humans generate big data. 

IBM estimates that 90 percent of the data in the world has been created in the past two years alone. The data flows from tweets, GPS signals, online searches, security cameras, and on and on. When all that data is vacuumed up and analyzed, it can produce insights into everything from retail marketing to crime fighting, electricity management to public health. In a Monitor cover story, Robert Lehrman delves into the benefits and costs of Big Data.

Along with the efficiencies and clever new applications that Big Data has yielded come big concerns about privacy. As science historian George Dyson noted in a recent article on Edge.org, “If Google has taught us anything, it is that if you simply capture enough links, over time, you can establish meaning, follow ideas, and reconstruct someone’s thoughts. It is only a short step from suggesting what a target may be thinking now, to suggesting what that target may be thinking next.”

Even if you scrub all the cookies from your browser, ditch your cellphone, steer clear of social media, microwave your modem, and relocate to Walden Pond – just by being an earthling you’ll still leave a data trail. You’ll need to shop for food – or at least for seed to grow your own. Security cameras will see you, and the cash register will record your purchase. Selling any of that produce to buy shoes? Unless you’re a scofflaw, you have to pay taxes (more data). And you’re not going to stop phoning Mom and Dad, are you? Even a pay phone generates a call record.

Few people opt for the hermit lifestyle. Cellphones, computers, credit cards, and other conveniences are useful, even essential. So most of us make a mental bargain. We assume there’s a data trail and that for the most part it is nothing to worry about. Those security cameras deter crime. Those cookie-generated behavioral ads on the Internet may seem a little too familiar at times, but we’re adept at tuning out ads.

Even as Edward Snowden’s revelations of the scope of spying by the National Security Agency have boosted Americans’ concerns about privacy, recent opinion polls show no groundswell against the practice – perhaps because of continued concern about potential terrorism, perhaps because of a sense that only bad guys need worry.

But history shows that intelligence assets aimed at foreign threats can be employed domestically (see COINTELPRO, Watergate, post-2001 warrantless surveillance – and far more egregious examples in other countries). Nor is it hard to imagine a mid-level employee in a government agency or private company (e.g., Mr. Snowden or Pfc. Bradley Manning) snooping out of curiosity or as a self-appointed whistle-blower. And ongoing phishing, spamming, and hacking problems on the Internet are a reminder that data hijackers are plentiful.

Here’s an easy prediction: Big Data is only going to get bigger. Every year, more sensors will produce more signals that will be more quickly analyzed. This will lead to more convenience. And more concern. Mr. Dyson – whose physicist father, Freeman Dyson, grappled with wondrous but fraught technologies such as nuclear energy – sums up the Big Data revolution this way: “Yes, we need big data, and big algorithms – but beware.”

The paradox of mental illness – the inability of the individual and those nearby to understand it and of doctors to treat it – makes it a problem with no simple solution. (Adrees Latif/Reuters)

Rethinking mental health care

By Editor / 08.04.13

Mental illness is a riddle within an enigma: A person dealing with it can be unaware something is wrong, unable to describe the problem, incapable of following a course of treatment, and ashamed of the stigma that accompanies it. Often, people in this state retreat into their own world. The writer Sylvia Plath recalled that telling someone about the depression she was experiencing was “so involved and wearisome that I didn’t say anything. I only burrowed down further in the bed.”

Those who live with, care for, or come into contact with a person in the grip of mental illness can be confused as well, not knowing how to help or when or if to intervene. In an earlier age, people considered mental illness to be demonic possession. The modern medical approach has oscillated between environment and heredity in trying to explain it and has employed everything from therapeutic conversation to isolation wards, powerful psychotropic drugs to disturbing operations in an attempt to cure it.

As late as the 1990s, doctors were obtaining permission from mental patients (by definition, this was not “informed consent”) to conduct experiments in which their medication was drastically altered so that researchers could observe acute episodes of the disease. Doctors said this was the only way to understand psychosis, since a patient couldn’t describe what was going on. Patients were, in effect, being treated like human guinea pigs.

Until the late 20th century, society “solved” the problem of mental illness by forcing those dealing with it into asylums and clinics. Those who had been committed – some with severe problems but others who were merely eccentric or occasionally troublesome – were out of sight and out of mind. And while people joked about the “funny farm” and “loony bin,” the dire conditions inside asylums, when finally exposed by journalists and reformers, shocked polite society.

All of which made deinstitutionalization seem progressive when it began in the 1960s. But outpatient treatment of the mentally ill has been largely inadequate and underfunded over the years, leaving families, friends, and individuals with mental issues to shift for themselves. Some eke out productive lives. Some live in the shadows. From time to time, a very few have fateful encounters with the outside world. Mass murders in Aurora, Colo.; Newtown, Conn.; and other places have raised new questions about whether enough is being done to help the mentally ill and to spot those who are potentially violent.

The paradox of mental illness – the inability of the individual and those nearby to understand it and of doctors to treat it – makes it a problem with no simple solution. In a Monitor Weekly cover story (click here for the story and here to subscribe to the Weekly), Amanda Paulson spotlights promising new efforts aimed at coaxing those experiencing mental problems into programs that gently support them and foster their reintegration into society. That seems to help in some cases. More aggressive intervention may still be needed in others.

If society has long struggled to figure out how to help the mentally ill, at least it has moved on from believing that anyone acting odd should be locked away. We seem to be moving beyond the laissez faire approach as well. New attention to mental illness is bringing it into the open – and that is giving rise to new ideas for dignified treatment. This isn’t the kind of problem that has ever gone out easily, but thoughtfulness, patience, and hope help. 

A farmer’s daughter plays in crops used for cattle feed in Awlad Yehia, Egypt. (Ann Hermes/Staff)

Tolerance: The Nile's age-old lesson

By Editor / 07.28.13

Civilization was born on the banks of rivers. The Indus, Yellow, Tigris, Euphrates, and Nile valleys nurtured agriculture, engineering, astronomy, trade, and generation after generation whose unrecorded lives form the strata of today’s world. Riverine cultures had to work out a basic social problem: ensuring that people upstream were fair to people downstream. Tolerance, even if it had to be enforced by the state, was the key.

Cities now thrive far from water sources. The networks of aqueducts and mains that feed our homes, factories, and offices are man-made rivers that most people scarcely notice. But Egypt is still directly connected to its alluvial past. The Nile remains as crucial to daily life as it was millenniums ago.

Ninety percent of Egyptians live along its banks. Winding through parched geography like the stem of a giant sunflower, the Nile made – and still makes – Egypt possible. Nowhere is that more evident than at the Nile’s First Cataract at Aswan. Turn your back on the sparkling river and its green and welcoming banks and all you see are sandy hills rolling toward a hazy blue horizon. Face the river and you see the temples that prove the depth of Egyptian history and its intimate relationship with the Nile.

Ancient Egypt lasted more than 3,000 years – far longer than the world we call modern. That was plenty of time to develop an elaborate culture and a system for continuous social stability. For most of that long history, the people of the Nile have worked out their problems peacefully, although in the background, from the days of the Pharaoh until the overthrow of Hosni Mubarak, a powerful military-backed establishment – an entity today’s Egyptians call “the deep state” – has ensured order.

In a Monitor cover story (click here), Kristen Chick travels from Aswan north through Upper Egypt, taking the measure of an important but often overlooked section of a nation in the midst of a profound civil crisis. Since the Tahrir Square revolution of 2011, tourism has collapsed, lawlessness has soared, sectarian conflict has worsened, apportionment of water and other vital resources has broken down, and Egyptians have been losing faith in their country and each other.

Whether it was right or wrong for the military to oust Egypt’s democratically elected president, the fracturing of society that Kristen documents explains why so many Egyptians either supported the takeover or remained silent. The country’s deposed president, Mohamed Morsi, was backed by the Muslim Brotherhood, but much of Egypt’s overwhelmingly Muslim population wearied of Mr. Morsi’s Islamization project as law and order fell apart. As Mohamed ElBaradei, the Nobel laureate and former head of the International Atomic Energy Agency, wrote in Foreign Policy magazine: “You can’t eat sharia.”

Two weeks after that article was published, Morsi was deposed. Mr. ElBaradei, now Egypt’s interim vice president, almost certainly was involved in the anti-Morsi coup. He and others tied to the deep state now have an exceptionally difficult job. They must stabilize Egypt without returning it to the repressive, military-controlled rule that preceded the revolution.

The Egyptians you’ll meet in Kristen’s journey are Muslims and Christians, farmers and tour guides, fundamentalists and secularists. Despite differences of class, politics, and religion, they drink from the same ancient river. Without tolerance, they know, Egypt would not exist – and will not continue. 

John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com.

Middle-schoolers in Eugene, Ore., performed at an Ellis Island immigration simulation last year. (Chris Pietsch/The Register-Guard/AP)

The making of Americans

By Editor / 07.07.13

All Americans are immigrants. Some arrived ages before there were visas and borders or even countries; most came after. Some arrived against their will; most arrived hungry for what lay ahead. As recounted in thousands of immigrant stories, the first days in the New World could be glorious, dizzying, and upsetting. Opportunity was abundant and freedom exhilarating. But language, laws, and customs could be puzzling. Natives could be brusque. Work could be tedious and dangerous.

When the speed and excess got to be too much, there was always a sanctuary of fellow immigrants, where faith, food, and conversation were familiar. From the outside, Little Italy, Chinatown, and every other ethnic neighborhood could seem strange, even threatening. In the early 20th century, Anglo-Americans worried that immigrants from southern and eastern Europe weren’t fitting in. They were creating separate cultures and threatening the status quo. This was not just paranoia. Anarchists and labor activists, many rooted in immigrant communities, challenged the power structure. Criminal groups operated out of ethnic communities. IQ tests appeared to show a gap between native- and foreign-born.

But earlier immigrants had also kept to themselves (Germans in Pennsylvania, Swedes in Minnesota), challenged the power structure (1776 for example), suffered their share of criminality (the Bowery Boys of the 1840s, the outlaws of the West), and were considered less intelligent, motivated, and hygienic than those who arrived before them. 

All the while, however, the assimilation engine was running. Music, manners, and food were sampled – gingerly at first, then creatively. Tacos with Vietnamese hot sauce? Why not? Accents altered, friendships kindled, rings were exchanged. It was not always smooth, but year by year families blended, neighborhoods integrated, new citizens voted, and the nation evolved.

As Congress considers legislation that could grant citizenship to millions of people, the question hanging in the air is whether the assimilation engine still works. Scholars such as the late Samuel Huntington of Harvard University and commentators such as Pat Buchanan have warned that the influx of Latin Americans risks dividing the country into two societies. Census data and social-science research – measuring everything from educational achievement to homeownership to intermarriage – say otherwise. 

As Stephanie Hanes’s report shows (click here), the process of assimilation is far from straightforward, especially among first-generation immigrants. Most flourish, some don’t – just like the native born.

“So why is it that some residents in some states with large new immigrant populations believe that integration is not occurring?” asked a 2010 report by the Center for American Progress. “One reason is that new arrivals increased over a short period while assimilation, by definition, can only be observed over time.”

If all Americans are immigrants, we all have an immigrant story. My father’s parents, for instance, arrived from Italy in the early 20th century; my mother’s family came from Germany in the mid-19th century. Along the way, the name got changed. There is no “Y” in the Italian alphabet. So Yemma is an American name – as are Smith, Garcia, Yee, Shapiro, Shaloub, Nguyen, Patel, Obama, and every other name in the American phone book.

If I may speak for them: It isn’t always easy becoming an American, but it’s always good to be one.

John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com.

Chris Bull of Circle A Cycles in Providence, R.I., builds a made-to-order bicycle. (Alfredo Sosa/Staff)

It's the 'Bicycle Spring'

By Editor / 07.01.13

Primitive tribes that could barely feed themselves put enormous effort into grandeur – monuments, fortifications, catapults – the bigger the better to impress allies and intimidate enemies. Modern nations build aircraft carriers and skyscrapers for the same reason. Humans have a thing about scale. It’s hard to ignore a cathedral, superhighway, jumbo jet, or Cadillac Escalade.

So let’s talk about the opposite. This week’s cover story is about an almost two-century-old contraption that isn’t at all formidable. A bike is thin and frail and awkward looking, even with a Tour de France athlete aboard. It is quintessentially human in scale. It holds one person (two if you are romantic, though I’ve seen four or more riders in developing countries) and converts muscle into locomotion more efficiently than any other vehicle. 

For a while in the late 19th century, bikes were the wonders of the age. Pre-automobile Henry Ford rode one. The pre-airplane Wright brothers built them. But for most of the 20th century, two-wheelers retreated before the onslaught of increasingly impressive quadricycles. Bikes carved out a niche as kids’ toy, college necessity, and weekend amusement. Cars ruled.

During the mid-1970s, I experimented for a week with the bicycle-only life in Dallas. It was fun but also harrowing, sweaty, and lonely. I was the freak on the streets. Pickup trucks and muscle cars were the norm and didn’t mind letting me know it. By Week 2, sheet-metal armor seemed like a wise move.

That was then. Now riding to and from work has slipped over the line from colorful and a bit odd to normal and, in some workplaces, expected. Vulnerable lone cyclists have grown into solid ranks of riders. What’s behind that? Fresh air, exercise, and, most important, zero emissions. Bikes are greener than a Prius or Tesla. The International Bicycle Fund calculates that an average person on a bicycle can travel three miles on the caloric energy of one egg. A person walking the same distance requires three eggs. A fully loaded bus burns the equivalent of two dozen eggs per person. A train ... well, we’re talking lots of eggs. 

As Ron Scherer and a team of Monitor correspondents show (click here), urban planners increasingly see bikes as an integral part of a transportation system. Cities are not just building bike lanes but facilitating bike sharing. The Centre for Advanced Spatial Analysis at University College London has a fascinating interactive map showing bike sharing worldwide (click here – and a comprehensive map can be found here). While the sheer numbers of urban rental bikes are still only in the hundreds of thousands, there are bike-sharing programs from Taipei, Taiwan, to Fort Worth, Texas.

This is the Bicycle Spring. That is both plaudit and caution. As in other people-power movements, bicyclists have been so long oppressed by cars that they have a well-earned chip on their shoulder. They’ve had to endure lane swervers, door openers, hostile drivers. Every rider dreads potholes, slick roads, and unleashed dogs. Is it any wonder that some cyclists have gone militant? So here’s the caution: Use the newfound strength-in-numbers wisely. A stoplight applies to bikes, too. Weaving in and out of traffic isn’t fair or wise. Sidewalks and crosswalks are for pedestrians, who have their own issues with oppression.

We’ll get the hang of this. Bikes are no longer marginal enjoyments. They are in the mainstream and staying there.  

John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com. This article has been updated to include a link to a more comprehensive map of bicycle sharing programs worldwide.

Marchers call for the release of jailed US Army Pfc. Bradley Manning outside Fort Meade, Md. (Jonathan Ernst/Reuters)

Feeling for freedom's limits

By Editor / 06.23.13

People across the world stand in front of tanks, brave tear gas and rubber bullets, and sacrifice their lives for freedom. Freedom is among humanity’s deepest aspirations, a concept understood in every heart and revered in every society.

But what exactly is the measure of freedom?

In early 1941, President Franklin Roosevelt declared that a secure world rested on four essential human freedoms. Two were already enshrined in the US Constitution and familiar to generations of Americans: freedom of expression and worship. The other two were novel, even radical at the time. One was freedom from want, which Roosevelt described as the right of everyone to “a healthy peaceful life.” The other was freedom from fear, meaning that “no nation will be in a position to commit an act of physical aggression against any neighbor.”

FDR’s four freedoms are echoed in the preamble of the United Nations Universal Declaration of Human Rights. Four of Norman Rockwell’s most beloved paintings – the working-class guy standing to speak at a public meeting, worshipers’ heads bowed in prayer, a family gathered for Thanksgiving dinner, and parents tucking in their children while the dad holds a newspaper with the words “bombings” and “horror” in the headline – illustrate those four freedoms.

The struggle for freedom to and freedom from has propelled history for the past 72 years. It is behind virtually every news event. You can see it in the successive fights against fascism and communism. You can see it in the campaign for equal rights for African-Americans, women, and dozens of groups once excluded from full participation in self-government and the pursuit of happiness. You can see it in this week’s issue of the Monitor Weekly.

The quest for freedom from want has spurred worldwide progress against hunger, poverty, and disease. It explains, for instance, the massive mobilization against AIDS in Africa and other parts of the world as described by Jina Moore in a Monitor cover story. With the disease increasingly under control thanks to a sustained public health effort, Jina shows, the mothers, fathers, and children once crippled by HIV are now freer from fear. The newspapers they clutch no longer headline the horror of the disease.

Freedom from aggression, meanwhile, is at the heart of new questions about the US National Security Agency surveillance program. Terrorism is a very real public concern. But does national security require that every phone call and Internet click be saved? A Republican and a Democratic president – and a succession of members of Congress and a majority of the public as measured by current opinion polls – think so. But the revelation of the scope of the NSA’s data mining has touched off a national debate.

Absolute freedom is an ideal. But in the relative world of humanity, freedom’s extent and limits are always being reexamined and adjusted. Should all speech, including obscenity and hate speech, be free? Is there a point at which religious worship imposes on other people’s freedoms? Can a social safety net be maintained without fostering dependence or bankrupting the treasury? And where’s the line between security and liberty?

Asking and answering those questions is what we do in a free society. And after we decide, we’ll ask and answer again.

Members of a dance team presented by the Hindu Student Association prepared for competition at a Bellaire, Texas, high school. (Eric Kayne/Special to The Christian Science Monitor)

Public schools, private beliefs

By Editor / 06.17.13

Perception and reality are from different planets. Depending on the commentators you listen to or the news sources you read, for instance, you might believe that religion has been hounded out of American public schools. Alternatively, you might think that the only way people talk about religious differences is with aggressiveness, defensiveness, and misunderstanding.

The reality may surprise you. In her cover story, Lee Lawrence documents the many ways that religion, faith, and prayer are present in public school systems. In the half century since the Supreme Court banned state-sponsored school prayer, Americans have traveled a path from believing there should be virtually no religious expression in schools to a place where the study of religion is increasingly accepted as part of a student’s normal acquisition of knowledge – and where prayer, discussion of faith, and religious training are as much an option in extracurricular activities as are the science club, pep squad, and 4-H.

In class, students are exploring the varieties of religious experience, the history of religion, and the need to understand other faiths in a diverse society and global economy. Kids do not, however, pray together during class – although nothing has ever prevented an individual from praying on his or her own, as long as it is not required or encouraged by teachers.

Separation of church and state remains a pillar of American life, clearly articulated in the First Amendment to the Constitution. But while the state must not back any particular faith – or even faith itself – it also must not restrict expressions of faith. The First Amendment says Congress “shall make no law respecting an establishment of religion,” and goes on to say there shall be no law “prohibiting the free exercise thereof.”

Colonists fought for that two-part principle more than a century before the Constitution was written. In 1657, a group of non-Quakers in New Amsterdam stood up for the rights of Quakers at a time when the colony’s governor, Peter Stuyvesant, was trying to stamp out any religion other than the Dutch Reformed Church.

In what is known as the Flushing Remonstrance (Flushing is a neighborhood of what is now Queens), the colonists argued that just as Holland allowed religious freedom to “Jews, Turks, and Egyptians,” New Amsterdam should do so for “Presbyterian, Independent, Baptist, or Quaker.”  

The governor promptly jailed four of the signers and forced them to recant. More colonists stood up for freedom to worship; more arrests followed. By 1663, the Dutch West India Company had had enough, telling Stuyvesant to stand down because “The consciences of men at least ought ever to remain free and unshackled.”

The principle that took root in New Amsterdam was that belief should not be imposed and believing should not be stifled. This is tricky, especially in the schools. Within a classroom, a teacher could cross the line and become a religious advocate. Peer pressure can try to force conformity. The way one looks and dresses, what one consumes, how observant a person is – these can easily cause misunderstanding, especially among the young.

Careful education about religion is important in a world where all types of believers and nonbelievers coexist. Freedom from imposed belief and freedom to believe dwell on the same planet – a planet where consciences must ever remain free and unshackled. 

John Yemma is editor of the Monitor. He can be reached at editor@csmonitor.com.
