Sometimes a portal opens onto the world of legend. A stone is rolled away from an Egyptian tomb revealing a 3,300-year-old Pharaoh’s power and wealth. A Roman city emerges virtually intact from volcanic ash, its dining tables set for dinner, its comfortable lifestyle interrupted by natural disaster. The mummified body of a Stone Age hunter emerges from a glacier in the Alps, and modern forensics determines from the metallurgy of his ax, his DNA, and the pollen on his clothes that he was the product of a surprisingly sophisticated culture.
With most archaeology, pottery shards and bone fragments provide sketchy evidence of unheralded lives. But even with the abundant material found at places like Pompeii, the stories we tell about lost worlds are speculative. New tools and theories always come along to challenge what we currently think we know.
Then there is the archaeological holy grail, which exists at the intersection of science and faith: the veracity of the biblical account. Bible archaeology fascinates Jews, Christians, and many Muslims, as well as historians and anyone who studies and cares about the Middle East or, for that matter, Western civilization. For centuries, believers and skeptics alike have wondered if Bible history was accurate, if facts underpinned belief or if it was sufficient to extract spiritual meaning from myth and metaphor.
Take the story of David. Was his a writer’s tale of youthful heroism, adult treachery, and the quest for redemption recorded in those sublime psalms? That could make it a Canaanite version of Homeric myth. But David’s words and deeds support the monotheistic brand, the argument that the one God should be “exalted among the nations.” How has his story come to be so influential if he was just a wordsmith, if he wasn’t perhaps a great king? So did his life unfold more or less as the Bible says?
In this Monitor cover story, Christa Case Bryant takes us to an archaeological site southwest of Jerusalem where investigators have been sifting through what appears to be evidence of a Hebrew kingdom 3,000 years ago. What they’ve found (and haven’t found: no cultic figurines, no pig bones) might support the belief in a united kingdom under Saul, David, and Solomon that stretched from the Sinai to southern Lebanon and from the Mediterranean Sea to beyond the Jordan River.
Even for secularists, that 10th-century BC kingdom is important. It is a part of the historical case for the Jewish return to the Holy Land. So you can see some of the implications of the dig at Khirbet Qeiyafa.
Few other cultures have as enduring a literary-historical tradition as that found in the Bible. But that doesn’t mean that individuals in other cultures didn’t experience the inspiration and drive for moral improvement that the Bible, at its best, advocates and chronicles. In almost every part of the world, we make our homes atop the remains of earlier people. The hopes and dramas, affections and beliefs of those past lives can also be winkled from the traces they leave in the strata.
All the stories we reconstruct require leaps of faith and a healthy regard for the provisional nature of what we know. But even as science helps us see more of our ancestry, we are unlikely to find hard evidence connecting human with divine. It takes a different kind of digging – using faith, not trowels – to arrive at what the psalmist called the “secret place of the most High.”
John Yemma is editor of the Monitor. He can be reached at email@example.com.
Have you noticed this pattern when dealing with a complex, intractable problem? You work through all the variables – some of which you control, most of which you don’t. You furrow your brow, break a dozen pencils, hit your head against multiple walls, and frequently drift into magical thinking about a breakthrough that wipes the problem out.
Even if the best minds of a generation keep at it, the problem persists. Then one day, you look around and realize that the problem is gone.
That’s the way the energy crisis seemed. After almost a century of abundant fossil fuel, supplies tightened in the early 1970s, prices soared, economies staggered, and that looked like the future as far as the eye could see. Jimmy Carter called the energy crisis the “moral equivalent of war.” Oh sure, maybe someone could dream up a breakthrough – the “cold fusion” device that Stanley Pons and Martin Fleischmann announced in 1989, for instance. But like the “Mr. Fusion” engine in the movie “Back to the Future,” that was fantasy. Year after year, the energy problem remained unsolved.
Have you looked around lately? You already know that hydraulic fracturing, while controversial, has revolutionized gas and oil extraction in the United States and other parts of the world. Solar arrays and wind turbines are popping up everywhere. There are promising new nuclear technologies under development. And in a Monitor cover story, David Unger shows you an especially unheralded energy revolution that has crept up on us.
It sounds a little dull to call it by its traditional name: conservation. This is conservation with brains. Some call it the “enernet” revolution because it is driven by a combination of Internet, microelectronics, more-efficient devices, and fast feedback of real-time data. The result: dramatic savings in electricity and fossil fuel for businesses and consumers. And it is a trend that is only in its infancy. Tech, data, and energy management improve continuously.
This is big. And it is big precisely because it isn’t magical. It is the application of smart technology to everything from power grids to home thermostats. By networking these devices, data can be rapidly analyzed and energy precisely deployed. One vivid example: lights that come on in a warehouse just as a forklift is rolling by. Why light the whole building throughout the day?
The energy-efficiency revolution is the way most progressive revolutions unfold. There were no parting clouds and trumpet blasts. Millions of people applied themselves, swapped ideas, combined and recombined them, tested, measured, learned – and then rinsed and repeated. While some had noble motives, self-interest has done the heavy lifting: Big money is being saved through intelligent energy management.
So what other complex, intractable problems might be making imperceptible progress? Humanity has a long list: climate change; economic stagnation; disease of body and mind; religious intolerance; oppression based on race, sex, ethnicity, or place of origin. There’s no reason to expect such problems to be solved overnight. There’s every reason to keep at the problem solving – and to expect that one day we’ll look up and notice that today’s overwhelming worries have all but disappeared.
John Yemma is editor of the Monitor. He can be reached at editor@CSMonitor.com.
Here’s a trick anybody can do: Slip a sheaf of papers into a metal box in Anchorage, Alaska, or Oslo, Norway, and a few days later it will be in someone’s hands in Cape Town, South Africa, or Albuquerque, N.M.
OK, you’ll need a stamp, but what the postal service does is still something of a miracle. On foot, on horseback, by train, plane, and truck, postal carriers have faithfully delivered the news around the world since the days of Cyrus the Great. Herodotus marveled at how undeterred these couriers were by snow, rain, heat, or darkness.
Paper needs to be hand-delivered. Information doesn’t, not in the Digital Age. And a news magazine is information.
Wait. I know what you’re thinking. The next sentence isn’t going to announce that the Monitor is leaving print behind. We’re committed to print. But we have a request: Take a look at our new Monitor Weekly digital edition. It’s free to Monitor print subscribers, it takes full advantage of digital delivery, and – yes, I’ll say it – it is faster and richer and more economical to produce than the print version.
If you have a tablet (an iPad running iOS 6 or later or an Android device running OS 4 or later), you can download an app from Apple’s App Store or Google Play. No tablet? Go to CSMonitor.com/CSMApp, click “manage your account,” fill in the subscriber form, and then click on “current issue.” That’s not as smooth an experience as on an app, but it is still timely and accessible wherever you are. And here’s what the new app features:
• Fast access. At 5 a.m. on the Wednesday before the cover date (for the Sept. 30 issue, that would mean Sept. 25), a new Monitor Weekly is ready to read. That makes the publication more timely and more consistently available than if it arrived via mail. It makes the magazine available at the same time everywhere in the world.
• A new feature called The Daily Feed that keeps you updated on important Monitor news and features. This lets you read the weekly while staying in touch with current events.
• New navigation that gives you tips on browsing, shows you where you are in the magazine, lets you view pages and photos (much more vivid when backlighted) in portrait and landscape aspects, and makes it easier to move among sections and articles.
You can also bookmark articles, check out slide shows and movie trailers, follow links from our book reviews to buy a book, and e-mail a letter to the editor. And archives of past issues are only a tap away. We think this is a superior reading experience with enhancements that can’t be provided in print.
Whether you prefer the print or digital edition, you are supporting The Christian Science Monitor, with its unique mission to report clearly and compassionately on humanity’s sometimes difficult, often inspiring quest for freedom.
So, while we stand behind the print version of the Monitor Weekly, we hope you’ll agree that our digital app is a lively, timely, versatile way to read it. Wherever you are in the world, you can use the Monitor to stay current and go far.
War is an overused word. Real war – as opposed to “war” in the form of a football game, neighborhood spat, political debate, or public campaign against litter, illiteracy, poverty, or mosquitoes – is humanity at its worst. I’ve never fought in a war, but I have seen it up close on several occasions, most vividly in Lebanon in 1982. If civilization is the year-by-year building of families, communities, commerce, and art – the hum of life at cafes and shops, schools and playgrounds – then war is anti-civilization. It is a machine that hurts people and breaks things.
Because large portions of life now take place on the Internet, the introduction of war into cyberspace was perhaps inevitable. In a Monitor cover story, Anna Mulrine examines the Pentagon’s increasing emphasis on cyberwarfare. The number of specialists able to defend US computer networks – and thus the power grids, financial systems, and critical infrastructure of the nation – and also, if necessary, attack the networks of rivals, is programmed to increase fivefold by 2015.
As with most Defense Department projects, spending on cyberwar is ballooning under the “better safe than sorry” rubric. No one wants to be caught by surprise. And as with nuclear missiles, aircraft carriers, and drone fleets, the more impressive the cyberwarfare capabilities, the more they serve as a deterrent and persuader, even if never activated.
What’s not clear is whether the threat of cyberwarfare is still largely science fiction. Most hostile cyber-acts so far have been small and hard to pull off. The Stuxnet computer worm that infected Iranian centrifuges in 2010 was cleverly configured (probably by Israeli and American intelligence agencies). Stuxnet may have set Iran’s nuclear program back by a year or so, but it didn’t stop it. Iran in all likelihood now has taken countermeasures. Similarly, denial-of-service attacks that bring down websites by flooding them with traffic are a nuisance but hardly the stuff of shock and awe. The hacks carried out by groups such as Anonymous and the Syrian Electronic Army are more vandalism than acts of war.
Most experts see cyberwarfare circa 2013 as a niche capability. Referring to what is believed to have been an Israeli air raid on a Syrian nuclear facility in 2007, a recent RAND Corp. strategic study described cyber capabilities as “better suited to one-shot strikes (e.g., to silence a surface-to-air missile system and allow aircraft to destroy a nuclear facility under construction) than to long campaigns (e.g., to put constant pressure on a nation’s capital).”
Wars don’t always start dramatically, as World War II did for the United States on Dec. 7, 1941. They more often evolve, as they have in Syria since 2011, one bad deed leading to another. What we’re seeing with the new emphasis on cyberwarfare looks like the early days of air warfare in World War I – when fragile biplanes carried out showy, ineffective attacks. But as with any warmaking skill, cyberwar is likely to become more efficient and deadly in time. The US Cyber Command and the 24th Air Force have, in effect, added a fifth theater to the military’s land, sea, air, and space capabilities.
War may be something that humans still have to wage – at the least, for self-defense and to halt injustice. But war, even in cyberspace, still has one age-old purpose: to hurt people and break things.
John Yemma is editor of the Monitor. He can be reached at firstname.lastname@example.org.
Forceful leaders are often known by only one name – Caesar, Napoleon, Stalin. They often pick up suffixes like “the great,” “the magnificent,” or “the conqueror.” They’ve got style. They’re bold, charismatic. They exult in glory, crave applause, and specialize in grand gestures. These guys (they’re usually guys) bestride the narrow world and change the course of history – though not always for the better.
In a Monitor cover story, Sara Miller Llana profiles a leader who is arguably the most powerful woman in the world. Germany’s Angela Merkel, as you’ll see, is cut from a different cloth. She is quiet. She listens, works incrementally, and rarely makes a show of her leadership. A scientist by training, she grew up in the straitened society of East Germany.
The Germany she leads is the economic engine of Europe – which is making the Germany-dominated European Union, despite ongoing debt and austerity issues, a globally competitive organization of 500 million people.
Crucially, the woman who looks set to serve a third term as Germany’s chancellor is cautious. Like most Germans, she has a fundamental understanding of the problems of overreaching and the poison of personality. Like most East Germans, she knows that nations are not forever.
Those qualities of quiet leadership are the same ones distilled by Joseph Badaracco, a professor at Harvard Business School who has studied this form of leadership (see “Leading Quietly: An Unorthodox Guide to Doing the Right Thing”). Quiet leaders, he says, are known for their patience, care, and incrementalism. Albert Schweitzer is one example. Dr. Schweitzer believed that public action was overrated and that small and obscure deeds mattered most. The sum of them, he said, “is a thousand times stronger than the acts of those who receive wide public recognition.”
Even heroic figures, Professor Badaracco notes, often do an enormous amount of organizing and consensus-building that is overshadowed by their dramatic moment on the world stage. He suggests we not concentrate so much on great figures and riveting events. Instead, he says, pay attention to people who are working on complex tasks, who shun oversimplification, let others take credit, and understand that they must navigate uncertainty.
“The world is so complex and frustrating that it is natural that people want someone who cuts through all that,” Badaracco says. But while decisiveness is cool, carefulness can be better. Even delay can be a virtue if the time is used for intelligent analysis and consensus-building.
Boring, right? There aren’t a lot of epic poems or blockbuster movies about quiet leaders. But you know them when you see the organizations, businesses, and countries they lead – collections of creative individuals living and working at high levels of productivity, purpose, and contentment. Which is not an argument for laissez-faire leadership. There still has to be a decider and buck-stopper. A leaderless organization drifts. Pre-Napoleon France and Weimar Germany show the danger of drift. A quiet leader like Angela Merkel wants neither drama nor drift.
Quiet leaders are totally engaged even if they are almost invisible. To see them, look at the people they lead.
John Yemma is editor of the Monitor. You can reach him at email@example.com.
You can’t say we weren’t warned. In the months and years before the 2008 financial meltdown, economic prophets saw trouble ahead. At first they politely begged to differ with the conventional wisdom that good times would roll on and on. Eventually, they were shouting from the rooftops.
As early as 1998, Brooksley Born of the Commodity Futures Trading Commission cautioned that financial derivatives were out of control. Throughout the early 2000s, William White of the Bank for International Settlements repeatedly advised central bankers (most directly in 2003 with Federal Reserve Chairman Alan Greenspan in the audience) that lax monetary policies and poor oversight were generating dangerous asset bubbles.
In 2003, Robert Shiller cautioned that the housing market was irrationally exuberant. Two years later, Nouriel Roubini said a devastating real estate bust was coming. That same year, Raghuram Rajan of the International Monetary Fund raised his voice about risky financial innovations. By then even Mr. Greenspan, the architect of the easy-money policies widely associated with the crash, was telling Congress he was worried, but his worry was the massive mortgage-backed securities trade fostered by Fannie Mae and Freddie Mac.
By 2007 alarms were going off everywhere. Nassim Taleb said interconnected global finance was the “black swan” that could herald a once-in-a-lifetime meltdown. Sheila Bair of the Federal Deposit Insurance Corporation raised red flags about subprime lending. CNBC’s Jim Cramer ranted (he likes to rant) that the Fed needed to wake up.
Then came the crash. In a Monitor cover story, Mark Trumbull revisits September ’08, interviews key players, and extracts five lessons from that harrowing – and still reverberating – event.
The pre-2008 warnings were clear. How could we have missed them? “Everybody missed it,” Greenspan observed in 2010, “academia, the Federal Reserve, all regulators.” Which was not quite accurate. Everybody who could have done something missed it.
The problem with prophets is that, by definition, they don’t have standing. They are marginal characters, voices crying in the wilderness. Most of the time they predict apocalypses that never take place and days of reckoning that never dawn. We chuckle and call them Chicken Little or Dr. Doom. But when we slam into a brick wall, as happened in ’08, we wonder why we didn’t listen.
We don’t listen because most of us are convinced that the bubbles we live in – bubbles of sunshine or gloom – will last forever. But they don’t. Change is always around the corner.
This is where prophets come in. They burst our mental bubbles before hard reality does. But because we can’t be sure who is worth heeding and who is just ranting, it is probably best to adopt the prudent policy of the biblical Joseph, who prophesied about the future of the economy he was managing: Realize you are always in a bubble. And prepare for what comes next.
John Yemma is editor of the Monitor. He can be reached at firstname.lastname@example.org.
Is Hillary Rodham Clinton running? What about Ted Cruz, Chris Christie, Rand Paul? The 2016 presidential race is already generating buzz. Meanwhile, can you (without turning to Google) name the prime minister of Sweden? The president of Indonesia?
Readers outside the United States occasionally nudge us to remember that America is not the center of the world. Imagine how it seems to someone in Canada, Japan, Australia, Germany, the Philippines, or any of the 190 other countries to face a steady stream of news about American controversies, concerns, Kardashians, and political races that are more than three years away.
We get it. The Monitor’s audience from the moment of its founding in 1908 was “all mankind.” Ideally, we would publish in many languages and give equal weight to news from all parts of the world. Practically, we’re a long way from that. Our American roots are real. We are US-based, and staffed largely by Americans. Most of our readers are in the US. And for better or worse, the US is by far the world’s biggest newsmaker.
To compensate, we strive for international perspective – and have done so since our founding. It is why we have correspondents that we turn to all over the globe. In a Monitor cover story, we take you around the world in search of the best ideas in education. If you’ve read the Monitor for a while, you’ll recall that we’ve applied this international perspective to issues ranging from the fight against terrorism to the fostering of innovation, from competing systems for national health care to global manufacturing and trade.
But I’ll be the first to admit that there’s a parochial angle to our international perspective. Education: The American school year is beginning. Terrorism: The Boston Marathon bombings. Health care: “Obamacare.” Supply chains: America’s voracious consumers. (As author Bill Bryson puts it: “The whole of the global economy is based on supplying the cravings of two percent of the world’s population.”)
If there’s no such thing as pure international perspective – no place to stand on the planet that is not, at heart, local – the striving for international perspective is undoubtedly healthy. It broadens our thinking. The 50 states of the United States of America are sometimes called “laboratories of democracy.” Different ideas are tried in these local jurisdictions. Things that seem good go national. Things that don’t stay local. The world’s 195 nations are laboratories, too.
* * *
As summer winds down (OK, as Northern Hemisphere summer winds down), most of us – no matter how far removed from our school days we are – feel that mixture of dread and excitement. Leaves are still green, but the chlorophyll factories are shutting down. The new generation of ducks and geese are airborne. Winter is not far away.
That’s the dread. The excitement is fall – dramatic, mellow, bracing, gentle. Fall doesn’t have the promise of arrival that spring has. It has the bittersweetness of departure. The poet William Cullen Bryant called it “the year’s last, loveliest smile.”
A few years ago, we asked readers to send us their fall foliage photos. We saw gorgeous images from New England, of course, but because Monitor readers are far-flung, we saw photos from the Pacific Northwest, Europe, and China. The best perspective, though, came from New Zealand. Buds were opening, blades were sprouting. It was spring there.
John Yemma is editor of the Monitor. You can reach him at email@example.com.
People change their minds for a million reasons. Sometimes a million people change their minds for one reason. Let’s go back to Aug. 28, 1963 – midway through one of the most tear-stained years in American history. It was the year fire hoses and police dogs were used against civil rights marchers, when Medgar Evers was murdered in Jackson, Miss., and four little girls were killed by a bomb blast at the 16th Street Baptist Church in Birmingham, Ala. Later that year, President Kennedy was assassinated.
Against that backdrop, hundreds of thousands of people descended on Washington, D.C. They were peaceful. They were polite. They were insistent about what had to happen. The centerpiece of the March on Washington was a speech like no other.
In 17 minutes, Martin Luther King Jr. swept through American history, recalling the broken promise of equality for all, “the fierce urgency of now” in gaining civil rights, and the unstoppable power of “meeting physical force with soul force.” His voice strengthened and his cadences built as he progressed through the refrain of “Let freedom ring!” to the now sacred peroration: “I have a dream” – of reconciliation, brotherhood, and colorblindness but most of all of an America living up to the true meaning of its creed that “all men are created equal.”
That late August day 50 years ago was a tipping point in history. Any honest observer had to acknowledge the moral imperative of racial equality. King’s dream was an inarguable vision for what America should be. Millions changed their minds. Within a year, the Civil Rights Act was law. Public spaces and workplaces changed. Discrimination was outlawed.
In a Monitor cover story, Carmen Sisson measures where racial equality stands in 2013. Progress has been indisputable. But if the era of stark injustice is a distant memory, many civil rights workers say subtle racism persists. King’s dream has become reality in some ways but remains a dream in other ways.
That squares with the view of another longtime observer of race relations (and an old friend and colleague). Wil Haygood has written about racial issues throughout his journalism career. His mother is from Selma, Ala. As a young reporter in Pittsburgh, he paid his own Greyhound bus fare to Washington, D.C., in 1983 to witness a commemoration of the 20th anniversary of the “I Have a Dream” speech.
Yes, racism persists, Wil says. But echoing America’s first black president in the wake of the verdict in the shooting death of Trayvon Martin, he also says there is no doubt that “the nation has moved further in front than retreated.” In a new book, “The Butler: A Witness to History,” Wil tells the life story of Eugene Allen, a black man of quiet dignity who joined the White House staff as a “pantry man” in 1952 and rose to White House butler, serving eight presidents. (A movie based on Mr. Allen’s life and starring Forest Whitaker and Oprah Winfrey has just been released.) Allen’s vantage provides a unique window on the history of the past 50 years.
One measure of how far we’ve traveled: “When Mr. Allen went to work at the White House,” Wil says, “he would go home to Virginia and have to use segregated facilities. Look at that – and then look at the astonishment of November 2008.”
If King’s dream is not fully realized, if it is still in part a dream, at least now it is the American dream.
John Yemma is editor of the Monitor. He can be reached at firstname.lastname@example.org.
The footprints and arrowheads left by Stone Age ancestors are data from which archaeologists piece together the prehistoric world. That was little data. Digital Age humans generate big data.
IBM estimates that 90 percent of the data in the world has been created in the past two years alone. The data flows from tweets, GPS signals, online searches, security cameras, and on and on. When all that data is vacuumed up and analyzed, it can produce insights into everything from retail marketing to crime fighting, electricity management to public health. In a Monitor cover story, Robert Lehrman delves into the benefits and costs of Big Data.
Along with the efficiencies and clever new applications that Big Data has yielded come big concerns about privacy. As science historian George Dyson noted in a recent article published in Edge.org, “If Google has taught us anything, it is that if you simply capture enough links, over time, you can establish meaning, follow ideas, and reconstruct someone’s thoughts. It is only a short step from suggesting what a target may be thinking now, to suggesting what that target may be thinking next.”
Even if you scrub all the cookies from your browser, ditch your cellphone, steer clear of social media, microwave your modem, and relocate to Walden Pond – just by being an earthling you’ll still leave a data trail. You’ll need to shop for food – or at least for seed to grow your own. Security cameras will see you, and the cash register will record your purchase. Selling any of that produce to buy shoes? Unless you’re a scofflaw, you have to pay taxes (more data). And you’re not going to stop phoning Mom and Dad, are you? Even a pay phone generates a call record.
Few people opt for the hermit lifestyle. Cellphones, computers, credit cards, and other conveniences are useful, even essential. So most of us make a mental bargain. We assume there’s a data trail and that for the most part it is nothing to worry about. Those security cameras deter crime. Those cookie-generated behavioral ads on the Internet may seem a little too familiar at times, but we’re adept at tuning out ads.
Even as Edward Snowden’s revelations of the scope of spying by the National Security Agency have boosted Americans’ concerns about privacy, recent opinion polls show no groundswell against the practice – perhaps because of continued concern about potential terrorism, perhaps because of a sense that only bad guys need worry.
But history shows that intelligence assets aimed at foreign threats can be employed domestically (see Cointelpro, Watergate, post-2001 warrantless surveillance – and far more egregious examples in other countries). Nor is it hard to imagine a mid-level employee in a government agency or private company (e.g., Mr. Snowden or Pfc. Bradley Manning) snooping out of curiosity or as a self-appointed whistle-blower. And ongoing phishing, spamming, and hacking problems on the Internet are a reminder that data hijackers are plentiful.
Here’s an easy prediction: Big Data is only going to get bigger. Every year, more sensors will produce more signals that will be more quickly analyzed. This will lead to more convenience. And more concern. Mr. Dyson – whose physicist father, Freeman Dyson, grappled with wondrous but fraught technologies such as nuclear energy – sums up the Big Data revolution this way: “Yes, we need big data, and big algorithms – but beware.”
Mental illness is a riddle within an enigma: A person dealing with it can be unaware something is wrong, unable to describe the problem, incapable of following a course of treatment, and ashamed of the stigma that accompanies it. Often, people in this state retreat into their own world. The writer Sylvia Plath recalled that telling someone about the depression she was experiencing was “so involved and wearisome that I didn’t say anything. I only burrowed down further in the bed.”
Those who live with, care for, or come into contact with a person in the grip of mental illness can be confused as well, not knowing how to help or when or if to intervene. In an earlier age, people considered mental illness to be demonic possession. The modern medical approach has oscillated between environment and heredity in trying to explain it and has employed everything from therapeutic conversation to isolation wards, powerful psychotropic drugs to disturbing operations in an attempt to cure it.
As late as the 1990s, doctors were obtaining permission from mental patients (by definition, this was not “informed consent”) to conduct experiments in which their medication was drastically altered so that researchers could observe acute episodes of their illness. Doctors said this was the only way to understand psychosis, since a patient couldn’t describe what was going on. Patients were, in effect, being treated like human guinea pigs.
Until the late 20th century, society “solved” the problem of mental illness by forcing those dealing with it into asylums and clinics. Those who had been committed – some with severe problems but others who were merely eccentric or occasionally troublesome – were out of sight and out of mind. And while people joked about the “funny farm” and “loony bin,” the dire conditions inside asylums, when finally exposed by journalists and reformers, shocked polite society.
All of which made deinstitutionalization seem progressive when it began in the 1960s. But outpatient treatment of the mentally ill has been largely inadequate and underfunded over the years, leaving families, friends, and individuals with mental issues to shift for themselves. Some eke out productive lives. Some live in the shadows. From time to time, a very few have fateful encounters with the outside world. Mass murders in Aurora, Colo.; Newtown, Conn.; and other places have raised new questions about whether enough is being done to help the mentally ill and to spot those who are potentially violent.
The paradox of mental illness – the inability of the individual and those nearby to understand it and of doctors to treat it – makes it a problem with no simple solution. In a Monitor Weekly cover story (click here for the story and here to subscribe to the Weekly), Amanda Paulson spotlights promising new programs aimed at coaxing those experiencing mental problems into treatment that gently supports them and fosters their reintegration into society. That seems to help in some cases. More aggressive intervention may still be needed in other cases.
If society has long struggled to figure out how to help the mentally ill, at least it has moved on from believing that anyone acting odd should be locked away. We seem to be moving beyond the laissez-faire approach as well. New attention to mental illness is bringing it into the open – and that is giving rise to new ideas for dignified treatment. This isn’t the kind of problem that has ever gone out easily, but thoughtfulness, patience, and hope help.