It is a truth universally acknowledged that a happily married man writing about courtship must be in need of a serious talking to. Nothing so lacks credibility, especially with younger readers. Oh, sure, he may on occasion text his wife a sweet nothing using “r” as a verb to show how with-it he is. He may throw caution to the wind and attempt to dance “the robot” at a nephew’s wedding. But he almost always will put air quotes around such not-really-modern-anymore fads to signal his actual distance from them.
For those reasons and more, it is probably best for me to send you directly to Eilene Zimmerman's fascinating and thorough exploration of modern dating.
Depending on your vintage and values, your view of 2012 courtship may range from quiet approval to mild alarm. There is, for instance, a definite standoffishness about marriage in the Millennial Generation (also known as Generation Y, meaning those born in the 1980s and early ’90s). But even if young people are waiting longer before saying their vows than any previous generation, they aren’t necessarily anti-marriage. What they most want, it appears, is to get marriage right.
“They’ve seen a lot of divorce in their parents’ generation,” Eilene told me the other day. “They’ve been through difficult holidays. But they’re really into family and marriage. They actually want to move to the suburbs and raise a family one day – just not now.”
Many have struggled through a bleak job market while carrying big college debt, so they are naturally cautious. Even the dreaded subject of sex is not what you may think in an age of ubiquitous contraception and noncommittal “hookups”: Despite a casual attitude toward intimacy, risky behavior is not something this well-warned generation embraces. In some ways, says Eilene, Gen-Y is like the famed GI generation that fought in World War II and built the postwar world. By midcentury, in other words, this may be a fairly conservative cohort.
Eilene is a Gen-Xer. I’m a baby boomer. Talking about generations is unavoidable in trying to understand society. The drawback, of course, is that as with every other way humans categorize themselves – race, gender, religion, class – there are tremendous variations within the categories. Music, dress, slang, and hairstyles may characterize an age group, but those are superficialities. Individuals within Gen-Y are carving out distinct paths when it comes to tricky issues like intimate relations. That was true even of us boomers, not all of whom went to San Francisco with flowers in our hair.
In the not too distant future, Millennials themselves may wonder about the dating scene for the next demographic cohort, Generation Z (aka Digital Natives). They, too, will have to use air quotes when it comes to current mores.
Society is a large and interesting blend of ages, communities, families, and individuals. But when it comes to affairs of the heart, most people are looking for the same thing: deep and abiding commitment. The StoryCorps project has assembled a collection of long-running love stories in a book titled “All There Is.” The collection echoes the testimonials from elderly couples that were sprinkled throughout the 1989 date movie “When Harry Met Sally” – how we met, what we mean to each other, the relationship two people build throughout a lifetime. One couple in “All There Is” mentioned six nice things they learned to say to each other throughout their marriage: You look great. Can I help? Let’s eat out. I was wrong. I am sorry. I love you.
From neolithic times to the digital age, those have been words to live by.
So, Honey, if u r reading this, u look great. Happy Valentine’s Day. Let’s eat out tonight. I promise not to dance the robot.
John Yemma is the editor of The Christian Science Monitor.
Who doesn’t love an invention? Think of the light bulb, which had never shone in all of history until Thomas Edison switched it on on Dec. 31, 1879. Think of lasers, helicopters, microchips, elegant equations like E = mc², and even modest wonders like batteries, Velcro, and air conditioning. We honor inventors, enrich them, ask them about the meaning of life.
Reinvention isn’t in that league. The tip-off is the “re,” meaning it’s been done before. If invention is the dazzling hit, reinvention usually begins as a miss. But a miss that is taken back into the workshop, rethought, reworked, and brought out for a second, third, or fourth try can change the world. The light bulb, for instance, stayed lit only after failing at least 6,000 times. Each time, Edison and his co-workers took stock of what went wrong, made improvements, and tried again. His invention was a serial reinvention.
For many workers in the wake of the 2007-09 recession, reinvention has not been by choice. Nor has it been easy. But here’s what reinvention always is: necessary.
For one thing, economic survival depends on reinvention. Job security has all but vanished in most industries. Owning a home or shoveling money into a 401(k) is not the path to financial security it once was believed to be. That’s bad news. The good news is that so many people have experienced financial setbacks, layoffs, or job shifts in recent years that failure is clearly not a matter of character flaws.
Failure can be an opportunity. Saying that, of course, is easier than living through it, especially if you’re out of work, in debt, or clinging to a job you don’t like because you need the money. But reinvented careers can be the ones in which people learn more about who they really are and make a go of something they really love. (You can see evidence of that in this special report in The Christian Science Monitor Weekly news magazine.)
The key seems to be to stop mourning the past, honestly assess your skills, envision where you want to be, gain new skills, and then go for it. You may not end up where you think you should, but you will be learning with every step.
As Abraham Maslow, who developed the famed “hierarchy of needs” – a pyramid that everyone climbs from basic requirements like food and water to an apex of creativity and achievement – put it: “One’s only failure is failing to live up to one’s own possibilities.”
Besides, people who have seemed touched by greatness still have lives to live. Like athletes and musicians, most inventors are highly productive in their youth. At 25, Albert Einstein experienced a “year of wonders.” Working as a patent clerk, he achieved conceptual breakthroughs that revolutionized modern physics. Einstein led a long and useful life, and some of his most profound discoveries happened long after he was 25. But the sum of his brilliance during that one year was never repeated.
I once asked an octogenarian physicist about his year of wonders. Among other things Martin Deutsch, who taught at the Massachusetts Institute of Technology, had discovered positronium, an elemental form of matter. “In 1951 and ’52, I had a great creative outburst that I’ve never had again,” he told me. “It was a virtuoso performance.”
His colleagues thought he would win the Nobel Prize. He was not bitter that he did not. Instead he reinvented himself as a teacher and was satisfied that “there are people who have become what they have become because of what I have taught them.”
At the end of our conversation, he pointed out a spruce tree in the corner of his garden. He had rescued it when his neighbor threw out a bunch of potted plants. Day after day it had sat there, refusing to turn brown. “It wanted to live,” he said.
Martin Deutsch died in 2002. His students are pushing physics forward in the 21st century. And that spruce, which didn’t make it as a potted plant, flourished.
John Yemma is the editor of The Christian Science Monitor. To comment on this column or anything else in the Monitor, please e-mail email@example.com.
Solar, wind, hydro, and geothermal are widely considered benign energy sources. For the most part, they are. They harness nature without producing noxious emissions or significant waste streams. They don’t require strip mining, punching a hole in the seabed, fracturing bedrock, or splitting atoms. From sailboats to south-facing gardens, hot springs to millstreams, green energy’s friendly reputation predates hydrocarbons and fission by centuries.
But in their modern application, even these ancient energy sources have downsides. The manufacture of most photovoltaic cells, for instance, involves nitrogen trifluoride, which the Scripps Institution of Oceanography says is a potent greenhouse gas when it escapes into the air. Solar cells also block sunlight from grass and flowers that otherwise would bask in it. When you dam a river, you constrict fish migrations and deprive alluvial plains of nutrients. Geothermal often means power plants atop scenic areas. And wind, the subject of this week’s cover story, needs enormous wind turbines. Birds and bats fly into them. Noise and visual pollution can be annoying.
Wait. I know what you are thinking: Green energy drawbacks are tiny compared with Chernobyl, Fukushima, the Exxon Valdez, the BP oil spill, and global warming. Absolutely right. But part of the reason the drawbacks are minor is that green energy is still a fraction of overall energy production. A few windmills on the Zuider Zee are as charming as tulips and wooden shoes. But when you erect acres of wind turbines, you’ve got a scale problem.
Consider the world before the internal combustion engine. Today, most people consider the automobile a Faustian bargain, a huge convenience that is nevertheless blamed for altering our landscape and atmosphere. In its first years, however, the horseless carriage was not just a technological marvel but an answer to a significant crisis. As the urban population exploded in the 19th century, so did the urban horse population. The effect on the environment, public safety, and public health was awful and heading for catastrophic, writes Eric Morris in an excellent 2007 article in the University of California Transportation Center’s Access magazine: “One New York prognosticator of the 1890s concluded that by 1930 the horse droppings would rise to Manhattan’s third-story windows. A public health and sanitation crisis of almost unimaginable dimensions loomed.” And horses, which could rear up or bolt for no apparent reason, were even more dangerous per thousand people than automobiles. They were also exploited mercilessly in the grim economics of 19th-century cities.
Black Beauty is a magnificent creature with an insignificant waste stream. Ten thousand are an environmental nightmare. Henry Ford helped solve that problem. But now we burn through so many hydrocarbons that we have a new environmental crisis.
Green energy is good energy. But it is not perfect. Think about wind turbines. Over the Christmas holidays, a 26-story-tall one popped up by the highway I take to work. It’s big – “War of the Worlds” big. That single turbine is a novelty. An army of them can become a huge controversy, as has happened off the picturesque south coast of Cape Cod, where a massive wind farm on Nantucket Sound is inching forward amid intense local opposition.
Now wind energy is exploding across the globe. In areas such as Mexico’s Isthmus of Tehuantepec, questions of exploitative development have accompanied the boom. That is likely to be the case in many parts of the developing world, which has all too frequently been despoiled to feed the energy and raw-materials needs of industrial nations. NIMBY issues and indigenous resistance are bound to multiply as fast as windmills, solar farms, and other green energy installations. The world is racing to diversify away from hydrocarbons to protect the climate while accommodating both the 7 billion people now on the planet and the 3 billion more likely to arrive by 2070.
We solved the horse problem with horsepower. Now we have a horsepower problem. Solving it presents a new set of problems. There’s always a job out there for a new problem solver.
John Yemma is the editor of The Christian Science Monitor.
Only a Scrooge would frown on child’s play. Or to be more modern: only a Severus Snape. Sure, children can goof off for a while. But the age at which they are required to put away childish things, straighten up and fly right, and master the Hogwarts curriculum keeps getting younger and younger.
Standardized testing, helicopter parenting, a society obsessed with good colleges and successful careers – there are plenty of reasons why time for make-believe and play-acting has been shrinking. In a new Monitor special report, Stephanie Hanes looks at overprogrammed childhood and the educators, parents, psychologists, and others who are trying to reverse the tide.
Stephanie is a veteran correspondent who has had demanding assignments for the Monitor in sub-Saharan Africa and other parts of the world. Her interest in child’s play – like her interest in the “Disney princess effect” (see our Sept. 26, 2011, cover) – was sharpened by her introduction less than a year ago to an important new journalistic source: Madeline Thuli Hanes Wilson.
Through the eyes of her daughter, Maddy, Stephanie says she has been seeing how modern childhood is too often torqued by commercialism and parental anxiety. “I was looking at the books that I’m reading to her and realized that, wow, so many of them are selling products,” she says. That’s not unlike the rampant merchandise tie-ins to girlhood that Stephanie reported on earlier.
At 11 months, Maddy already has an extraordinary number of organized activities she can take part in. Her favorites? “When people are on the ground interacting with her,” her mom says.
Now, plenty of parents in developing countries would like the opportunity to expose their children to organized activities. And toys and games aren’t evil. They can make a kid feel enriched, boost skills, and familiarize youngsters with the technological world they are entering. But relentless scripting of child’s play has its drawbacks.
Free time and make-believe boost physical development, socialization, and – most important – the imagination. A huge amount of what we value as a civilization comes from the what-if side of us. While we must follow rules and recipes, train ourselves and test our skills, our artistic side needs time to wonder, improvise, and dream. Productive writers from Shakespeare to Charles Dickens, Dr. Seuss to J.K. Rowling, have coupled imagination with discipline. Wolfgang Amadeus Mozart talked of musical ideas emerging when he was alone, sometimes when he was sleepless or taking a walk after a good meal. To muse and mull and eventually hear a symphony in his head, he said, “is perhaps the best gift I have my Divine Maker to thank for.”
Mozart might have been the most overprogrammed child of the 18th century. Under his father’s tutelage, he was adept at violin and keyboard by the age of 5, composing and performing for European royalty.
By today’s standards, he would have been locked and loaded for the Juilliard School since he was in diapers.
It takes both imagination and discipline to produce works as original as “The Magic Flute.” That winning combination is true not just of literature, music, and painting but of science as well. The scientific method is meant to prove or disprove a hypothesis. But the hypothesis – the hunch, the what-if – has to come from somewhere. Angels must be entertained.
The great thing is that play needs little in the way of investment or accessories. It just needs freedom to happen. Even at 11 months, Maddy has all sorts of play options, says Stephanie. One of them is going out and looking at trees: “We hold the leaves. This one is green. This one is brown. It doesn’t cost anything.”
John Yemma is the editor of The Christian Science Monitor.
Benjamin Franklin enjoyed no familial boost of wealth, fame, or education. He was one of 17 kids. His father was a candle-maker. In today’s terms, he started life well within the 99 percent. But through systematic study, practice, thrift, and most of all an optimistic embrace of possibility, he rose to become the famed intellect, scientist, statesman, diplomat, and writer honored on currency, in statues, and by namesake institutes dedicated to higher education.
Everybody should be a Franklin. India needs up to 100 million Franklins.
The world’s most populous democracy has been growing at a healthy clip, but a moment of truth is looming. In a mere eight years, 100 million more people – equivalent to the population of Mexico – will enter the Indian workforce. Vanishingly few of them will enjoy a familial boost of wealth, fame, or education.
In the old India, those 100 million would be a nightmare, a tsunami of humanity racing toward an overburdened infrastructure and social system. In the new India, they are possibly something else. It all depends on where they sit on the scale that runs from basic need to productivity to creativity. It all depends on what the magic of education can do for them.
Let me take you on a detour to explain why this is so important. In the late 1980s, I was on a reporting assignment in Japan, which at the time was considered an economic superpower in the way that China is today and India hopes to be. There was no doubt about the productivity of Japanese workers. The watchword at the time was kaizen, meaning “continuous improvement,” and Japanese workers seemed to make everything better – TVs, cars, Walkmans, fax machines.
But Naohiro Amaya, a government adviser on education, was worried. From the mid-19th century until the late 20th century, he said, Japan had succeeded in producing a high number of moderately educated people. That helped raise the overall capabilities of Japan as it transitioned from feudalism to industrialization. “Standardized people of pretty high quality worked pretty well,” he said, “but now they won’t meet the demand of future Japanese society.” It was not enough to be better and better at what already existed. Japan needed Franklins – innovators who would ask basic questions, experiment, see the unseen. Without breakthrough thinkers, the Japanese miracle would stall.
Japan has been stalled for two decades. Now think of the stakes in India: Japan’s population is decreasing; India’s is growing faster than that of any other nation on the planet.
Some nations are blessed with natural resources. Some have a legacy of wealth by virtue of past conquests or economic achievements. But the ones that prosper generation after generation have a culture of Franklins – smart people eager to get smarter. Economist Gary Becker of the University of Chicago, a pioneer in the study of human capital, notes that “large increases in education and training have accompanied major advances in technological knowledge in all countries that have achieved significant economic growth.”
India’s challenge is epic in scale. The Monitor’s Ben Arnoldy, who has just completed a three-year assignment in India, produced a special report on what India faces. Ben notes, “Aside from the eye-popping number of colleges India hopes to build – some 50,000 in a decade – the country faces a challenge of reforming how it teaches to produce knowledge workers. The good news on this front is that many Indians love to debate and argue. The challenge is to get teachers who will allow that energy into the classroom.”
Education of any kind helps. But as Naohiro Amaya knew, the best kind of education doesn’t just produce productive workers. It frees thought. That is what India – and every other nation – needs most.
John Yemma is the editor of The Christian Science Monitor. To comment on this column, please e-mail firstname.lastname@example.org.
We can laugh about it now, but in 1999 many serious people were concerned about the Y2K computer bug bringing the modern world to its knees. Combined with the millennial turn of the calendar, Y2K prompted all sorts of predictions about what the new year would bring. Some were optimistic. Some were ominous.
Jan. 1, 2000, dawned. Nothing bad happened. Perhaps all the worry caused responsible people to fix their computers. Or maybe it was just hype to begin with. The main point is that no one got the future right.
We seldom do. In his 2011 book, "Future Babble: Why Expert Predictions are Next to Worthless, and You Can Do Better," Dan Gardner explores famous forecasts and why they failed. There are many reasons we get the future wrong, he says. For one thing, we predict based on the present, so the future is an extrapolation, a pumped-up version of today. That's why in the 1930s prognosticators saw flying cars but not the Internet, enormous battle tanks but not stealth drones.
We also generally stay with conventional wisdom. When everybody in the late 1980s said the Japanese economy was unstoppable, no one broke from the pack to predict the lost decades of the '90s and '00s. What does that say about China now? "Take a coin from your pocket," Mr. Gardner writes. "Flip it. You'll have a 50 percent chance of being right, which is as good as that of the experts."
That wisdom applies to the stock market, politics, technology, consumer tastes, and most complex events. So why bother predicting? Because we have no choice. We try to make sense of the unfolding future as we travel toward it. Gardner's best advice: Bring a large suitcase of humility on your journey. "Stride confidently forward in the dark and you're likely to feel quite pleased with yourself right up until the moment you walk into a wall."
At the turn of the year, Monitor correspondents took stock of the big stories of 2011: post-occupation Iraq, Afghanistan, the Arab uprisings, Europe's financial crisis, affluent China's newfound soul-searching, the economic rise of Latin America, Africa's surprising self-help boom. Each article stretches forward as far as possible. But by mid-January, we admit, the world may look different in ways we can't predict.
We know a few things for sure about 2012: Madonna is set to perform at the halftime of the Super Bowl; the Summer Olympics kick off in London in July; NASA's "Curiosity" rover is due to land on Mars in August; the US presidential election takes place on Nov. 6. Oh, and pencil in the end of the world for December.
Not really. I'm only mentioning the end of the world because in 2012 you are going to hear a fair amount of buzz about it. Remember how the Gregorian-calendar millennium occasioned all sorts of anxiety? A similar subcultural stir surrounds 2012 and the Mayan calendar.
Now, the Mayans had a fascinating civilization and left behind stunning ruins in the jungles of Central America. They were pretty good astronomers. But most civilizations have had pretty good astronomers, the better to predict when to plant and harvest. Even if the Mayan calendar is due to click over in 2012, there's no evidence they saw the world ending, despite what faddists say. And there's no evidence they were any better at predicting the future than we are.
Jan. 1, 2013, will dawn. Life will go on.
Why be so confident? Because a strong argument for life going on can be found in the lives and aspirations of people who are building the future. See, for instance, our three-part report profiling 30 remarkable people under 30 years old. Each in her or his own way is casting a line into the mid-21st century -- in agriculture, social media, green transportation, the arts, human rights, information technology, politics.
If you must extrapolate about the future, you couldn't do much better than to start with the clever, hopeful, humanity-embracing ideas that these 30 under 30 -- and millions like them -- are pursuing.