

Formal human 'fingerprints' on polar climate

By Peter N. Spotts, guest blogger for The Christian Science Monitor / October 31, 2008

The Advanced Microwave Scanning Radiometer (AMSR-E), a high-resolution passive microwave instrument on NASA’s Aqua satellite, shows the state of Arctic sea ice in this September 16, 2008, file photo.

REUTERS/NASA/Goddard Space Flight Center Scientific Visualization Studio


So you heard the Arctic was the only pole percolating? Last year, the UN-backed Intergovernmental Panel on Climate Change pegged Antarctica as the only continent on the planet where human influence on climate hasn't popped up. Now, it appears, human "fingerprints" are finally showing up on climate at the bottom of the world, as well as at the top. That's the latest word from a team of researchers in the US, Britain, and Japan.



For anyone who has followed changes in the Arctic, never mind changes in more-temperate areas of the world, this polar update is unlikely to come as a big surprise. For a sampler of what's happening up north, try the National Oceanic and Atmospheric Administration's catalog of changes under way there. But the team also finds a small but statistically significant increase in Antarctica's average temperatures – although that varies significantly by location on the continent. Even in the Arctic's case, a region feeling the heat far more dramatically than Antarctica, scientists haven't crossed the T's or dotted the I's by formally linking the changes to increased industrial emissions of greenhouse gases – mainly carbon dioxide.

To make that kind of tie-in, scientists conduct a climatological CSI effort dubbed an attribution study. Typically, a team gathers real-world temperatures and calculates how much each year's average was above or below the climatological "normal." Trends they are interested in appear as extended periods of above- or below-normal temperatures. Then they use one or more climate models to see which combination of factors best approximates real-world trends. The factors include changes to natural influences lying outside the atmosphere, such as variations in the sun's output and explosive volcanic eruptions, which periodically launch climate-cooling aerosols into the stratosphere. Also included: natural and human-induced changes within the climate system, including measured increases in greenhouse gases – mainly carbon dioxide – from industry and deforestation. Scientists conclude that CO2's growth as an atmospheric gas since the dawn of the Industrial Revolution is the main trigger for the warming the global climate has experienced.
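The anomaly-and-comparison logic described above can be sketched in a few lines of code. This is only an illustration of the idea, not the study's actual method: the numbers are invented, and a real attribution study uses formal statistical tests rather than the crude mismatch score used here.

```python
# Illustrative sketch of an attribution-style comparison.
# All data below are hypothetical, not real station measurements.

def anomalies(temps, baseline_start, baseline_end):
    """Express each year's temperature as a departure from the
    climatological "normal" (the mean over a baseline period)."""
    normal = sum(temps[baseline_start:baseline_end]) / (baseline_end - baseline_start)
    return [t - normal for t in temps]

def trend_mismatch(observed, simulated):
    """Mean absolute gap between observed and simulated anomaly
    series -- a crude stand-in for formal statistical tests."""
    return sum(abs(o - s) for o, s in zip(observed, simulated)) / len(observed)

# Hypothetical multi-year averages (degrees C) for a polar station.
observed     = [-18.0, -17.9, -17.7, -17.4, -17.0]
natural_only = [-18.0, -18.0, -17.9, -18.0, -17.9]  # model run: sun + volcanoes only
with_ghg     = [-18.0, -17.9, -17.6, -17.4, -17.1]  # model run: + greenhouse gases

obs_anom = anomalies(observed, 0, 2)
nat_anom = anomalies(natural_only, 0, 2)
ghg_anom = anomalies(with_ghg, 0, 2)

# The run that includes greenhouse-gas increases tracks the observed
# warming more closely than the natural-forcings-only run.
print(trend_mismatch(obs_anom, nat_anom) > trend_mismatch(obs_anom, ghg_anom))  # True
```

In this toy setup, the simulation with greenhouse gases "wins" the comparison, which is the shape of the conclusion the researchers draw from far richer data and models.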

The new study, published online Oct. 30 by the journal Nature Geoscience, falls into this attribution category. The team, however, added a new wrinkle. It took temperatures only from measuring stations that closely match spots over the polar regions where the models explicitly calculate conditions. This means "they were comparing apples and apples," says Andrew Monaghan, an atmospheric scientist who wasn't involved in the project. Dr. Monaghan hangs his professional hat at the National Center for Atmospheric Research in Boulder, Colo.

The team blended results from four of the most advanced climate models in the US, Britain, and Japan. First, they assumed only natural "forcings" on the climate. The simulated temperature trends failed to capture the rise in the number and magnitude of unusually warm years since the 1970s that the measurements show. Then they tossed in the measured increases in greenhouse gases. This time, the simulated temperatures showed the rise, even in Antarctica, where the ozone hole appears to have had a cooling influence over the continent.
