Stars' real ages shine through astronomers' infrared window

YOU may not have thought of it that way, but the surface of the sun you see shining in the sky is a fossil. It preserves the relative abundances of elements that characterized the interstellar gas from which the sun condensed some 5 billion years ago. Why then should the sun seem younger in this respect than interstellar gas clouds that have formed since then? These clouds, called nebulae, had appeared to have a nitrogen-to-oxygen ratio that would make them older than the sun.

Astronomers interested in the evolution of stars and creation of the chemical elements have puzzled over this for some time. Some even wondered if they understood the workings of stars like our sun as well as they thought they did.

Now, thanks to new infrared observations of the gas clouds made by scientists at the NASA Ames Research Center, the puzzle has been solved. The nebulae show their proper age, and a nagging anomaly in cosmic research has been removed.

Once again, the ability to observe with radiation that doesn't penetrate the atmosphere has given astronomers a useful new insight.

In this case, it's the window on the universe opened by infrared radiation. This is electromagnetic radiation with wavelengths longer than those of visible light but shorter than those of radio waves. Atmospheric gases, especially water vapor, absorb many infrared wavelengths. But over the past decade or so, instruments on satellites or aircraft have opened the infrared window for astronomers.

Ames scientists Edwin Erickson, Michael Haas, Robert Rubin, and Janet Simpson took advantage of infrared observing equipment on NASA's Kuiper Airborne Observatory to study the nebulae. Flying about eight miles high, the observatory gave them a good view of infrared emissions from nitrogen and oxygen in young gas clouds. These showed that the interpretation based on visible-light data was mistaken. There is no age anomaly after all.

According to modern theories of stellar evolution and element formation, hydrogen and helium appeared with the birth of the universe itself. But heavier elements form within stars. Thus as successive generations of stars are born, evolve, and die, they enrich the interstellar medium with the elements they create. This means that young nebulae that formed relatively recently should have a different mix of elements than, say, a star like the sun that formed from a gas cloud some 5 billion years ago.

The sun's outer layer, for example, has 12 percent as much nitrogen as it does oxygen, reflecting the makeup of that 5-billion-year-old nebula. The theory of how the galaxy's supply of elements builds up over successive generations of stars predicts that this percentage should be higher in younger nebulae than in the sun. But the visible-light data seemed to indicate that it was actually lower. Hence the puzzle. Now the Ames team has shown that the young nebulae do indeed have the expected higher ratio of nitrogen to oxygen - twice the solar value - and the theory is vindicated.

A Tuesday column. Robert C. Cowen is the Monitor's natural science editor.
