

Perseids bring fiery show to August sky




The rotation of Earth, combined with its summertime tilt, causes the shower's radiant – the point in the sky from which the meteors appear to streak – to climb each night from near the horizon in early evening toward the zenith in the hours before dawn. Viewing is best when the shower is nearly overhead. "Midnight to four, you'll see more" not only rhymes; it is Astronomy 101 when it comes to seeing the Perseids.


Felicitously, the moon will be absent from northern skies between Aug. 4 and 13, so there will be no moonlight to wash out the flares the meteors make as they burn up. This is no small point, since to the naked eye the Perseids shine in a dark sky at what astronomers call magnitude 3 (see the story below on the "apparent brightness scale"). Even thin cirrus clouds, though, can hinder viewing.

A dark backyard in the burbs on a clear night is a safe bet for a good look. Ideally, head for the country, far from city lights.

A measure of the dimmest stars

Light from the Andromeda galaxy, faintly visible to the naked eye in very dark skies, took 2-1/2 million years to reach Earth. And though Andromeda comprises billions and billions of stars, it is so distant that the unaided eye sees it only as a faint, tiny cloud in space.

There are two reasons stars appear unequally bright in the sky: individual stars lie at different distances from Earth, and they emit different amounts of light depending on their size and intrinsic thermonuclear properties. It wasn't until the past century that astronomers realized just how great the distances from Earth to the stars actually are.

But long before astronomers had the telescope or the many sensitive light-measuring instruments they possess today, they had a shorthand way of ranking the brightness of stars (and other objects, such as galaxies, even if they didn't yet know they were galaxies) as seen from Earth: a scale called "apparent magnitude," which indicates how bright objects appear to observers on Earth.

This apparent brightness scale is an old one, invented by the astronomer Hipparchus, who lived and worked 2,100 years ago in Alexandria, in northern Africa, then a hotbed of astronomical observation.

Hipparchus called bright stars, like Rigel and Betelgeuse in the constellation Orion, stars of the "first magnitude." The faintest stars he could make out he called "sixth magnitude" stars. (Remember, this was all by naked eye; the telescope wasn't invented until more than 1,600 years later.) Andromeda is a sixth-magnitude object.

Today, astronomers still use Hipparchus's scale as the basis for measuring the brightness of stars and other celestial objects, but it has been made much more exact. With the invention of the telescope, it became possible to extend the scale to stars much fainter than the sixth magnitude. The higher the number on the scale, the fainter the object. The Hubble telescope can record objects as faint as magnitude plus 32 or plus 33.
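On the modern scale, each magnitude step corresponds to a fixed ratio of brightness: a difference of 5 magnitudes is defined as a factor of exactly 100, so one magnitude is a factor of about 2.512. As a minimal, illustrative sketch (the naked-eye limit of magnitude 6 and the Hubble figure above are the only inputs taken from the text), a few lines of Python show how quickly the scale spans enormous ranges of brightness:

    def brightness_ratio(m_bright, m_faint):
        # How many times brighter the first object appears than the second.
        # On the modern scale, a gap of 5 magnitudes equals a factor of exactly 100,
        # so each single magnitude is a factor of about 2.512.
        return 100 ** ((m_faint - m_bright) / 5)

    # Naked-eye limit (about magnitude 6) versus the Hubble limit quoted above (about 32):
    print(f"{brightness_ratio(6, 32):.1e}")   # roughly 2.5e10

In other words, the faintest objects Hubble records are tens of billions of times dimmer than anything the unaided eye can pick out.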

At the other end of the scale, a few stars (or planets like Venus, Mars, Mercury, and Jupiter – and, of course, the sun and moon) are so bright they have been assigned negative magnitudes on the modern scale.

For the amateur astronomer, seeing the number plus 0.7 next to Saturn immediately indicates that the planet will be easily visible to the naked eye, whereas one of its moons, Tethys, at plus 10.2, requires a telescope at least 6 inches in diameter.

The brightness scale can be precise, but remember that it tells you only how bright an object appears from Earth. It cannot, by itself, tell you the object's distance or size.

A good example of this is the planet Mars, at magnitude minus 2.0, while its two moons, Phobos at plus 11.3 and Deimos at plus 12.4, require a powerful telescope to be seen at all. Each is tiny relative to Mars, and each orbits close to the planet, so both are difficult to spot in the light cast by Mars.
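The same rule of thumb puts these figures in perspective. The short Python sketch below (again assuming only the approximate magnitudes quoted in the text) shows how lopsided the pairings are:

    # Applying the magnitude rule (a 5-magnitude gap equals a factor of 100 in brightness)
    # to the approximate magnitudes quoted in the text.
    def brightness_ratio(m_bright, m_faint):
        return 100 ** ((m_faint - m_bright) / 5)

    print(f"Saturn vs. Tethys: {brightness_ratio(0.7, 10.2):,.0f} times brighter")   # about 6,300
    print(f"Mars vs. Phobos:   {brightness_ratio(-2.0, 11.3):,.0f} times brighter")  # about 209,000
    print(f"Mars vs. Deimos:   {brightness_ratio(-2.0, 12.4):,.0f} times brighter")  # about 575,000

With Mars outshining its moons by factors of hundreds of thousands, it is easy to see why their faint light is lost in the planet's glare.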
