From the January 2012 issue

What is the baseline for determining the magnitude scale of celestial objects? Why do brighter objects have negative numbers?

Dean Treadway, Knoxville, Tennessee
The star Vega has an apparent magnitude of 0. Credit: NASA/JPL-Caltech/University of Arizona
The first observer to catalog stars by brightness was the Greek astronomer Hipparchus. Around 135 B.C., he compiled a catalog of roughly 850 stars divided into six brightness ranges. He called the brightest 1st magnitude and the faintest 6th magnitude. Observers used this system for more than 1,500 years.

But then came Galileo Galilei. In addition to discovering the phases of Venus, Jupiter’s large moons, and more, he noted that his telescope did not simply magnify — it revealed the invisible. In 1610, he extended the scale with a new term, calling the brightest stars below naked-eye visibility “7th magnitude.”

The telescope, therefore, demanded an expansion of Hipparchus’ magnitude system, but not only on the faint end. Observers noted that 1st-magnitude stars varied greatly in brightness. Also, to assign a magnitude to the brightest planets, the Moon, and especially the Sun, scientists would have to work with negative numbers.

In 1856, English astronomer Norman R. Pogson suggested astronomers calibrate all magnitudes so that a difference of 5 magnitudes equals a brightness ratio of exactly 100. (For example, a 1st-magnitude star is 100 times brighter than a 6th-magnitude one, and each single magnitude step corresponds to a factor of about 2.512, the fifth root of 100.) We still use Pogson’s formula today.
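To see how Pogson’s relation plays out in numbers, here is a minimal sketch in Python; the function name is purely illustrative, not anything from the article.

```python
# Pogson's relation: a difference of 5 magnitudes corresponds to a
# brightness ratio of exactly 100, so each magnitude step is a factor
# of 100 ** (1/5), roughly 2.512.

def brightness_ratio(fainter_mag, brighter_mag):
    """How many times brighter the lower-magnitude object appears."""
    return 100 ** ((fainter_mag - brighter_mag) / 5)

print(brightness_ratio(6, 1))  # 100.0 -- a 1st-magnitude star vs. a 6th-magnitude one
print(brightness_ratio(2, 1))  # ~2.512 -- a single magnitude step
```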

Astronomers routinely use two main kinds of magnitude to describe the same object. “Apparent magnitude” describes how bright an object looks from Earth. Observers once measured apparent magnitudes by eye; now ultrasensitive CCD cameras provide measurements with accuracies of 0.01 magnitude.

With “absolute magnitude,” astronomers indicate how bright an object really is. Two things determine this number, which tracks the object’s true luminosity: apparent magnitude and distance. Absolute magnitude is defined as the brightness an object would have if it were exactly 10 parsecs (32.6 light-years) from Earth. So any object closer than 32.6 light-years appears brighter than its absolute magnitude would suggest; for any object farther away, the absolute magnitude is the brighter of the two. — Michael E. Bakich, Senior Editor, Astronomy magazine
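As a quick illustration of how apparent magnitude, distance, and absolute magnitude tie together, here is a minimal sketch using the standard distance-modulus relation, M = m - 5 log10(d / 10 parsecs). The Vega figures below (apparent magnitude near 0, as in the caption above, and a distance of roughly 25 light-years, or about 7.7 parsecs) are approximate and used only as a check; the function name is illustrative.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Brightness the object would have if placed exactly 10 parsecs away."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Vega sits closer than 10 parsecs, so its apparent magnitude (about 0)
# is brighter -- a smaller number -- than its absolute magnitude.
print(absolute_magnitude(0.0, 7.7))  # ~0.57
```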