There’s a reason the winter sky is so full of sparkling stars. A great many are concentrated around the constellation Orion in the form of a gigantic hexagon. How big is it? I made a fist and reached my gloved hand to the sky last night to measure – 6 fists high by 4 fists wide or 60 x 40 degrees. This six-sided figure of celestial real estate reaches from Sirius, low in the southern sky, all the way up to Capella, located nearly overhead from mid-northern latitudes.
What makes these stars special is how bright they are. They all shine at 1st magnitude (or brighter) and appear on the list of the Top 25 brightest stars. Joining the clan are Jupiter, more luminous than any of them, and Castor, slightly fainter than the faintest.
Astronomers use the magnitude scale to measure star and planet brightness. Each magnitude is about 2.5 times brighter than the next fainter one. Aldebaran, which shines at 1st magnitude, is 2.5 times brighter than a 2nd-magnitude star, which in turn is 2.5 times brighter than a 3rd-magnitude star and so on.
A 1st-magnitude star is 2.5 x 2.5 x 2.5 x 2.5 x 2.5 (about 100) times brighter than a 6th-magnitude star. In fact, the scale is defined so that a difference of exactly five magnitudes equals a brightness ratio of exactly 100.
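If you’d like to check the arithmetic yourself, here’s a quick Python sketch. The function name is just for illustration; the key fact is that one magnitude step equals the fifth root of 100, about 2.512.

```python
# The magnitude scale is defined so that five steps = exactly 100x brightness,
# making one step the fifth root of 100, about 2.512.
STEP = 100 ** (1 / 5)

def brightness_ratio(fainter_mag, brighter_mag):
    """How many times brighter the brighter object appears."""
    return STEP ** (fainter_mag - brighter_mag)

print(brightness_ratio(2.0, 1.0))  # one magnitude apart: ~2.5 times brighter
print(brightness_ratio(6.0, 1.0))  # five magnitudes apart: ~100 times brighter
```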
The bigger the magnitude number, the fainter the star. On the other hand, if an object is really bright, it’s assigned a negative magnitude. Sirius, the brightest star, sparkles at magnitude -1.4, Jupiter at -2.2 (currently) and Venus brighter yet at -4.4. The full moon reaches a magnificent -12.7, topped only by the sun at -26.7.
An object’s brightness has much to do with its distance from Earth. Small things like planets, the moon or even an asteroid can look bright if close, while a brilliant supergiant star can appear faint simply because it’s far away.
To get a better appreciation of an object’s true brightness, astronomers assign it an absolute magnitude, based on how bright it would appear if moved to a distance of 10 parsecs (equal to 32.6 light years) from the sun. With every star placed at the same distance, absolute magnitudes reveal the differences in true brightness.
A parsec is the distance at which an astronomical object shows a parallax angle of one arc second against the background sky – hence the name, a contraction of “parallax second.” Parallax, measured in arc seconds or tiny fractions of a degree, is the apparent shift of a nearby star against the distant background of stars as seen from either end of Earth’s orbit.
One parsec equals 3.26 light years. Click HERE for a blog I wrote explaining parallax. The main thing to remember is that we’re comparing objects at the same distance of 10 parsecs from the sun.
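If you like to tinker, both ideas boil down to one-liners. Here’s a minimal Python sketch, using rounded published values for Sirius (function names are just for illustration): distance in parsecs is simply 1 divided by the parallax in arc seconds, and absolute magnitude is M = m - 5 log10(d / 10 pc).

```python
import math

def parsecs_from_parallax(parallax_arcsec):
    """The parsec is defined so that distance (pc) = 1 / parallax (arcsec)."""
    return 1.0 / parallax_arcsec

def absolute_magnitude(apparent_mag, distance_pc):
    """Brightness rescaled to the standard 10-parsec distance."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# Sirius: parallax about 0.379 arc seconds, apparent magnitude -1.46
d = parsecs_from_parallax(0.379)               # ~2.6 pc, or ~8.6 light years
print(round(absolute_magnitude(-1.46, d), 1))  # 1.4 -- matches the list below
```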
Here are the apparent (what we see with the eye) and absolute magnitudes (in parentheses) of our featured stars:
* Sirius -1.5 (1.4)
* Procyon 0.4 (2.6)
* Pollux 1.1 (0.7)
* Capella 0.1 (0.4)
* Aldebaran 0.9 (-0.3)
* Rigel 0.1 (-8.1)
* Jupiter -2.2 (55)
* Betelgeuse 0.5 (-7.2)
* Castor 1.6 (0.5)
* Our sun -26.7 (4.8)
Right away you’ll see some dramatic differences in intrinsic brightness. Rigel and Betelgeuse, both of which appear more than a magnitude fainter than Sirius to the eye, far outshine all the others. Seen from 10 parsecs, each puts out enough light to cast shadows at night. Why? They’re both extremely luminous supergiant stars. Jupiter, the big shot of the bunch, fades out of sight.
Sirius, only twice as big as the sun, dims to a rather meek mag. 1.4. It’s overtaken by otherwise mild-mannered Castor, a double star with suns 2.4 and 1.9 times larger than our own. How does our sun fare at 10 parsecs? Not so good. At magnitude 4.8, it would blend into the background of faint stars. Unless you looked carefully, you wouldn’t even notice it.
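You can even run the sun through the same formula as a sanity check. The only wrinkle is converting 1 astronomical unit (the Earth-sun distance) into parsecs. A quick sketch with rounded values:

```python
import math

AU_PER_PARSEC = 206_265     # one parsec is defined as 206,265 astronomical units
sun_apparent_mag = -26.74   # about -26.7, as quoted above

d_pc = 1.0 / AU_PER_PARSEC  # Earth's distance from the sun, in parsecs
abs_mag = sun_apparent_mag - 5 * math.log10(d_pc / 10.0)
print(round(abs_mag, 1))    # 4.8 -- a face in the crowd of faint stars
```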
Knowing a star’s absolute magnitude gives us a true picture of its brightness. What’s more, you can derive a star’s distance by comparing its apparent magnitude to its absolute magnitude. Want to have a little fun? Click on the Magnitude and Luminosity Calculator and play around with some of your favorite stars.
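And here’s what that distance trick looks like in code – a sketch that simply inverts the absolute-magnitude formula from earlier:

```python
def distance_parsecs(apparent_mag, absolute_mag):
    """Invert M = m - 5*log10(d/10) to solve for the distance d."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Sirius: m = -1.5, M = 1.4 from the list above
d = distance_parsecs(-1.5, 1.4)
print(round(d, 2), "parsecs, or about", round(d * 3.26, 1), "light years")
```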