Star Magnitude (Brightness) Scale
Feb 05, 2021
A basic stargazing principle is that of STAR MAGNITUDE, or how bright a star appears. Astronomers classify stars by their brightness. The magnitude system we use today starts with Hipparchus, a Greek astronomer of the second century BC. He classified 850 stars on a brightness scale from 1 to 6: the brightest stars were designated '1st magnitude', while the dimmest visible stars were '6th magnitude'. Ptolemy and Galileo continued this practice. Ptolemy added more stars to the original catalog, while Galileo labeled stars visible only with magnification as '7th magnitude', '8th magnitude', and so on. It wasn't until 1856 that the system was standardized. English astronomer Norman Robert Pogson defined a 1st magnitude star as exactly 100 times brighter than a 6th magnitude star, so each magnitude step corresponds to a brightness factor of about 2.512. This logarithmic scale is still used today.
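Pogson's definition can be sketched numerically: five magnitude steps equal a brightness factor of exactly 100, so one step is a factor of 100^(1/5) ≈ 2.512. A minimal Python illustration (the function name here is ours, not a standard astronomy API):

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears
    than one of magnitude m2 (Pogson's scale: 5 magnitudes = 100x)."""
    return 100 ** ((m2 - m1) / 5)

# One magnitude step is a factor of 100**(1/5), about 2.512
print(brightness_ratio(1.0, 2.0))  # ~2.512

# Five steps (1st vs 6th magnitude) give exactly a factor of 100
print(brightness_ratio(1.0, 6.0))  # 100.0
```

Note that the lower magnitude goes first: a smaller magnitude value means a brighter object, so the ratio comes out greater than 1.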
The star magnitude scale has its limits. First, it is an inverse scale: the brighter the star or object, the lower the magnitude value, which can be confusing at first. Atmospheric disturbances can also change a star's apparent brightness. And human eyes are more sensitive to some colors than others, so two stars of equal measured brightness may not look equally bright. Despite these limitations, the scale still helps observers pick out the brightest stars and navigate the night sky.
The magnitude system is helpful when using a star map. Stars of 1st and 2nd magnitude will be the brightest ones in the sky. The dimmest naked-eye stars will be of 5th and 6th magnitude, and dark skies are needed to see them. The naked-eye limit for humans is about 6th magnitude. If an object has a magnitude of 7 or more, magnification will be needed to see it. Objects such as the planets and the moon have negative magnitude values because they are so bright in the night sky. Our sun has a magnitude of about -26! Some stars, such as Sirius, have negative values as well. Remember, the brighter an object, the lower its magnitude value will be.
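To make the negative values concrete, the same Pogson relation can compare the Sun and Sirius. This is a sketch, assuming the commonly cited apparent magnitudes of roughly -26.7 for the Sun and -1.46 for Sirius:

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears
    than one of magnitude m2 (5 magnitudes = a factor of 100)."""
    return 100 ** ((m2 - m1) / 5)

SUN = -26.7     # approximate apparent magnitude of the Sun
SIRIUS = -1.46  # approximate apparent magnitude of Sirius

# Lower magnitude means brighter: the Sun outshines Sirius,
# the brightest nighttime star, by more than ten billion times
print(f"{brightness_ratio(SUN, SIRIUS):.3g}")
```

The enormous ratio shows why the scale has to be logarithmic: a linear brightness scale could not usefully span both the Sun and a 6th-magnitude star on the same chart.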