A means of measuring the brightness of a star or other body: the lower the body's
magnitude value, the brighter it appears in the sky. A star of magnitude +5.0 is dim
and barely visible to the naked eye, while a star with a magnitude close to 0.0 (such as
Capella in Auriga) is very bright indeed.
The system of stellar magnitudes is more than two thousand years old. It was first used
by the Greek astronomer Hipparchus in about 120 BCE, whose informal scheme classified stars from
first magnitude for the brightest down to sixth magnitude for the faintest.
Especially with the advent of the telescope, this kind of informal approach proved inadequate, and in the middle of the
nineteenth century, standards for the calculation of magnitudes were introduced. According to these standards,
which are still in use today, a difference in brightness of five magnitudes is equivalent to a factor of exactly 100. For example,
Alpha Centauri (with a magnitude of very nearly 0.0) is one hundred times brighter than
Marfik in Hercules, one of the faintest named
stars with a magnitude of exactly +5.0. It follows that a difference of one
magnitude corresponds to a brightness ratio equal to the fifth root of 100, a little over 2½ times (about 2.512).
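This relation can be verified with a few lines of Python (the function name here is purely illustrative):

```python
def brightness_ratio(mag_faint, mag_bright):
    """Return how many times brighter the second object appears.
    Based on the relation above: five magnitudes = a factor of exactly 100."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# A five-magnitude difference is a factor of exactly 100:
print(brightness_ratio(5.0, 0.0))               # 100.0

# One magnitude is the fifth root of 100, a little over 2½:
print(round(brightness_ratio(1.0, 0.0), 3))     # 2.512
```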
The magnitude scale is calibrated to the brightnesses of about a hundred specific stars, all
lying within a few degrees of the Northern Celestial Pole and thus termed the
North Polar Sequence. By coincidence, the North Polar
constellation, Ursa Minor, provides a
useful 'key' to stellar magnitudes. The four stars that form
the rectangular 'bowl' of the Little Dipper each represent a different level of brightness.
Kochab, the brightest, has a magnitude of +2.1; Pherkad
is magnitude +3.0, Zeta Ursae Minoris is +4.3, and the faintest,
Eta Ursae Minoris, has a magnitude of +5.0.
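The same five-magnitudes-per-factor-of-100 rule can be applied to these four 'key' stars, using the magnitudes quoted above, to see how their brightnesses compare:

```python
# Magnitudes of the Little Dipper's 'bowl' stars, as quoted above.
bowl = {
    "Kochab": 2.1,
    "Pherkad": 3.0,
    "Zeta Ursae Minoris": 4.3,
    "Eta Ursae Minoris": 5.0,
}

# How many times fainter each star appears than Kochab,
# using the rule that five magnitudes equal a factor of 100:
for name, mag in bowl.items():
    ratio = 100 ** ((mag - bowl["Kochab"]) / 5)
    print(f"{name}: {ratio:.1f}x fainter than Kochab")
```

The faintest of the four, Eta Ursae Minoris, works out at roughly fourteen times fainter than Kochab.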
A magnitude of +6.5 is conventionally regarded as the limit of visibility with the naked eye. The actual limit, of course,
will depend on the observer, but few people can see objects fainter than this value. Since the magnitude scale
works by comparing the relative brightness of objects, though, it can continue below this threshold of visibility. Objects of
magnitude +6.0 are about 2½ times fainter than those of magnitude +5.0, and so on. At the time of writing,
for example, the planet Pluto has a magnitude
of +13.8, far too faint to be seen with the naked eye, but calculable as over 300,000 times fainter than a star of magnitude 0.0.
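That figure for Pluto follows directly from the definition of the scale, and is easily checked:

```python
# Brightness ratio between a magnitude 0.0 star and Pluto at +13.8,
# using the rule that five magnitudes equal a factor of 100:
ratio = 100 ** ((13.8 - 0.0) / 5)
print(f"Pluto appears {ratio:,.0f} times fainter")   # about 330,000 times
```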
A few very bright stars and certain objects within the
Solar System exceed magnitude 0.0, and therefore need
negative numbers to describe their brightness. There are four stars with negative magnitudes:
Sirius (-1.4), Canopus (-0.6),
Arcturus (-0.1) and Alpha Centauri, whose
magnitude of -0.01 falls only just below the zero mark. Within the
Solar System, all of the planets visible to the
naked eye have negative magnitudes for at least some of the time. The brightest objects of all are the
Moon (which can reach -13.0 or brighter) and, of course, the
Sun, whose magnitude is about -26.7.
To a great extent, the brightness of objects as seen in the sky depends on their distance from Earth.
Sirius is the brightest star
in the sky, but it is a dwarf star, and not particularly
luminous in stellar terms - it appears brilliant
because it is less than nine light years away. By comparison, the star
Deneb in Cygnus appears much fainter than
Sirius in the sky, but is actually a
supergiant thousands of times more
luminous than Sirius - it appears fainter because it is
more than three thousand light years away.
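The Sirius-Deneb comparison comes down to the inverse-square law: apparent brightness falls off with the square of distance. A quick sketch with round, purely illustrative figures (the luminosity ratio and Deneb's distance below are assumptions for the example, not measured values) shows how a far more luminous star can still look fainter:

```python
# Illustrative round numbers only - not measured values:
sirius_distance_ly = 8.6       # Sirius: under nine light years away
deneb_distance_ly = 3000.0     # Deneb: roughly three thousand light years (assumed)
luminosity_ratio = 50000.0     # assume Deneb ~50,000 times more luminous (assumed)

# Apparent brightness of Deneb relative to Sirius, by the inverse-square law:
apparent_ratio = luminosity_ratio * (sirius_distance_ly / deneb_distance_ly) ** 2
print(f"Deneb appears {apparent_ratio:.2f} times as bright as Sirius")
```

Even with a luminosity advantage of tens of thousands, the ratio comes out below 1: the enormous difference in distance more than cancels it.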
For this reason, the brightness of objects as they appear in the sky is properly referred to as their
apparent magnitude. A separate scale exists -
absolute magnitude - to describe the intrinsic brightness of an
object, irrespective of the location of its observer. This concept is discussed in its
own entry on this site.