# Math

posted by **Anonymous**.

How bright a star appears can depend on how much light the star actually emits and how far away it is. The stellar magnitude scale can be adjusted to account for distance as follows:

M2 - M1 = 2.5 log (b1 / b2)

Here, M refers to a star's absolute magnitude, that is, how bright it would appear from a standard distance of 10 parsecs (about 32.6 light-years). On this scale, a lower magnitude means a brighter star. The absolute magnitude of Sirius is 1.4 and the absolute magnitude of Betelgeuse is -8.1.

Which of these two stars is brighter, in absolute terms, and by how much?
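As a sketch of the arithmetic, the standard form of the magnitude relation, M2 - M1 = 2.5 log(b1 / b2), can be rearranged to give the brightness ratio directly from the two magnitudes quoted in the problem (the variable names below are just for illustration):

```python
import math

# Absolute magnitudes given in the problem statement
M_sirius = 1.4
M_betelgeuse = -8.1

# A lower magnitude means a brighter star, so Betelgeuse is brighter.
# Rearranging M2 - M1 = 2.5 * log10(b1 / b2) for the brightness ratio:
delta_M = M_sirius - M_betelgeuse       # difference in magnitudes
ratio = 10 ** (delta_M / 2.5)           # b_betelgeuse / b_sirius

print(f"Betelgeuse is brighter by {delta_M:.1f} magnitudes")
print(f"Brightness ratio: about {ratio:.0f} to 1")
```

With these numbers the difference is 9.5 magnitudes, which corresponds to a brightness ratio of roughly 6300 to 1 in Betelgeuse's favor.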