Logarithms

The brightness of an astronomical object is called its magnitude. A decrease of five magnitudes increases the brightness exactly 100 times. The sun is magnitude -26.7, and the full moon is magnitude -12.5. The sun is about how many times brighter than the moon?

To determine how many times brighter the sun is than the moon, we first find the difference in magnitudes between the two objects and then convert that difference into a brightness ratio (a scale factor).

Given that a decrease of five magnitudes increases the brightness by 100 times, we can calculate the difference in magnitudes between the sun and the moon as follows:

Magnitude difference = Sun's magnitude - Moon's magnitude
Magnitude difference = -26.7 - (-12.5)
Magnitude difference = -26.7 + 12.5
Magnitude difference = -14.2
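
As a quick check, the same subtraction can be done in Python (a minimal sketch; the names m_sun and m_moon are just illustrative):

    m_sun = -26.7   # apparent magnitude of the sun
    m_moon = -12.5  # apparent magnitude of the full moon

    # Magnitude difference: sun minus moon
    magnitude_difference = m_sun - m_moon
    print(round(magnitude_difference, 1))  # -14.2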

The negative sign means the sun's magnitude is 14.2 magnitudes lower than the moon's, and lower magnitudes correspond to greater brightness. Since each decrease of five magnitudes multiplies the brightness by 100, we convert the magnitude difference to a scale factor by counting how many five-magnitude steps it contains and raising 100 to that power:

Scale factor = 100 ^ (-Magnitude difference / 5)

Let's calculate the scale factor:

Scale factor = 100 ^ (14.2 / 5)
Scale factor = 100 ^ 2.84
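
The same conversion can be written directly in Python (a sketch that simply encodes the five-magnitudes-per-factor-of-100 rule stated above):

    magnitude_difference = -14.2  # sun minus moon, from the step above

    # The sun is 14.2 / 5 = 2.84 five-magnitude steps brighter,
    # so the brightness ratio is 100 raised to that number of steps.
    scale_factor = 100 ** (-magnitude_difference / 5)
    print(scale_factor)  # roughly 4.8e5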

To evaluate this expression, we use logarithms. The logarithm (base 10) of a number x is the exponent to which 10 must be raised to equal x. Taking the base-10 logarithm of both sides, and using the fact that log(100) = 2, gives:

log(Scale factor) = 2.84 × log(100) = 2.84 × 2 = 5.68

To find the value of the scale factor, we take the antilogarithm of both sides, that is, we raise 10 to that power:

Scale factor = 10 ^ 5.68
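
This identity is easy to verify numerically with Python's math module (a minimal sketch using only the standard library):

    import math

    scale_factor = 100 ** 2.84

    # log10(100 ** 2.84) = 2.84 * log10(100) = 2.84 * 2 = 5.68
    print(math.log10(scale_factor))  # ≈ 5.68

    # The antilogarithm recovers the same value.
    print(10 ** 5.68)                # ≈ 478,630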

Calculating the value of the scale factor:

Scale factor ≈ 478,630

Therefore, the sun is approximately 480,000 times brighter than the full moon.
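
The whole calculation generalizes to any pair of magnitudes. A small helper function, written here as an illustrative sketch (the name magnitude_ratio is not from any particular library), makes the relationship explicit:

    def magnitude_ratio(m_brighter, m_fainter):
        """Return how many times brighter the first object is than the second."""
        # Each decrease of five magnitudes corresponds to a factor of 100 in brightness.
        return 100 ** ((m_fainter - m_brighter) / 5)

    # Sun (-26.7) versus full moon (-12.5)
    print(magnitude_ratio(-26.7, -12.5))  # ≈ 480,000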