If one star is at a distance of 100 light years, how far away would a second star that is 9 times as luminous have to be to appear at the same brightness as the first?

You need to make the second star appear 9 times dimmer than it would at 100 l.y. distance. Since the received light is inversely proportional to the distance squared, that means you must triple the distance. That will put it 300 light years away.
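This reasoning is easy to verify numerically. The sketch below compares relative fluxes using luminosity divided by distance squared (dropping the common 4π factor, which cancels); the unit luminosity is an arbitrary assumption for illustration.

```python
# Quick check: received flux scales as L / d**2 (common 4*pi factor omitted).
L1 = 1.0           # luminosity of the first star (arbitrary units, assumed)
L2 = 9.0 * L1      # the second star is 9 times as luminous
d1 = 100.0         # distance of the first star, in light years

flux1 = L1 / d1**2
flux2 = L2 / (3.0 * d1)**2   # tripling the distance divides the flux by 9

print(flux1, flux2)   # both come out equal
```

Tripling the distance multiplies d² by 9, which exactly offsets the factor of 9 in luminosity.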

To solve this problem, we can use the inverse square law of brightness. The inverse square law states that the brightness of an object decreases as the square of the distance from the observer increases.

First, let's denote the distance of the second star as "d". According to the question, the second star is 9 times as luminous as the first star. Since the brightness decreases with distance, we need to find the distance at which the second star's brightness equals the first star's brightness.

The brightness is proportional to the luminosity and inversely proportional to the square of the distance. Let L be the luminosity of the first star, so the second star has luminosity 9L. Setting the two apparent brightnesses equal:

(Brightness of first star) = (Brightness of second star)
L/100^2 = 9L/d^2

The luminosity L cancels from both sides:

1/100^2 = 9/d^2

Cross-multiplying to solve for "d":

d^2 = 9 × 100^2
d^2 = 90000

Finally, we take the square root of both sides:

d = √90000
d = 300

Therefore, the second star would have to be located 300 light years away from us to appear at the same brightness as the first star.
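The same algebra can be expressed compactly: from 1/d1² = L_ratio/d2², the distance follows as d2 = d1·√(L_ratio). A minimal sketch (variable names are illustrative):

```python
import math

L_ratio = 9.0   # second star's luminosity relative to the first
d1 = 100.0      # distance of the first star, in light years

# From 1/d1**2 = L_ratio/d2**2  =>  d2 = d1 * sqrt(L_ratio)
d2 = d1 * math.sqrt(L_ratio)
print(d2)   # 300.0
```

This generalizes: a star k times as luminous appears equally bright at √k times the distance.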