If a galaxy is traveling away from us at 2.0% of the speed of light, roughly how far away is it?

d = ___ light years

I got 2.4x10^8 light years but it's wrong. Please help me out... thank you!

v=0.02•3•10⁸=6•10⁶ m/s

1 MegaParsec = 3.26•10⁶ ly
Hubble’s Law
v=Hd
Hubble constant
H ≈ 70–80 km/(s•Mpc)
=> v = 70d or v = 80d  (v in km/s, d in Mpc)
d=(6•10⁶/80 000)• 3.26•10⁶ =2.45•10⁸ ly
or
d=(6•10⁶/70 000)• 3.26•10⁶ =2.79•10⁸ ly
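The two calculations above can be checked with a short script (variable names are mine; it uses the same values as the working: c = 3·10⁵ km/s and 3.26·10⁶ ly per Mpc):

```python
# Hubble's law distance estimate: d = v / H0
C_KM_S = 3.0e5        # speed of light, km/s
LY_PER_MPC = 3.26e6   # light years per megaparsec

v = 0.02 * C_KM_S     # recession velocity: 6,000 km/s

for H0 in (70, 80):   # Hubble constant, km/(s*Mpc)
    d_mpc = v / H0            # distance in Mpc
    d_ly = d_mpc * LY_PER_MPC # distance in light years
    print(f"H0 = {H0}: d = {d_mpc:.1f} Mpc = {d_ly:.2e} ly")
```

This reproduces both answers: about 2.79·10⁸ ly for H0 = 70 and about 2.45·10⁸ ly for H0 = 80, so which one is "correct" depends on the value of H0 the course expects.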

To determine the distance to a galaxy that is moving away from us, we can use Hubble's law, which states that a galaxy's recessional velocity is directly proportional to its distance from us. Mathematically, it can be written as:

v = H0 x D

Where:
- v is the velocity of the galaxy (in this case, 2.0% of the speed of light)
- H0 is Hubble's constant (the current value is still subject to debate, but a commonly used approximate value is 70 km/s/Mpc)
- D is the distance to the galaxy

To convert the velocity from a percentage of the speed of light to km/s, we can use the equation:

v = (velocity percentage x speed of light) / 100

v = (2.0 x 300,000 km/s) / 100 = 6,000 km/s

So now we can rewrite Hubble's law equation as:

6,000 km/s = H0 x D

Rearranging the equation to solve for D:

D = (6,000 km/s) / H0

Using the approximate value of H0 as 70 km/s/Mpc:

D = (6,000 km/s) / 70 km/s/Mpc = 85.71 Mpc

Now, to convert Mpc to light years, we can use the conversion factor:

1 Mpc = 3.09 x 10^19 km = 3.26 million light years

Therefore:

85.71 Mpc x 3.26 million light years/Mpc ≈ 279 million light years

So, using H0 = 70 km/s/Mpc, the distance to a galaxy receding at 2.0% of the speed of light is roughly 280 million light years. The larger value H0 = 80 gives about 245 million light years instead, which is close to the answer the asker originally tried.
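As a sanity check on the conversion factor used in the last step, the 3.26 million ly per Mpc figure can be recovered from the km values (a sketch; the 9.461·10¹² km per light year figure is a standard value not quoted above):

```python
KM_PER_MPC = 3.09e19   # km in one megaparsec (from above)
KM_PER_LY = 9.461e12   # km in one light year (standard value)

ly_per_mpc = KM_PER_MPC / KM_PER_LY  # ~3.27e6 ly/Mpc
d_ly = 85.71 * ly_per_mpc            # distance from the worked answer
print(f"1 Mpc ~ {ly_per_mpc:.3e} ly; d ~ {d_ly:.3e} ly")
```

Both routes land within rounding of the same answer, around 2.8·10⁸ light years.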