You are watching an archery tournament when you start wondering how fast an arrow is shot from the bow. Remembering your physics, you ask one of the archers to shoot an arrow parallel to the ground. Unfortunately the archer stands on an elevated platform of unknown height. However, you find the arrow stuck in the ground 62.0 m away, making a 2.00 degree angle with the ground.

How fast was the arrow shot?

≈ 132 m/s

To determine the speed at which the arrow was shot, we can use the principles of projectile motion.

First, let's break down the information given:
- The arrow was shot parallel to the ground, so it had no vertical component of velocity initially; its horizontal component stays equal to the launch speed for the whole flight.
- The arrow stuck into the ground 62.0 m away, making a 2.00 degree angle with the ground, so its velocity at impact pointed 2.00 degrees below the horizontal.

The arrow sticks into the ground along its direction of motion, so the 2.00 degree angle gives the ratio of the vertical to the horizontal velocity components at impact: tan(2.00 degrees) = v_y / v0, where v0 is the constant horizontal component, i.e. the launch speed we are after. The angle of 2.00 degrees is small, so we can approximate its tangent by the angle itself in radians.

tan(2.00 degrees) ≈ 2.00 degrees * (π/180 degrees)
≈ 0.0349

Therefore, at impact the vertical component of the velocity is 0.0349 times the horizontal component: v_y = 0.0349 * v0.
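
If you want to check that small-angle approximation numerically, a couple of lines of Python will do (a minimal sketch, not part of the solution itself):

```python
import math

angle_rad = math.radians(2.00)  # 2.00 degrees converted to radians

print(math.tan(angle_rad))  # exact tangent: ~0.03492
print(angle_rad)            # small-angle approximation tan(theta) ~ theta: ~0.03491
```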

Now we can connect that ratio to the horizontal distance using the kinematics of the flight. (Note that the level-ground range equation, Range = v^2 * sin(2θ) / g, does not apply here: the arrow was launched horizontally from an elevated platform, and the 2.00 degree angle is the impact angle, not a launch angle.)

For a horizontal launch:

Horizontal motion: x = v0 * t
Vertical motion: v_y = g * t

Where:
- x is the horizontal distance traveled (62.0 m).
- v0 is the launch speed, which is also the constant horizontal component of the velocity.
- v_y is the vertical speed at impact.
- t is the time of flight.
- g is the acceleration due to gravity (approximately 9.8 m/s^2).

Eliminating t (t = x / v0) and using the impact angle:

tan(2.00 degrees) = v_y / v0 = g * t / v0 = g * x / v0^2

Rearranging, we can solve for the launch speed:

v0^2 = g * x / tan(2.00 degrees)

v0 = sqrt(g * x / tan(2.00 degrees))

Plugging in the values:

v0 = sqrt((9.8 m/s^2) * (62.0 m) / 0.0349) ≈ sqrt(1.74 * 10^4 m^2/s^2)

Calculating this expression, we find that the launch speed of the arrow is approximately 132 m/s.

So, the arrow was shot at a speed of about 132 m/s.
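
To double-check the arithmetic, here is a short Python sketch of the whole calculation (the variable names are my own, introduced just for this check):

```python
import math

g = 9.8        # acceleration due to gravity, m/s^2
x = 62.0       # horizontal distance to where the arrow landed, m
theta = math.radians(2.00)  # impact angle below the horizontal, rad

# tan(theta) = v_y / v_x = (g * t) / v0, and t = x / v0,
# so tan(theta) = g * x / v0**2  ->  v0 = sqrt(g * x / tan(theta))
v0 = math.sqrt(g * x / math.tan(theta))
print(f"launch speed: {v0:.0f} m/s")   # ~132 m/s

# As a bonus, the time of flight and the unknown platform height follow directly:
t = x / v0            # ~0.47 s
h = 0.5 * g * t**2    # ~1.1 m
print(f"time of flight: {t:.2f} s, platform height: {h:.2f} m")
```

Running this reproduces the 132 m/s result and, as a side benefit, estimates the height of the platform the archer was standing on at roughly a meter.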