A baseball player throws a ball at an angle N of E from 3rd base to 1st base. If it takes 1.2 seconds to get there, at what velocity did he throw it? (Bases on a baseball field are 90 feet apart.)


v = distance/time = 90√2 / 1.2 ft/s

To find the velocity at which the baseball player threw the ball, we can use the formula:

Velocity (v) = Displacement (d) / Time (t)

In this case, we need the displacement of the ball. The ball is thrown from 3rd base to 1st base, and those two bases are opposite corners of the 90-foot square formed by the bases. The throw therefore covers the diagonal, not a single 90-foot side:

Displacement (d) = 90√2 ≈ 127.28 feet ≈ 38.79 meters

The flight time is given directly as t = 1.2 seconds, so substituting into the formula:

Velocity (v) = 38.79 meters / 1.2 seconds

Calculating this expression, we get:

Velocity (v) ≈ 32.3 meters/second

Therefore, the baseball player threw the ball at approximately 32.3 meters/second (about 106 ft/s).
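The arithmetic above can be double-checked with a short Python snippet (the variable names here are my own, not from the problem):

```python
import math

# Bases form a square 90 ft on a side; 3rd base to 1st base is the diagonal.
side_ft = 90.0
distance_ft = side_ft * math.sqrt(2)   # diagonal distance in feet
distance_m = distance_ft * 0.3048      # convert feet to meters (1 ft = 0.3048 m)

t = 1.2                                # flight time in seconds

v_ms = distance_m / t                  # average speed in m/s
v_fts = distance_ft / t                # average speed in ft/s

print(f"{v_ms:.1f} m/s  ({v_fts:.1f} ft/s)")
```

Note this is the *average* speed along the straight line between the bases; it ignores the arc of the throw and any vertical component.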