A baseball player throws a ball at an angle north of east from 3rd base to 1st base. If it takes 1.2 seconds to get there, at what velocity did he throw it? (Bases on a baseball field are 90 feet apart.)

To find the velocity at which the baseball player threw the ball, we can use the formula for horizontal motion:

velocity = displacement / time

In this case, the displacement is the straight-line distance from 3rd base to 1st base. The bases form a square with 90-foot sides, and 1st and 3rd base sit on opposite corners of that square, so the throw covers the diagonal: 90 × √2 ≈ 127.28 feet. Converting to meters (1 meter is approximately 3.28 feet) gives the answer in SI units:

displacement = 127.28 feet * (1 meter / 3.28 feet) ≈ 38.80 meters

Next, we can substitute the given values into the formula:

velocity = 38.80 meters / 1.2 seconds

Evaluating the expression:

velocity ≈ 32.3 meters/second

Therefore, the player threw the ball at approximately 32.3 meters/second (about 106 feet per second). Note that this is the average horizontal speed; the small vertical component a real throw needs to stay airborne for 1.2 seconds is ignored here.
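
If you want to check the arithmetic, here is a minimal Python sketch of the same calculation. The variable names are just illustrative, and it assumes the ball is released and caught at the same height, so the horizontal-speed treatment above applies:

import math

FEET_PER_METER = 3.28   # same rounded conversion used above

base_side_ft = 90.0     # distance between adjacent bases
time_s = 1.2            # flight time from 3rd base to 1st base

# 1st and 3rd base are opposite corners of the 90 ft square,
# so the throw covers the diagonal of the infield.
diagonal_ft = base_side_ft * math.sqrt(2)   # ~127.28 ft
diagonal_m = diagonal_ft / FEET_PER_METER   # ~38.80 m

speed = diagonal_m / time_s                 # average horizontal speed
print(f"distance: {diagonal_m:.2f} m, speed: {speed:.1f} m/s")
# prints: distance: 38.80 m, speed: 32.3 m/s

Running it reproduces the numbers worked out above, which makes it easy to see how the answer changes if you plug in a different flight time or base spacing.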