Horizontal velocity: 138 ft/s

Vertical velocity: 109 ft/s
A ball is thrown at 95 mi/h. The distance from the mound to the plate is 60.5 ft. Assuming the ball is pitched parallel to the ground, how far does it fall due to gravity by the time it reaches the plate?
Show calculations.

To find out how far the ball falls due to gravity by the time it reaches the plate, we can use the kinematic equations of motion.

Let's first convert the initial velocity from miles per hour (mi/h) to feet per second (ft/s):

95 mi/h * 5280 ft/mi * 1 hr/3600 s = 139.33 ft/s
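
As a quick check, that conversion can be scripted; the only inputs are the 95 mi/h pitch speed from the problem and the standard mile/hour conversion factors (a minimal Python sketch):

```python
# Convert the pitch speed from miles per hour to feet per second.
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

pitch_speed_mph = 95.0
pitch_speed_ftps = pitch_speed_mph * FT_PER_MILE / SEC_PER_HOUR

print(f"{pitch_speed_mph:.0f} mi/h = {pitch_speed_ftps:.2f} ft/s")  # ~139.33 ft/s
```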

Given that the stated vertical velocity is 109 ft/s, we can find the time it takes for the ball to reach the plate using the equation:

vertical velocity = acceleration * time
109 ft/s = 32.2 ft/s^2 * time

Solving for time, we get:
time = (109 ft/s) / (32.2 ft/s^2) ≈ 3.39 s

Here the downward direction is taken as positive for both the vertical velocity and the acceleration due to gravity, so the time comes out positive.

Next, we can use the horizontal velocity (138 ft/s) to calculate the horizontal distance the ball covers in this time:

horizontal distance = horizontal velocity * time
horizontal distance = 138 ft/s * 3.39 s ≈ 468 ft

Therefore, by the time the ball reaches the plate, it falls approximately 468 feet due to gravity.

To find how far the ball falls due to gravity by the time it reaches the plate, we can use the vertical velocity and the time it takes for the ball to reach the plate.

The vertical velocity is stated as 109 ft/s and the horizontal velocity as 138 ft/s, both already in feet per second, so no unit conversion is needed.

Next, we need to determine the time it takes for the ball to reach the plate. To find this, we can use the equation:

distance = velocity * time

Since the distance from the mound to the plate is given as 60.5 ft and the horizontal velocity is given as 138 ft/s, we can rearrange the equation to solve for time:

time = distance / velocity

time = 60.5 ft / 138 ft/s = 0.4384 s
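
A minimal sketch of that time-of-flight step, using the 60.5 ft mound-to-plate distance from the problem and the 138 ft/s horizontal speed quoted in this answer (with the converted 139.3 ft/s the time comes out slightly shorter):

```python
# Time of flight = horizontal distance / constant horizontal speed.
mound_to_plate_ft = 60.5
horizontal_speed_ftps = 138.0  # value quoted above; ~139.3 ft/s if converted from 95 mi/h

flight_time_s = mound_to_plate_ft / horizontal_speed_ftps
print(f"time to reach the plate ~ {flight_time_s:.4f} s")  # ~0.4384 s
```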

Finally, we can calculate the distance the ball falls due to gravity by multiplying the vertical velocity by the time:

distance fallen = vertical velocity * time

distance fallen = 109 ft/s * 0.4384 s ≈ 47.8 ft

Therefore, by the time the ball reaches the plate, it would have fallen approximately 47.8 feet due to gravity.

Your vertical and horizontal velocity components are inconsistent with the 95 mph velocity. Where did you get them? If the ball is thrown horizontally, the vertical component is initially zero.

95 mph = 139.3 ft/s is the constant horizontal velocity component. The vertical velocity component starts at zero at t = 0 and equals g*t after that.

time to reach home plate = 60.5/139.3 ≈ 0.434 s

Calculate how far the baseball falls in that time. You should know the equation for that.
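
Assuming the standard constant-acceleration drop formula d = (1/2) g t^2 with g ≈ 32.2 ft/s^2 (the equation the reply is pointing at), a minimal Python sketch of the whole calculation:

```python
# Horizontal: constant speed fixes the flight time.
# Vertical: the ball starts with zero vertical velocity and falls d = 0.5 * g * t^2.
G_FTPS2 = 32.2              # acceleration due to gravity, ft/s^2 (approximate)
MOUND_TO_PLATE_FT = 60.5    # mound-to-plate distance given in the problem

pitch_speed_ftps = 95 * 5280 / 3600                     # 95 mi/h ~ 139.33 ft/s
flight_time_s = MOUND_TO_PLATE_FT / pitch_speed_ftps    # ~0.434 s
drop_ft = 0.5 * G_FTPS2 * flight_time_s ** 2            # ~3.0 ft

print(f"flight time ~ {flight_time_s:.3f} s, drop ~ {drop_ft:.2f} ft")
```

Under these assumptions the ball drops roughly 3 ft on its way to the plate.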