A tennis player hits the ball horizontally with a speed of 30 m/s at 1.5 m above the ground.

How far away will it land if the ball is instead hit at a speed of 35 m/s?

To find how far away the ball will land, we use the horizontal motion equation for projectiles.

The equation we will use is:

d = v * t

Where:
d = horizontal distance traveled by the ball
v = horizontal velocity of the ball
t = time of flight

First, let's calculate the time of flight.

Since the ball is hit horizontally, it has no initial vertical velocity, so its vertical motion is simply free fall under gravity. The time of flight can therefore be found from the vertical motion equation:

h = (1 / 2) * g * t^2

Where:
h = initial vertical height
g = acceleration due to gravity

In this case, the initial vertical height is given as 1.5m, and the acceleration due to gravity, g, is approximately 9.8 m/s^2.

Rearranging the equation to solve for t:

t = sqrt(2h / g)

t = sqrt(2 * 1.5 / 9.8)

t ≈ sqrt(0.3061)

t ≈ 0.5533 seconds
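
If you want to check this step numerically, here is a minimal Python sketch of the free-fall time calculation (the variable names are just illustrative):

import math

h = 1.5   # initial height above the ground, in m
g = 9.8   # acceleration due to gravity, in m/s^2

# time to fall a height h starting with zero vertical velocity: t = sqrt(2h / g)
t = math.sqrt(2 * h / g)
print(t)  # prints roughly 0.5533 seconds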

Now that we have the time of flight, we can calculate the horizontal distance traveled using the equation:

d = v * t

Given that v = 35 m/s and t = 0.5533 seconds:

d = 35 * 0.5533

d ≈ 19.4 m

Therefore, the ball will land approximately 19.4 meters away when hit with a speed of 35 m/s.
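
Putting both steps together, here is a minimal Python sketch of the whole calculation (the function name is just illustrative):

import math

def landing_distance(v, h, g=9.8):
    # Horizontal range of a ball hit horizontally at speed v from height h:
    # first get the fall time from h = (1/2) * g * t^2, then d = v * t.
    t = math.sqrt(2 * h / g)
    return v * t

print(landing_distance(35, 1.5))  # prints roughly 19.36, i.e. about 19.4 m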

See:

http://www.jiskha.com/display.cgi?id=1372082354