A tennis player hits the ball horizontally with a speed of 30 m/s at 1.5 m above the ground.

How far away will it land if the ball is hit at a speed of 35 m/s?

So is it 30 m/s or 35 m/s?

Let the initial horizontal velocity be u m/s; neglecting air resistance, it remains constant while the ball is airborne.

The time to fall S = 1.5 m vertically with g = 9.81 m/s^2 is given by
S = (1/2)gt^2
or
t = √(2S/g) = √(2*1.5/9.81) = 0.553 s
Horizontal distance covered before touching the ground (assuming a perfectly level tennis court, which is never quite the case because of drainage)
= 0.553 s * horizontal velocity,
which gives about 16.6 m at 30 m/s and about 19.4 m at 35 m/s.
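
A quick way to check the arithmetic for both quoted speeds is a short Python sketch (the variable names are my own; it assumes g = 9.81 m/s^2, level ground, and no air resistance):

```python
import math

g = 9.81   # gravitational acceleration, m/s^2
h = 1.5    # height of the ball above the ground when it is hit, m

# Time to fall h metres starting from zero vertical velocity:
# h = (1/2) * g * t^2  =>  t = sqrt(2h / g)
t = math.sqrt(2 * h / g)

for u in (30.0, 35.0):       # both speeds quoted in the question, m/s
    distance = u * t         # horizontal velocity stays constant without air resistance
    print(f"u = {u:.0f} m/s: t = {t:.3f} s, range = {distance:.1f} m")
```

This prints a fall time of 0.553 s and ranges of about 16.6 m and 19.4 m, matching the hand calculation above.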

To determine how far the ball will land when hit at a speed of 35 m/s, we need to use the physics of projectile motion.

Projectile motion is the motion of an object that is launched into the air and moves along a curved path under the influence of gravity. In this case, the tennis ball is hit horizontally, which means that its initial vertical velocity is zero. Neglecting air resistance, the only force acting on the ball is gravity, which causes it to accelerate downward at 9.81 m/s^2.

The equation that relates the distance traveled by a projectile, the initial horizontal velocity, and the time of flight is:

distance = initial horizontal velocity * time

Since the ball was hit horizontally, the initial horizontal velocity is the same as the speed at which it was hit.

To find the time of flight, we need to consider the vertical motion of the ball. Because the ball is hit horizontally, its initial vertical velocity is zero, and it simply falls the 1.5 m from launch height to the ground. The drop height h, the time t, and the acceleration due to gravity g are related by:

h = (1/2) * g * t^2

Solving for t:

t = √(2h/g) = √(2 * 1.5 / 9.81) ≈ 0.553 s

Therefore, the distance at which the ball will land is:

distance = initial horizontal velocity * time

distance = 35 m/s * 0.553 s

distance ≈ 19.4 meters

Hence, if the ball is hit horizontally with a speed of 35 m/s from a height of 1.5 m, it will land about 19.4 m away, assuming a flat surface and neglecting air resistance or other factors.
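
As a sanity check on that number, here is a minimal Python sketch (the function name horizontal_range and its parameters are my own, assuming level ground and no air resistance):

```python
import math

def horizontal_range(speed, height, g=9.81):
    """Horizontal distance covered by a projectile launched horizontally.

    Assumes level ground and no air resistance; 'speed' is the horizontal
    launch speed in m/s and 'height' is the launch height in m.
    """
    t = math.sqrt(2 * height / g)   # time to fall 'height' metres from rest
    return speed * t

print(round(horizontal_range(35.0, 1.5), 1))   # 19.4 (metres)
```

Calling horizontal_range(30.0, 1.5) reproduces the ≈ 16.6 m figure from the first answer, so the two answers agree.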