I have a golf ball that is hit for 0.20 s at an angle to the horizontal, and it hits the ground some distance away. The initial speed is 30 m/s and the angle of projection is 35°. Calculate the time between when the ball was hit and when it first touched down.

I think it should be (30 m/s)(cos 35°)/9.8.

If it's not correct, please let me know what formula to use.

Can someone please just check this problem and let me know if it's wrong, and what the correct formula is?

Thank you

Well, it is correct.

Your formula is incorrect. Dividing the horizontal velocity component by g does not get you the time of flight. Note that, with your formula, the time of flight would be zero if the ball were hit straight up.

The time of flight, T, is the time it takes the ball to go up and then come back down. Those two times are equal.

T = 2 V sin(35°)/g

drwls is correct.

So I should say

2 × (20 sin 35°)/9.8 = 2.34

I'm really confused because one person said it's correct and you said what I did is wrong. Please help.

Reread the explanation above.

The time of flight is 2*(30 sin 35°)/9.8 = 3.51 seconds. (Note that you plugged in 20 m/s above; the given initial speed is 30 m/s, which is why you got 2.34 instead.)

I do not understand why the question began with a statement that the ball is hit for 0.2 s. You do not need that bit of information to answer the question about the time of flight. You could use it and the ball's momentum leaving the club face to calculate the average force of the club on the ball.
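
For instance, here is a quick numerical check of both points (a minimal Python sketch; the 0.046 kg golf-ball mass is an assumption of mine, since the problem does not give one):

    import math

    v0 = 30.0                  # initial speed, m/s
    theta = math.radians(35)   # launch angle
    g = 9.8                    # gravitational acceleration, m/s^2

    # Time of flight: the trip up and the trip down take equal times
    T = 2 * v0 * math.sin(theta) / g
    print(T)                   # ~3.51 s

    # The 0.20 s contact time is only needed for the average club force:
    # impulse = change in momentum, so F_avg = m*v0/dt (ball starts at rest)
    m = 0.046                  # kg, typical golf ball mass (assumed, not given)
    dt = 0.20                  # s, contact time
    print(m * v0 / dt)         # ~6.9 N average force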

Thank you, I appreciate it-I understand it now

To calculate the time between when the golf ball was hit and when it first touches down, we can use the equations of projectile motion. We'll need to consider the horizontal and vertical components separately.

First, let's find the time of flight, which is the total time the ball is in the air. The horizontal component of the initial velocity remains constant throughout the motion:

Horizontal velocity = (30 m/s) * cos(35°)

However, since the horizontal distance traveled is not given in the problem, the horizontal motion cannot tell us the time of flight. The time of flight is determined entirely by the vertical motion.

Next, using the vertical motion equation:

Vertical distance = (initial vertical velocity) * (time) + (0.5) * (-9.8 m/s^2) * (time)^2

Since the ball lands at the same height from which it was hit, the vertical displacement at touchdown is 0. Therefore:

0 = (initial vertical velocity) * (time) + (0.5) * (-9.8 m/s^2) * (time)^2

Substituting the initial vertical velocity, (30 m/s) * sin(35°), this becomes:

(0.5) * (-9.8 m/s^2) * (time)^2 + (30 m/s) * sin(35°) * (time) = 0

Now we can solve this quadratic equation for time. Factoring out the common factor of time gives two roots: time = 0, which is the moment of launch, and a nonzero root, which is the moment of touchdown. We want the nonzero root.
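
As an illustration, the quadratic can also be solved numerically (a minimal sketch using numpy; variable names are my own):

    import numpy as np

    v0, theta, g = 30.0, np.radians(35), 9.8

    # Coefficients of -0.5*g*t^2 + v0*sin(theta)*t + 0, in descending powers of t
    roots = np.roots([-0.5 * g, v0 * np.sin(theta), 0.0])
    print(roots)   # one root is 0 (launch); the other, ~3.51 s, is touchdown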

Once we have the nonzero root, we have the answer: the time between when the ball was hit and when it first touches down is the full time of flight, not half of it. The ball reaches its peak height halfway through the flight, but it still has to come back down, and both halves count.

So the formula to calculate the time between when the ball was hit and when it first touched down is:

Time between hit and touchdown = Time of flight = 2 * (initial speed) * sin(angle) / g

Plugging in the values gives 2 * (30 m/s) * sin(35°) / (9.8 m/s^2) ≈ 3.51 seconds, in agreement with the answer given earlier in the thread.
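
A one-line check of that closed-form result (Python, with g = 9.8 m/s^2 as above):

    import math
    print(2 * 30 * math.sin(math.radians(35)) / 9.8)   # ~3.51 s, the nonzero root found earlier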