Suppose that the gravitational acceleration on a certain planet is only 3.1 m/s^2. A space explorer standing on this planet throws a ball straight upward with an initial velocity of 35 m/s.

(a) What is the velocity of the ball 4 seconds after it is thrown?
m/s

(b) How much time elapses before the ball reaches the high point in its flight?
s

I need help.

(a)

v = 35 - 3.1t   (v in m/s, t in s)
Solve for v when t = 4.

(b)
Solve for t when v = 0.
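
For a quick numeric check of both steps, here is a short Python sketch that just plugs the problem's numbers (u = 35 m/s, a = -3.1 m/s^2) into the hint's formula:

```python
u = 35.0      # initial velocity, m/s
a = -3.1      # gravitational acceleration on this planet, m/s^2 (acts downward)

v_after_4s = u + a * 4.0          # (a) v = 35 - 3.1*t evaluated at t = 4 s
t_apex = -u / a                   # (b) time at which v = 0

print(round(v_after_4s, 2))       # 22.6
print(round(t_apex, 2))           # 11.29
```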

(a) On this planet, with a gravitational acceleration of only 3.1 m/s^2, the ball decelerates more slowly than it would on Earth. Let's calculate the velocity after 4 seconds. We know the initial velocity is 35 m/s and the acceleration due to gravity is -3.1 m/s^2 (negative because it acts against the upward motion). We can use the equation v = u + at, where v is the final velocity, u is the initial velocity, a is the acceleration, and t is the time. Plugging in the values, we get:

v = 35 m/s + (-3.1 m/s^2) * 4 s
v = 35 m/s - 12.4 m/s
v = 22.6 m/s

So, the velocity of the ball 4 seconds after it is thrown is 22.6 m/s.

(b) Now, to find out how much time elapses before the ball reaches the high point in its flight, we need to find the time it takes for the velocity to become zero. At the highest point, the velocity is momentarily zero before the ball starts to fall back down. We can use the equation v = u + at again, where v is 0 m/s, u is 35 m/s, a is -3.1 m/s^2, and t is the time. Plugging in the values, we get:

0 m/s = 35 m/s + (-3.1 m/s^2) * t
-35 m/s = (-3.1 m/s^2) * t
t = (-35 m/s) / (-3.1 m/s^2)
t ≈ 11.29 s

So, it takes approximately 11.29 seconds for the ball to reach the highest point in its flight.
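
If it helps, here is a small Python sketch that packages the two calculations above into helper functions; the names velocity_at and time_to_apex are just illustrative choices, not from the problem:

```python
def velocity_at(u, a, t):
    """Velocity after time t, for initial velocity u and constant acceleration a."""
    return u + a * t

def time_to_apex(u, a):
    """Time at which the velocity drops to zero (the top of the flight)."""
    return -u / a

u, a = 35.0, -3.1        # m/s and m/s^2 on this planet

print(round(velocity_at(u, a, 4.0), 2))   # 22.6  -> part (a)
print(round(time_to_apex(u, a), 2))       # 11.29 -> part (b)
```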

To find the velocity of the ball 4 seconds after it is thrown, we can use the equation of motion:

v = u + at

where:
v = final velocity (unknown)
u = initial velocity = 35 m/s
a = acceleration = -3.1 m/s^2 (negative because it is in the opposite direction to the motion)
t = time = 4 seconds

Substituting the given values into the equation, we have:

v = 35 m/s + (-3.1 m/s^2) * 4 s

Now, let's calculate the velocity:

v = 35 m/s - 12.4 m/s
v = 22.6 m/s

So, the velocity of the ball 4 seconds after it is thrown is 22.6 m/s.
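
As a cross-check of part (a), the same substitution can be done symbolically. This sketch assumes the SymPy library is available; it is not part of the original problem:

```python
import sympy as sp

u, a, t = sp.symbols('u a t')
v = u + a * t                     # the equation of motion v = u + at

# substitute u = 35 m/s, a = -3.1 m/s^2 (written as an exact fraction), t = 4 s
v_at_4s = v.subs({u: 35, a: sp.Rational(-31, 10), t: 4})
print(float(v_at_4s))             # 22.6
```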

To find the time it takes for the ball to reach the high point in its flight, we need to consider that at the highest point, the velocity becomes zero.

Using the equation:

v = u + at

At the highest point, v = 0 m/s and a = -3.1 m/s^2. Let t1 be the time it takes the ball to reach the high point.

0 = 35 m/s + (-3.1 m/s^2) * t1

Now, let's solve for t1:

-35 m/s = -3.1 m/s^2 * t1
t1 = -35 m/s / (-3.1 m/s^2)

t1 ≈ 11.29 seconds

Therefore, it takes approximately 11.29 seconds for the ball to reach the high point in its flight.
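
The solve step for t1 can also be checked symbolically; again this sketch assumes SymPy is available, and t1 is the same name used above:

```python
import sympy as sp

t1 = sp.symbols('t1', positive=True)
u = 35                            # initial velocity, m/s
a = sp.Rational(-31, 10)          # acceleration, m/s^2 (-3.1 as an exact fraction)

# 0 = u + a*t1 at the top of the flight
solution = sp.solve(sp.Eq(0, u + a * t1), t1)
print(solution)                   # [350/31]
print(round(float(solution[0]), 2))   # 11.29
```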