A hot-air balloon is descending at a rate of 1.5 when a passenger drops a camera. If the camera is 41 above the ground when it is dropped, how long does it take for the camera to reach the ground?

If you do not indicate units, the answers given, if any, would be meaningless.

g=32.2 ft/s² = 9.8 m/s²

We cannot guess what units you're using.

In any case, there is a very similar problem posted recently, to which you piggy-backed a question that remains unanswered.

http://www.jiskha.com/display.cgi?id=1284689510

We'll be glad to comment on your answers.

Assuming SI units (a descent rate of 1.5 m/s and a height of 41 m), note that the camera does not fall at a constant speed. When released, it keeps the balloon's downward velocity of 1.5 m/s and then accelerates under gravity, so we use the kinematic equation for distance fallen:

d = v₀t + ½gt²

Here d = 41 m, v₀ = 1.5 m/s, and g = 9.8 m/s². Let "t" be the time in seconds it takes the camera to reach the ground. Substituting gives:

41 = 1.5t + 4.9t²

Rearranging into standard quadratic form:

4.9t² + 1.5t − 41 = 0

Applying the quadratic formula and keeping the positive root (time must be positive):

t = [−1.5 + √(1.5² + 4 × 4.9 × 41)] / (2 × 4.9) = (−1.5 + √805.85) / 9.8 ≈ 2.74

Therefore, it takes approximately 2.7 seconds for the camera to reach the ground. (A constant-velocity calculation, 41/1.5 ≈ 27.33 s, would only apply if the camera kept descending at 1.5 m/s with no gravity, which is not the case once it is dropped.)
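As a quick numerical check of the steps above, here is a short Python sketch that solves the same quadratic, assuming SI units (1.5 m/s, 41 m, g = 9.8 m/s²):

```python
import math

# Known quantities (assumed SI units: metres and seconds)
v0 = 1.5   # initial downward speed of the camera (the balloon's descent rate), m/s
h = 41.0   # height above the ground when the camera is released, m
g = 9.8    # acceleration due to gravity, m/s^2

# Distance fallen: h = v0*t + 0.5*g*t^2  ->  0.5*g*t^2 + v0*t - h = 0
a, b, c = 0.5 * g, v0, -h

# Quadratic formula; keep the positive root, since time must be positive
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
print(round(t, 2))  # approximately 2.74 seconds
```

If the original numbers were meant to be in feet, the same script works with h = 41 ft and g = 32.2 ft/s², which is why stating the units matters.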