An airplane has a velocity of 185 miles per hour when it lands. How long must the runway be if the plane decelerates at 2.5 meters per second squared?

Are you taking a test? You posted this 6 min ago.

No, just homework. Why?

Bobpursley answered your question before too. I tried to attempt it, but my physics knowledge is limited.

To find the length of the runway, we need to work in consistent units. Since the deceleration is given in meters per second squared, let's convert the airplane's velocity to meters per second.

1 mile = 1609.34 meters (approximately)

To convert 185 miles per hour to meters per second, we multiply by the conversion factor:

185 miles/hour * 1609.34 meters/mile * 1 hour/3600 seconds ≈ 82.70 meters/second
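As a quick sanity check, here is a minimal Python sketch of the conversion (variable names are just illustrative; it uses the same rounded factor as above):

METERS_PER_MILE = 1609.34   # approximate conversion factor
SECONDS_PER_HOUR = 3600

v_mph = 185
v_ms = v_mph * METERS_PER_MILE / SECONDS_PER_HOUR
print(f"{v_mph} mph ≈ {v_ms:.2f} m/s")  # prints: 185 mph ≈ 82.70 m/s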

Now we have the initial velocity of the airplane as 82.70 meters/second and the deceleration as 2.5 meters/second^2.

We can use the equation of motion to find the length of the runway:

v^2 = u^2 + 2as

Where:
v = final velocity (0 meters/second when the airplane stops)
u = initial velocity (82.70 meters/second)
a = acceleration (-2.5 meters/second^2; negative because the plane is decelerating)
s = distance (runway length)

Plugging in the values, we have:

0 = (82.70)^2 + 2 * (-2.5) * s

Simplifying the equation:

0 = 6839.3 - 5s

Rearranging the equation:

5s = 6839.3

Dividing both sides by 5:

s ≈ 1367.9 meters

Therefore, the runway must be approximately 1368 meters long for the plane to come to a complete stop.
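If you want to verify the whole calculation in one go, here is a short Python sketch putting the steps together (a minimal check assuming the same rounded conversion factor; variable names are mine):

METERS_PER_MILE = 1609.34
SECONDS_PER_HOUR = 3600

u = 185 * METERS_PER_MILE / SECONDS_PER_HOUR  # initial speed, ~82.70 m/s
v = 0.0                                       # final speed when stopped, m/s
a = -2.5                                      # deceleration, m/s^2

# Rearranging v^2 = u^2 + 2*a*s for s:
s = (v**2 - u**2) / (2 * a)
print(f"Runway length ≈ {s:.0f} m")           # prints: Runway length ≈ 1368 m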