You throw a plate off your roof with a speed of 10.0 m/s. It hits the ground 4.00 seconds later. How far did it fall?

What equation do I use? Please explain the variables!

To determine how far the plate falls before hitting the ground, you can use the kinematic equation for vertical motion with constant acceleration:

d = v₀t + (1/2)gt²

Where:
- d is the distance traveled (which we want to find)
- v₀ is the initial vertical velocity (10.0 m/s, taking the downward direction as positive and assuming the plate is thrown straight down)
- t is the time elapsed (which is 4.00 seconds)
- g is the acceleration due to gravity (which is approximately 9.8 m/s²)

Substituting the given values into the equation, we can calculate the distance traveled:

d = (10.0 m/s)(4.00 s) + (1/2)(9.8 m/s²)(4.00 s)²
d = 40.0 m + (1/2)(9.8 m/s²)(16.0 s²)
d = 40.0 m + 78.4 m
d = 118.4 m

Therefore, the plate falls 118.4 meters before it hits the ground.
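
If you want to double-check the arithmetic, the same calculation can be sketched in a few lines of Python (variable names are my own, and the downward direction is taken as positive, matching the assumptions above):

```python
# Kinematic equation d = v0*t + (1/2)*g*t^2,
# assuming the plate is thrown straight down.
v0 = 10.0   # initial downward speed, m/s
t = 4.00    # time until impact, s
g = 9.8     # acceleration due to gravity, m/s^2

d = v0 * t + 0.5 * g * t**2
print(f"d = {d:.1f} m")   # d = 118.4 m
```

This also makes it easy to see how each term contributes: the first term (40.0 m) is how far the plate would travel at constant speed, and the second term (78.4 m) is the extra distance gained from gravity's acceleration.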