If the distance stays the same but the time increases, what happens to the speed?

Doesn't the speed decrease?

If I drive 100 miles in 2 hours, my average speed is 50 mph.

But if I drive 100 miles in 2.5 hours, my average speed is 40 mph.
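
If it helps, here's a quick Python check of those numbers (just dividing distance by time, with the figures from above):

```python
# Average speed = distance / time
distance_miles = 100

print(distance_miles / 2.0)   # 50.0 mph
print(distance_miles / 2.5)   # 40.0 mph
```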

If the distance remains constant while the time increases, the object is taking longer to cover the same distance, so its speed decreases.

To understand this relationship, you can use the formula for speed:

Speed = Distance / Time

Let's say the distance is 100 meters and it initially takes 10 seconds to cover that distance, giving a speed of:

Speed = 100 m / 10 s = 10 m/s

Now, if the time increases to 20 seconds, with the same distance of 100 meters, the speed will be:

Speed = 100 m / 20 s = 5 m/s

As you can see, when the time increases while the distance stays the same, the speed decreases: the object simply takes longer to cover the same distance.
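
Here's a minimal Python sketch of that relationship (the helper name, the fixed 100 m distance, and the list of times are just illustrative):

```python
def speed(distance_m, time_s):
    """Average speed in m/s for a given distance and elapsed time."""
    return distance_m / time_s

distance = 100  # metres, held constant

for t in (10, 20, 40):  # increasing travel times
    print(f"time = {t:2d} s -> speed = {speed(distance, t):.1f} m/s")

# time = 10 s -> speed = 10.0 m/s
# time = 20 s -> speed = 5.0 m/s
# time = 40 s -> speed = 2.5 m/s
```

Doubling the time at a fixed distance halves the speed, which is just the formula Speed = Distance / Time in action.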