posted by SC on Wednesday, January 6, 2010 at 8:24pm.

If a ball is thrown into the air with a velocity of 40 ft/s, its height in feet t seconds later is given by y = 40t - 16t^2. Find the average velocity for the time period beginning when t = 2 and lasting 0.5 seconds.

I assume the ball is thrown straight up, so the velocity has only one dimensional component. The average velocity over an interval is the average slope of y over that interval:

m = (y2 - y1) / (t2 - t1)

average v = [y(2.5) - y(2)] / (2.5 - 2)

Since y(2) = 40(2) - 16(2)^2 = 16 and y(2.5) = 40(2.5) - 16(2.5)^2 = 0, the average velocity is (0 - 16) / 0.5 = -32 ft/s. The negative sign means the ball is falling during this interval.
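A quick sketch in Python to check the arithmetic (the function names y and avg_velocity are my own, not from the problem):

```python
def y(t):
    """Height in feet of the ball t seconds after launch: y = 40t - 16t^2."""
    return 40 * t - 16 * t ** 2

def avg_velocity(t0, t1):
    """Average velocity (slope of the secant line) over [t0, t1]."""
    return (y(t1) - y(t0)) / (t1 - t0)

print(avg_velocity(2.0, 2.5))  # -32.0 ft/s
```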

Ohh, I see it now! Thank you!