An object is launched upward from the ground with an initial vertical velocity of 40 feet per second. After how many seconds does the object reach a height of 25 feet? I need to use the vertical motion model. Here is my work so far.

h = -17t^2 + 40t + 25

I meant -16t^2

your height equation should be

h = -16t^2 + 40t
h is supposed to have a value of 25
25 = -16t^2 + 40t
16t^2 - 40t + 25 = 0, and the left side is a perfect square
(4t - 5)^2 = 0
4t - 5 = 0
4t = 5
t = 5/4

it will take 5/4 seconds
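The factoring above can be double-checked with the quadratic formula; here is a minimal sketch in Python, using the coefficients of 16t^2 - 40t + 25 = 0:

```python
# Check 16t^2 - 40t + 25 = 0 with the quadratic formula.
a, b, c = 16, -40, 25

disc = b * b - 4 * a * c   # discriminant b^2 - 4ac
print(disc)                # 0, so the trinomial is a perfect square

t = -b / (2 * a)           # the single repeated root
print(t)                   # 1.25, i.e. 5/4 seconds
```

A discriminant of zero confirms there is exactly one time at which the height is 25 feet.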

I don't know what "vertical motion model" means, never heard that term

The "vertical motion model" is the standard formula for the height of a projectile, in feet, after t seconds:

h = -16t^2 + v0*t + h0

Here v0 is the initial vertical velocity in feet per second and h0 is the initial height in feet. (The -16 is half the acceleration due to gravity, 32 ft/s^2.)

In this problem the object is launched from the ground, so h0 = 0, and v0 = 40 ft/s:

h = -16t^2 + 40t

To find the time when the height is 25 feet, set h equal to 25:

25 = -16t^2 + 40t

Move every term to one side to put the equation in standard quadratic form:

16t^2 - 40t + 25 = 0

The left side factors as a perfect square:

(4t - 5)^2 = 0

So 4t - 5 = 0, which gives:

4t = 5
t = 5/4

The object reaches a height of 25 feet after 5/4 = 1.25 seconds. Because the equation has a double root, that height is reached exactly once: 25 feet is the maximum height, the instant the object stops rising and starts to fall back down.
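A quick way to see why there is only one answer: under the model h = -16t^2 + 40t, a height of 25 feet is exactly the peak of the flight. Sampling the height around t = 1.25 shows this:

```python
# Height (feet) under the vertical motion model with v0 = 40 ft/s, h0 = 0.
def h(t):
    return -16 * t**2 + 40 * t

for t in (1.0, 1.25, 1.5):
    print(t, h(t))
# 1.0  -> 24.0
# 1.25 -> 25.0  (maximum height: the double root of 25 = h(t))
# 1.5  -> 24.0
```

Times slightly before and after t = 1.25 both give heights below 25 feet, so the object touches 25 feet only at the top of its arc.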