An airplane climbs 100 feet during the first second of takeoff. In each succeeding second it climbs 100 feet more than during the previous second. How many seconds does it take for the plane to reach an altitude of 12000 feet?

100s = 12000

Solve for s.

To find the number of seconds it takes the plane to reach an altitude of 12000 feet, we need to model its climb carefully. Note that 100s = 12000 would only be the right equation if the plane climbed a constant 100 feet every second; here the climb increases each second.

In the given scenario, during the first second, the plane climbs 100 feet. In each succeeding second, it climbs 100 feet more than during the previous second.

The climb during each second forms an arithmetic sequence, where the first term (a₁) is 100 feet and the common difference (d) is also 100 feet. The general formula for the nth term of an arithmetic sequence is:

aₙ = a₁ + (n - 1) * d

Here, aₙ is the distance climbed during the nth second (not the altitude), a₁ is the first term, n is the term number, and d is the common difference. Substituting a₁ = 100 and d = 100 gives aₙ = 100n.
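To make that distinction concrete, here is a minimal Python sketch (the variable names are my own) that tabulates the climb during each second alongside the running altitude:

```python
# Climb during second n is a_n = 100n; altitude is the running total.
altitude = 0
for n in range(1, 6):
    climb = 100 * n       # feet climbed during second n
    altitude += climb     # altitude after n seconds
    print(n, climb, altitude)
# 1 100 100
# 2 200 300
# 3 300 600
# 4 400 1000
# 5 500 1500
```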

The altitude after n seconds is the total of all climbs so far, that is, the sum of the first n terms of the sequence:

Sₙ = n/2 * (a₁ + aₙ) = n/2 * (100 + 100n) = 50n(n + 1)

We want to find the number of seconds n at which the altitude Sₙ reaches 12000 feet, so we set up the equation:

50n(n + 1) = 12000

Dividing both sides by 50:

n(n + 1) = 240

Expanding and rearranging into standard form gives a quadratic:

n² + n - 240 = 0
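As a quick sanity check on the closed form before solving, a small Python sketch (the helper name altitude_after is mine, not from the problem) reproduces the running totals tabulated above:

```python
def altitude_after(n: int) -> int:
    """Altitude after n whole seconds: S_n = 50 * n * (n + 1)."""
    return 50 * n * (n + 1)

# Matches the step-by-step running totals computed earlier.
assert altitude_after(1) == 100
assert altitude_after(2) == 300
assert altitude_after(5) == 1500
```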

Now, let's solve for n by factoring (we need two numbers that multiply to -240 and differ by 1):

(n - 15)(n + 16) = 0

So n = 15 or n = -16. Since n counts seconds, only the positive root makes sense:

n = 15

Therefore, it takes the plane 15 seconds to reach an altitude of 12000 feet. As a check, the total climb is 100 + 200 + ... + 1500 = 50 * 15 * 16 = 12000 feet.
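Finally, a brute-force Python simulation of the climb, second by second, confirms the answer independently of the algebra:

```python
# Climb 100n feet during second n until reaching 12000 feet.
altitude, n = 0, 0
while altitude < 12000:
    n += 1
    altitude += 100 * n
print(n, altitude)  # 15 12000
```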