I have been working on this problem but I am not sure I am doing it right.

During takeoff an airplane's angle of ascent is 18 degrees and its speed is 275 feet per second.

A. Find the plane's altitude after 1 minute.
B. How long will it take for the plane to climb to an altitude of 10,000 feet?

For a: 275 cos(18°), and I got an altitude of 261.5.
For b: I got 38 minutes.

In order to work out the altitude of the plane after one second, it is 275 sin(18°) (sine rather than cosine, as sine uses the opposite side divided by the hypotenuse), which is 84.98 feet. Therefore, after one minute it is 84.98 × 60, which is 5098.78 feet.

For part b I got 1 minute and 57.68 seconds:
(10000/sin 18°) divided by 275 gives the number of seconds.
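
If it helps, here is a quick sanity check of those figures in Python (just a sketch; the variable name is mine, and note that math.sin expects radians, so the angle has to be converted first):

```python
import math

# vertical speed = speed along the flight path * sin(angle of ascent)
v_up = 275 * math.sin(math.radians(18))

print(v_up)          # ≈ 84.98 ft/s
print(v_up * 60)     # altitude after 1 minute ≈ 5098.78 ft
print(10000 / v_up)  # time to reach 10,000 ft ≈ 117.68 s
```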

Let's break down each part of the problem and go through the steps to solve it correctly. Starting with part A:

To find the plane's altitude after 1 minute, you need to use trigonometry and the given angle of ascent. Here's how to do it:

Step 1: Convert the given angle from degrees to radians.
To convert degrees to radians, remember that π radians is equal to 180 degrees. Divide the given angle (18 degrees) by 180 and then multiply by π to obtain the equivalent angle in radians.

18 degrees = (18/180) * π = 0.1π radians
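
If you do this step on a computer rather than by hand, the standard library handles the conversion for you; for example, a minimal Python check (nothing here beyond the built-in math module):

```python
import math

print(math.radians(18))  # 0.3141592653589793
print(0.1 * math.pi)     # 0.3141592653589793, i.e. 0.1π radians
```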

Step 2: Use trigonometry to find the vertical component of the plane's speed.
The vertical component can be found by multiplying the total speed (275 feet per second) by the sine of the given angle (18 degrees). Note that it must be the sine, not the cosine: the altitude gained is the side opposite the 18-degree angle, so multiplying by the cosine (as in your attempt) gives the horizontal component instead. So the vertical component is:

275 * sin(18 degrees) = 275 * sin(0.1π radians) ≈ 84.98 feet per second

Step 3: Multiply the vertical component of the speed by the time elapsed (1 minute = 60 seconds) to find the altitude.

Altitude = Vertical component of speed * Time elapsed
Altitude = 84.98 feet per second * 60 seconds ≈ 5,099 feet

Therefore, the correct answer for part A is that the plane's altitude after 1 minute is approximately 5,099 feet.
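
The same calculation written out as a short Python sketch (the variable names are just for illustration):

```python
import math

speed = 275               # ft/s along the flight path
angle = math.radians(18)  # angle of ascent, converted to radians

vertical_speed = speed * math.sin(angle)    # ft/s gained in altitude
altitude_after_1_min = vertical_speed * 60  # 60 seconds in one minute

print(vertical_speed)        # ≈ 84.98 ft/s
print(altitude_after_1_min)  # ≈ 5098.78 ft
```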

Now let's move on to part B:

To determine how long it takes for the plane to climb to an altitude of 10,000 feet, we need to use similar concepts and calculations.

Step 1: Set up the trigonometric equation based on the given information.
We know that the vertical displacement (change in altitude) is 10,000 feet. We need to find the time it takes for the plane to reach this altitude. Using trigonometry, we can set up the equation:

Vertical displacement = Vertical component of speed * Time elapsed

10,000 feet = 275 * sin(18 degrees) * Time elapsed

Step 2: Solve for Time elapsed.
Divide both sides of the equation by 275 * sin(18 degrees) to isolate the Time elapsed:

Time elapsed = 10,000 feet / (275 * sin(18 degrees))

Calculating this expression, we find:

Time elapsed ≈ 117.68 seconds

Therefore, the correct answer for part B is that it will take approximately 117.7 seconds, or just under 2 minutes (about 1 minute 57.7 seconds), for the plane to climb to an altitude of 10,000 feet.
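
And the corresponding one-line check for part B (again, only a sketch using the math module):

```python
import math

vertical_speed = 275 * math.sin(math.radians(18))  # ≈ 84.98 ft/s
print(10000 / vertical_speed)                      # ≈ 117.68 s, just under 2 minutes
```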

It seems there might have been some errors in your previous calculations. By following these steps, you should arrive at the correct answers.

Let's break down the problem step by step to verify your calculations:

A. Finding the plane's altitude after 1 minute:
To find the altitude after 1 minute, we need to calculate the vertical component of the plane's velocity. This can be done by finding the sine of the angle of ascent and multiplying it by the speed.

Vertical component of velocity = speed × sin(angle of ascent)
Vertical component of velocity = 275 × sin(18°)

Calculating this value:
Vertical component of velocity = 275 × 0.3090
Vertical component of velocity ≈ 84.98 feet per second

To find the altitude after 1 minute, we multiply the vertical component of velocity by 60 (since there are 60 seconds in 1 minute):

Altitude = Vertical component of velocity × time
Altitude = 84.98 × 60
Altitude ≈ 5099 feet

Therefore, after 1 minute, the plane's altitude would be approximately 5099 feet.
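
As a side note, the answer is not sensitive to how far you round sin(18°); a quick sketch comparing the three-decimal value with full precision:

```python
import math

print(275 * 0.309)                       # 84.975 ft/s with the rounded sine
print(275 * math.sin(math.radians(18)))  # 84.9797... ft/s at full precision
```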

B. Finding the time it takes for the plane to climb to an altitude of 10,000 feet:
To find the time it takes for the plane to reach 10,000 feet, we need to rearrange the formula we used in part A to solve for time:

Altitude = Vertical component of velocity × time

Rearranging the formula:
Time = Altitude / Vertical component of velocity

Substituting the given values:
Time = 10000 / 84.98
Time ≈ 117.68 seconds

Since there are 60 seconds in 1 minute, we can convert the time to minutes by dividing by 60:

Time ≈ 117.68 / 60
Time ≈ 1.96 minutes

Therefore, it will take approximately 1.96 minutes (or about 2 minutes) for the plane to climb to an altitude of 10,000 feet.
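
If you prefer the answer in minutes and seconds rather than decimal minutes, divmod does that split directly (a small sketch):

```python
import math

time_seconds = 10000 / (275 * math.sin(math.radians(18)))
minutes, seconds = divmod(time_seconds, 60)

print(time_seconds / 60)  # ≈ 1.96 minutes
print(minutes, seconds)   # 1.0 57.68... -> 1 minute 57.7 seconds
```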

So, your answer for part A is correct, and for part B, it would take approximately 2 minutes for the plane to climb to an altitude of 10,000 feet.