Help please, this question is driving me crazy.

Suppose there is a 2-degree drop in temperature for every thousand feet that an airplane climbs into the sky. If the temperature on the ground is 65 degrees, what would the temperature be when the plane reaches an altitude of 23,000 ft?

23000/1000 = 23

23*2 = 46, a 46-degree drop
therefore,
65-46 = 19 degrees

so there, :)
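If you want to double-check the arithmetic, here's a quick Python sketch; the variable names are mine, just for illustration:

```python
# Sanity check: temperature falls 2 degrees per 1,000 ft of climb.
ground_temp = 65        # ground temperature, degrees
drop_per_1000ft = 2     # degrees lost per 1,000 ft of climb
altitude_ft = 23_000

temp_at_altitude = ground_temp - drop_per_1000ft * (altitude_ft / 1000)
print(temp_at_altitude)  # 19.0
```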

After diving to 90 m below sea level, a diver rises at a rate of 4 meters per minute for 8 minutes. Where is the diver in relation to the surface?

distance = rate*time

distance = 4*8 = 32 meters
therefore,
90-32 = 58 meters BELOW sea level

so there, :)
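Same kind of quick check for the diver, treating positions below sea level as negative (again, the names are mine, just for illustration):

```python
# Diver starts 90 m below the surface and rises 4 m/min for 8 min.
start_depth = -90   # negative = below sea level
rise_rate = 4       # meters risen per minute
minutes = 8

position = start_depth + rise_rate * minutes
print(position)  # -58, i.e. 58 meters below sea level
```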

To solve this problem, take the given temperature drop per thousand feet and multiply it by the number of thousand-foot increments the airplane climbs.

Given that there is a 2-degree drop in temperature for every thousand feet the airplane climbs, we can set up a proportion to solve for the temperature change.

The proportion can be set up as follows:
2 degrees / 1000 ft = x degrees / 23,000 ft.

To solve for x, cross-multiply:
2 degrees * 23,000 ft = 1000 ft * x degrees.

Simplifying:
46,000 degree-ft = 1000 ft * x degrees.

Divide both sides by 1000 ft to isolate x:
x = 46,000 degree-ft / 1000 ft = 46 degrees.

Now, to find the temperature when the airplane reaches an altitude of 23,000 ft, subtract 46 degrees from the starting temperature of 65 degrees.

65 degrees - 46 degrees = 19 degrees.

Therefore, the temperature when the plane reaches an altitude of 23,000 ft would be 19 degrees.
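The whole proportion collapses into one linear formula: total drop = (drop per step) * (altitude / step size). A minimal Python sketch of that idea, assuming a constant 2-degree drop per 1,000 ft (the function name and signature are mine, not from the thread):

```python
def temp_at_altitude(ground_temp, drop, per_ft, altitude_ft):
    """Linear model: lose `drop` degrees for every `per_ft` feet of climb."""
    x = drop * altitude_ft / per_ft   # same as the cross-multiplication step
    return ground_temp - x

print(temp_at_altitude(65, 2, 1000, 23_000))  # 19.0
```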