The speed of sound in air (m/s) depends on temperature according to the approximate expression

v = 331.5 + 0.607 Tc

Tc = temperature in Celsius

In dry air, the temperature decreases by about 1 degree Celsius for every 150 m rise in altitude.

a) Assume the change is constant up to an altitude of 9000 m.
What time interval is required for sound from an airplane flying at 9000 m to reach the ground on a day when the ground temperature is 30 degrees Celsius?
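For concreteness, the two given relations can be written as tiny Python helpers (a sketch; the function names are mine):

```python
def sound_speed(Tc):
    """Speed of sound in air (m/s) at temperature Tc (degrees Celsius)."""
    return 331.5 + 0.607 * Tc

def temperature_at(altitude_m, T_ground):
    """Air temperature at altitude, assuming a 1 deg C drop per 150 m."""
    return T_ground - altitude_m / 150.0

# Speed of sound at the airplane's altitude on a 30 deg C day
print(round(sound_speed(temperature_at(9000, 30)), 2))
```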


I tried to set up an equation for the problem and came up with

v = 331.5 + 0.607 (Tc - d/150)

and then I was thinking of plugging into

t = d/v after finding v

b) Compare your answer with the time interval required if the air were uniform at 30 degrees Celsius. Which is longer?

I think I would just plug Tc = 30 degrees Celsius into the original equation

v = 331.5 + 0.607 Tc

and then put v into t = d/v as well, then compare them.

Is this alright?

Thanks

When flying at 9000 m, the sound speed at altitude is 331.5 - 0.607*(30 - 9000/150) = 331.5 - 18.2 = 331.5 m/s. On the ground the sound speed is 331.5 + 0.607*30 = 349.7 m/s. The average speed of sound for a sound wave from plane to ground, when Tground = 30 C, is

(331.5 + 349.7)/2 = 340.6 m/s.
The answer is 9000 m / 340.6 m/s = 26.42 s

For (b), forget about the decreasing T and sound speed with altitude. The sound speed all the way is
331.5 + 0.607*30 = 349.7 m/s

The sound will get to the ground in less time in this case

Time T = integral of dt = integral of dh/v from height h1 to height h2.

If v is a linear function of the height, evaluating the integral gives the formula:

T = (h2 - h1)/(v1 - v2) ln(v1/v2)

In the case of this problem:

v1 = 349.71 m/s

v2 = 313.29 m/s

h1 = 0 m

h2 = 9000 m

So, T = 27.18 s
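As a quick numerical cross-check of that formula, you can compare the closed form against a brute-force midpoint Riemann sum (a Python sketch; variable names are mine):

```python
import math

v1 = 349.71   # sound speed at the ground (m/s)
v2 = 313.29   # sound speed at 9000 m altitude (m/s)
h1, h2 = 0.0, 9000.0

# Closed form for linear v(h): T = (h2 - h1)/(v1 - v2) * ln(v1/v2)
T_exact = (h2 - h1) / (v1 - v2) * math.log(v1 / v2)

# Brute-force midpoint Riemann sum of dh / v(h)
n = 100_000
dh = (h2 - h1) / n
T_num = 0.0
for i in range(n):
    h = h1 + (i + 0.5) * dh                    # midpoint of each slab
    v = v1 + (v2 - v1) * (h - h1) / (h2 - h1)  # linear v(h)
    T_num += dh / v

print(round(T_exact, 2), round(T_num, 2))  # both come out to about 27.18
```

The two agree, which confirms the log formula.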

I'm confused as to why the book says it takes 27.2 s for the wave to reach the ground for part a)

Thanks

And I'm also confused about why you subtracted when the equation uses addition:

v= 331.5 + .607 (Tc- d/150)

Thanks drwls

I wrote two negatives by mistake at one point. The temperature at flight altitude is -30 C, so the final speed of sound should be right. If I don't agree with the book's answer, I may have made another error in math somewhere. Sometimes the books are wrong.

Count Iblis has found my mistake. I should not have simply averaged the velocity to get the time. Velocity appears in the denominator when you calculate elapsed time, so integrating to get the elapsed time gives a log term.

Thanks very much drwls and Count Iblis =D

Yes, you're on the right track. Let me break down how to solve this problem step by step.

First, let's tackle part a).

We know that the speed of sound in air depends on temperature, given by the equation v = 331.5 + 0.607Tc, where Tc is the temperature in Celsius.

We also know that the temperature decreases by 1 degree Celsius for every 150 m rise in altitude. So, assuming a constant decrease, at an altitude of 9000 m the temperature has dropped by 9000/150 = 60 degrees Celsius, putting it at 30 - 60 = -30 degrees Celsius.

In the equation v = 331.5 + 0.607(Tc - d/150), Tc is the ground temperature and d is the altitude in metres, so Tc - d/150 is the temperature at altitude d. At the airplane's altitude of 9000 m:

v = 331.5 + 0.607(30 - 60)
= 331.5 + 0.607(-30)
= 331.5 - 18.21
= 313.29 m/s

So the speed of sound increases linearly from 313.29 m/s at the airplane to 349.71 m/s at the ground. Because the speed varies along the path, we can't just divide the distance by a single speed; a good approximation is to use the average of the two speeds:

v_avg = (313.29 + 349.71)/2 = 331.5 m/s

t = 9000 / 331.5
= 27.15 s, or about 27.2 seconds

(The exact integration of dt = dh/v, as Count Iblis showed, gives 27.18 s, which also rounds to 27.2 s.)

Therefore, it takes approximately 27.2 seconds for the sound from the airplane flying at 9000 m to reach the ground on a day when the ground temperature is 30 degrees Celsius.

Moving on to part b),

To compare with the time interval required if the air were uniform at 30 degrees Celsius, we can use the original equation v = 331.5 + 0.607Tc. Plugging in Tc = 30 degrees Celsius:

v = 331.5 + 0.607(30)
= 331.5 + 18.21
= 349.71 m/s

Now, we can use the same formula t = d/v, with d = 9000 m:

t = 9000 / 349.71
= 25.7 seconds (rounded to one decimal place)

Therefore, it would take approximately 25.7 seconds for the sound to reach the ground if the air were uniformly at 30 degrees Celsius.

Comparing the two time intervals, the time is longer when the temperature decreases with altitude (about 27.2 seconds) than when the air is uniformly at 30 degrees Celsius (about 25.7 seconds), because the sound spends much of its trip in colder, slower air.

I hope this explanation helps! Let me know if you have any further questions.
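Putting both parts into one short Python sketch (the variable and function names are mine) makes the comparison easy to check:

```python
import math

T_ground = 30.0      # ground temperature, degrees Celsius
altitude = 9000.0    # airplane altitude, m

def sound_speed(Tc):
    """Speed of sound in air (m/s) at temperature Tc (degrees Celsius)."""
    return 331.5 + 0.607 * Tc

v_ground = sound_speed(T_ground)                   # about 349.71 m/s
v_top = sound_speed(T_ground - altitude / 150.0)   # about 313.29 m/s

# Part (a): v varies linearly with height, so t = integral of dh/v
t_a = altitude / (v_ground - v_top) * math.log(v_ground / v_top)

# Part (b): uniform air at the ground temperature
t_b = altitude / v_ground

print(f"part (a): {t_a:.2f} s, part (b): {t_b:.2f} s")
# part (a) takes longer: the upper air is colder, so sound travels slower there
```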

I'm sorry. I know this question was asked a while ago, but I'm having trouble figuring out how you did your integration of dt

T = integral of dt = integral of dh/v from h1 to h2 (from 0 to 9000m)
did you use substitution dh = t dv?
Thanks!
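No substitution like dh = t dv is needed. Since v is linear in h, write v(h) = v1 + k(h - h1) with k = (v2 - v1)/(h2 - h1); then the integral is a standard logarithm. A sketch of the steps:

```latex
T = \int_{h_1}^{h_2} \frac{dh}{v(h)}
  = \int_{h_1}^{h_2} \frac{dh}{v_1 + k\,(h - h_1)}
  = \frac{1}{k}\Big[\ln\bigl(v_1 + k\,(h - h_1)\bigr)\Big]_{h_1}^{h_2}
  = \frac{h_2 - h_1}{v_2 - v_1}\,\ln\frac{v_2}{v_1}
  = \frac{h_2 - h_1}{v_1 - v_2}\,\ln\frac{v_1}{v_2}
```

With v1 = 349.71 m/s, v2 = 313.29 m/s, and h2 - h1 = 9000 m, this gives T ≈ 27.18 s.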