An explosion releases 2500 J of energy in 110 ms, 1.5% of which goes into a spherical sound wave traveling outward equally in all directions from the explosion. A sound level meter, located 25 m from the center of the explosion, records the average value of the acoustic wave as it passes. What was the average dB level recorded by the meter? For simplicity, you may assume that the loss of acoustic power is negligible as the wave travels from the explosion to the meter.


I'm redoing this problem because I didn't get full credit on it. I know that I have to use the formulas I = P/(4πr^2) and dB = 10*log(I/I0). The way I did it was I said that P = 0.015*2500, but I guess this is wrong. My teacher marked that P should equal the change of energy over the change of x, but I'm not sure how to get it. Any help would be greatly appreciated.

(Side note: pressure has the same dimensions as energy density, though you don't need that fact here.)

Power = Energy/time = (0.015 * 2500 J) / 110 ms
= xxxxx watts.

Now, I = Power/area, where the area is that of a sphere of radius 25 m: A = 4πr^2.
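Putting those steps together, here's a quick numerical sketch (assuming the standard reference intensity I0 = 1e-12 W/m^2, which the problem doesn't state explicitly):

```python
import math

E_total = 2500.0      # total energy released, J
frac = 0.015          # fraction going into sound
t = 0.110             # duration, s (110 ms)
r = 25.0              # distance to meter, m
I0 = 1e-12            # assumed reference intensity, W/m^2

P = frac * E_total / t        # acoustic power, W (energy per unit TIME, not per 2500 J)
A = 4 * math.pi * r**2        # area of the spherical wavefront, m^2
I = P / A                     # intensity at the meter, W/m^2
dB = 10 * math.log10(I / I0)  # sound level in decibels

print(P, I, dB)               # ~340.9 W, ~0.0434 W/m^2, ~106.4 dB
```

The key point your teacher flagged: power is energy divided by the time over which it's released (37.5 J over 0.110 s), not the energy itself.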