A spherical source radiates sound uniformly in all directions. At a distance of 12 m, the sound intensity level is 100 dB. At what distance from the source is the intensity level 40 dB?

When I set it up as 100 dB/40 dB = r²/(12 m)² and got r ≈ 18.97 m, I was told this answer was incorrect.

To solve this problem, we can use the inverse square law, which states that the intensity (I) of sound from a point source falls off as 1/r², where r is the distance from the source.

The formula for calculating the change in dB level given a change in intensity is:

ΔdB = 10 * log10(I2/I1)

Where ΔdB is the difference in decibels, I1 is the initial intensity, and I2 is the final intensity.
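As a quick sanity check of this formula, here is a minimal Python sketch (the function name and the sample intensities are illustrative, not from the problem):

```python
import math

def delta_db(i1, i2):
    """Decibel difference corresponding to an intensity change from i1 to i2."""
    return 10 * math.log10(i2 / i1)

# A millionfold drop in intensity corresponds to a 60 dB drop.
print(delta_db(1.0, 1e-6))  # ≈ -60.0
```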

In this case, we know that the initial intensity level is 100 dB and the final intensity level is 40 dB. We want to find the distance (r) at which the intensity level drops to 40 dB.

First, let's calculate the ratio of the final intensity (I2) to the initial intensity (I1):

Ratio = 10^((40 - 100)/10) = 10^(-60/10) = 10^(-6) = 1/1,000,000

Now relate this intensity ratio to a distance ratio. Since I ∝ 1/r², we have I2/I1 = (r1/r2)², so:

r2/r1 = √(I1/I2) = √(1,000,000) = 1,000

The intensity is a million times smaller at the second location, so the distance must be a thousand times larger, not smaller. Multiplying by the initial distance of 12 m:

r2 = 1,000 × 12 m = 12,000 m = 12 km

Therefore, at a distance of 12 km from the source, the sound intensity level would be 40 dB.

The error in your attempt was comparing the dB values directly as a ratio (100/40 = 2.5). Decibels are logarithmic, so the 60 dB difference must first be converted into an intensity ratio of 10^6 before applying the inverse square law.