How much more energy is released in a 5.0 magnitude earthquake than a 4.0 magnitude earthquake?

I know that a tenfold increase in wave amplitude equals an increase of 1 on the scale, so 1 to 2 = 10 times

so does that mean that 2 to 3 would be 10 more times?

To compare the energy released in different magnitude earthquakes, it's important to understand that the Richter Scale uses a logarithmic scale. This means that a one-point increase in magnitude corresponds to a tenfold increase in the amplitude of seismic waves and roughly a 31.6 times increase in the energy released.

In your case, you are comparing a 5.0 magnitude earthquake with a 4.0 magnitude earthquake. Following the logarithmic scale, the seismic wave amplitude of a 5.0 magnitude earthquake is about 10 times greater than that of a 4.0 magnitude earthquake, but the energy released is larger by a bigger factor, as shown next.

To determine the exact difference in energy, you can calculate it using the formula:
Energy ratio = 10 ^ (1.5 * (magnitude difference))

For your example, the magnitude difference is 5.0 - 4.0 = 1. Plugging this into the formula, we get:
Energy ratio = 10 ^ (1.5 * 1) = 10 ^ 1.5 ≈ 31.6
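If it helps, here is a quick sketch of that formula in Python (the function name is my own choice, not standard):

```python
def energy_ratio(m1, m2):
    """Ratio of energy released between earthquakes of magnitude m1 and m2."""
    return 10 ** (1.5 * (m1 - m2))

# Each whole-magnitude step multiplies the energy by about 31.6
print(round(energy_ratio(5.0, 4.0), 1))  # 31.6
print(round(energy_ratio(6.0, 4.0), 1))  # 1000.0 (two steps: 31.6 * 31.6)
```

Note that two magnitude steps give 31.6 × 31.6 ≈ 1000, not 10 × 10 = 100, which is where the amplitude/energy distinction matters.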

Therefore, a 5.0 magnitude earthquake releases roughly 31.6 times more energy than a 4.0 magnitude earthquake.