How much does the amplitude of the waves increase between an earthquake that measures 4.2 on the Richter scale and an earthquake that measures 6.2 on the Richter scale?

To understand how much the amplitude of the waves increases between two earthquakes, we first need to understand the Richter scale itself. The Richter scale is a base-10 logarithmic scale in which the magnitude is derived from the logarithm of the amplitude of the seismic waves recorded on a seismograph. Each whole-number increase in magnitude represents a tenfold increase in the amplitude of the seismic waves and roughly 31.6 times (10^1.5) more energy released.

In your question, we are comparing an earthquake measuring 4.2 on the Richter scale with one measuring 6.2. The difference in magnitude between the two earthquakes is 6.2 - 4.2 = 2.

Since each whole-number increase on the Richter scale represents a tenfold increase in amplitude, we can calculate the amplitude ratio using the formula:

Amplitude increase = 10 ^ (difference in Richter scale value)

Amplitude increase = 10 ^ 2

Amplitude increase = 100

Therefore, the amplitude of the waves increases by a factor of 100 between the two earthquakes.
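The calculation above can be sketched as a small Python function; the function name `amplitude_ratio` is illustrative, not a standard API:

```python
def amplitude_ratio(m1: float, m2: float) -> float:
    """Factor by which seismic wave amplitude increases from
    magnitude m1 to magnitude m2 on the Richter scale.

    Each whole-number step corresponds to a tenfold change in
    amplitude, so the ratio is 10 raised to the magnitude difference.
    """
    return 10 ** (m2 - m1)

# The two magnitudes from the question:
print(amplitude_ratio(4.2, 6.2))  # a factor of 100 (up to floating-point rounding)
```

Note that floating-point subtraction of 6.2 - 4.2 may not be exactly 2, so the printed result can differ from 100 by a tiny rounding error.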