By international agreement, the nautical mile is now defined as exactly 1852 meters. By what percentage does this current definition differ from the original definition?

Define "original." My Bowditch (1966) defines the nautical mile as the British "sea mile," or 1852.3 m.

There were others before that. http://en.wikipedia.org/wiki/Nautical_mile

To determine the percentage difference between the current definition of a nautical mile (1852 meters) and the original definition, we need to calculate the absolute difference between the two values and express it as a percentage of the original definition.

The original definition of a nautical mile was based on the circumference of the Earth and varied slightly depending on the measurement method used. A commonly cited historical definition is that 1 nautical mile equals 1 arc minute of latitude along a meridian.

To calculate the percentage difference:

1. Find the difference between the original and current definitions:
Difference = Original Definition – Current Definition

Because the problem does not state a numeric original value, we use the arc-minute definition: one arc minute of a meridian is the Earth's circumference divided by 360 × 60 = 21,600.

Taking a mean Earth radius R ≈ 6371 km:

Original ≈ 2πR / 21,600 = 2π × 6,371,000 m / 21,600 ≈ 1853.2 m

Difference = 1853.2 m − 1852 m ≈ 1.2 m

2. Calculate the percentage difference:
Percentage Difference = (Difference / Original Definition) × 100

Percentage Difference ≈ (1.2 m / 1853.2 m) × 100 ≈ 0.07%

So the current 1852 m definition is about 0.07% shorter than the arc-minute definition. (Compared with Bowditch's 1852.3 m sea mile, the difference is only about 0.016%.)
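The arithmetic can be checked with a short script. This is a sketch that assumes a spherical Earth with a mean radius of 6371 km for the arc-minute definition; the historical value varied with the measurement method.

```python
import math

# Current definition, fixed by international agreement: exactly 1852 m.
CURRENT_NM_M = 1852.0

# Assumed original definition: one arc minute of a meridian on a
# spherical Earth with mean radius 6,371 km (an assumption; historical
# measurements of the Earth's size varied).
EARTH_RADIUS_M = 6_371_000.0
ARC_MINUTES_PER_CIRCLE = 360 * 60  # 21,600 arc minutes in a full circle

# Length of one arc minute of the meridian = circumference / 21,600.
original_nm_m = 2 * math.pi * EARTH_RADIUS_M / ARC_MINUTES_PER_CIRCLE

# Percentage difference relative to the original definition.
pct_diff = (original_nm_m - CURRENT_NM_M) / original_nm_m * 100

print(f"original definition ~ {original_nm_m:.1f} m")  # ~ 1853.2 m
print(f"percentage difference ~ {pct_diff:.2f} %")     # ~ 0.07 %
```

Using a slightly different Earth radius (e.g. 6370 km) shifts the answer to roughly 0.05%, which is why textbooks quote slightly different figures.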