My watch gains 1 second each hour and my friend's watch loses 1.5 seconds each hour. Right now they show the same time. When will they show the same time again?

Whoever answers my question, please tell me how.

DrWLS explained the answer in this post this afternoon:

http://www.jiskha.com/display.cgi?id=1194714819

To determine when the two watches will show the same time again, we need to find how fast the two readings drift apart and how large that gap must become before the dials line up again.

Let's break it down step by step:

1. Find how fast the gap between the watches grows:
The first watch gains 1 second every hour and the second watch loses 1.5 seconds every hour, so the difference between their readings grows by 1 + 1.5 = 2.5 seconds every hour.

2. Find how large the gap has to become:
The watches will display the same time again when the gap has grown to one full lap of the dial. For ordinary 12-hour watch faces that is 12 hours, which is 12 * 60 * 60 = 43,200 seconds. (For 24-hour displays the gap would have to reach 86,400 seconds, and the answer below would double.)

3. Divide the required gap by the drift rate:
43,200 seconds / 2.5 seconds per hour = 17,280 hours.
17,280 hours / 24 = 720 days.

Therefore, the two watches will show the same time again 17,280 hours, or 720 days, from now.
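
If you want to double-check the arithmetic, here is a small Python sketch of the same calculation. It only uses the numbers from the steps above; the 12-hour (43,200-second) dial period is the assumption stated in step 2.

# Check how long until the watches agree again, assuming 12-hour dials.
gain_per_hour = 1.0                             # fast watch: +1 s each hour
loss_per_hour = 1.5                             # slow watch: -1.5 s each hour
drift_per_hour = gain_per_hour + loss_per_hour  # gap grows 2.5 s each hour

dial_period = 12 * 60 * 60                      # one lap of a 12-hour dial = 43,200 s

hours = dial_period / drift_per_hour            # 17,280 hours
days = hours / 24                               # 720 days
print(f"{hours:.0f} hours = {days:.0f} days")   # prints: 17280 hours = 720 days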