A clock loses 10 minutes each day. How many days will it take for the clock to show the right time again?

Required number of days: D = 12/h, where h is the number of hours the clock loses per day. (A 12-hour dial shows the correct time again once the total loss adds up to a full 12 hours.)

Here, 10 min = 10/60 h = 1/6 h.

Therefore, D = 12/(1/6) = 72 days
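
As a quick check, here is a minimal Python sketch of the same formula; the 12-hour threshold and the 10-minute daily loss come from the question, and the variable names are just illustrative:

```python
# Minimal sketch: D = 12 / h, where h is the number of hours lost per day.
minutes_lost_per_day = 10          # given in the question
h = minutes_lost_per_day / 60      # 10 min = 1/6 hour
D = 12 / h                         # a 12-hour dial realigns after 12 hours of total loss
print(D)                           # 72.0 days
```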

We know that 1 hour = 60 minutes, so losing 10 minutes each day means losing 10/60 = 1/6 hour per day.

Let x be the number of days needed. Then (1/6) × x = 12, so x = 72 days.

10 min = 1/6 hour lost per day

(1/6 hour/day) × x days = 12 hours

x = 6 × 12 = 72 days

To determine how many days it will take for the clock to indicate the right time, we need to work out how far behind the clock must fall before it lines up with the correct time again.

If the clock loses 10 minutes every day, then at the end of each day it is a further 10 minutes behind. So, if we let x represent the number of days, the total amount the clock is behind after x days is 10x minutes.

A 12-hour clock face shows the correct time again once it has fallen behind by a full cycle of the dial, that is, 12 hours = 12 × 60 = 720 minutes. So, we can set up the following equation:

10x = 720

Solving this equation, we find that x = 72.

Therefore, the clock will show the right time again after 72 days.
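
For readers who prefer to see the loss accumulate directly, here is a small day-by-day simulation sketch under the same assumptions (10 minutes lost per day, 12-hour dial); it reaches the same 72-day result:

```python
# Day-by-day sketch: count days until the accumulated loss equals a full 12-hour cycle.
MINUTES_LOST_PER_DAY = 10     # from the question
CYCLE_MINUTES = 12 * 60       # 720 minutes: a 12-hour dial repeats after this much error

behind = 0
days = 0
while behind < CYCLE_MINUTES:
    behind += MINUTES_LOST_PER_DAY
    days += 1

print(days)   # 72
```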