Rajesh, a money lender, lends in the following manner: for the first year he charges 2% p.a., for the second year he charges 4% p.a., for the third year he charges 6% p.a., and so on. If he lends a sum in this way at simple interest, find the least integral number of years in which it will fetch interest at least equal to the sum itself.

To find the least integral number of years in which the interest will be at least equal to the sum lent, we can calculate the interest earned for each year until it meets or exceeds the sum lent.

Let's assume the sum lent is P.

In the first year, the interest is calculated at a rate of 2%, so the interest earned is 2% of P, which is 0.02P.

In the second year, the rate is 4%. Since this is simple interest, each year's interest is computed on the original principal P alone (it never compounds), so the interest earned is 0.04P.

In the third year, the rate is 6%, so the interest earned is 0.06P.

In general, the interest earned in the nth year is (2n/100) * P, so the total interest after n years is:

0.02P * (1 + 2 + 3 + ... + n)

Using the formula for the sum of the first n natural numbers, 1 + 2 + ... + n = n(n+1)/2, the total becomes:

Total interest = 0.02P * n(n+1)/2 = 0.01 * n(n+1) * P

Now, set up the inequality requiring the interest to be at least equal to the sum lent:

0.01 * n(n+1) * P >= P

Divide both sides by 0.01P:

n(n+1) >= 100

Check successive integers:

For n = 9: 9 * 10 = 90, which is less than 100, so nine years is not enough.
For n = 10: 10 * 11 = 110, which is at least 100.

Since n must be an integer, the least integral number of years required for the interest earned to be at least equal to the sum lent is 10.

Therefore, the answer is 10 years.
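The year-by-year schedule can be checked with a short loop (a minimal sketch; P = 1 is an assumed illustrative principal, and any positive value gives the same year count):

```python
# Simple-interest schedule: year n earns 2n% of the original principal P,
# never compounding. Find the first n where accumulated interest >= P.
P = 1.0
total_interest = 0.0
n = 0
while total_interest < P:
    n += 1
    total_interest += 0.02 * n * P  # year n is charged at 2n% of P

print(n)  # → 10
```

At n = 9 the accumulated interest is 0.90P, still short of P; the tenth year's 20% charge pushes the total to 1.10P.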