The owner of a rental house can depreciate its value over a period of 27 1/2 years, meaning that the value of the house declines at an even rate over that period of time until the value is $0.

My questions are: by what fraction would the value of the house depreciate in the first year, and if the house is valued at $85,000, what is the value of the first year's depreciation?

If you want the house to depreciate to zero, there has to be a write-off value below which the house is deemed to be worth zero.

Mathematically, at a constant rate of depreciation, even at 10% per year, we will never get rid of the value of the house, even in a hundred years, because each year's value is always 90% of the previous year's value.

If you draw a graph of the value of the house against time, the value decreases gradually towards zero but never quite reaches it.

The x-axis (where y=0) is called a horizontal asymptote.
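As a side note, a few lines of Python make this behaviour concrete (only an illustrative sketch; the 10% rate and the $85,000 starting value are assumptions borrowed from the discussion, not part of the original question):

```python
# Declining-balance (constant-percentage) depreciation:
# each year the house keeps (1 - rate) of the previous year's value,
# so the value stays positive forever and only approaches zero.
start_value = 85_000.0   # assumed starting value
rate = 0.10              # assumed 10% annual rate, as in the example above

for years in (1, 10, 50):
    value = start_value * (1 - rate) ** years
    print(f"after {years:2d} years: ${value:,.2f}")

# Prints roughly $76,500.00, then $29,637.67, then $438.07:
# smaller and smaller, but never exactly zero.
```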

Wow, then why was that even asked as a problem?

Also, what about the second part of my question? If the house is valued at $85,000, what is the value of the first year's depreciation?

It depends on the rate of depreciation, which means that unless an answer is found for the first part, the second part cannot be answered either.

Try this:
If the house depreciates at 10% a year, every year the value of the house is 0.9 of the previous year's value.
After 100 years, the value of the house is
85000 * 0.9^100
≈ $2.26
Still not quite zero.
The key is the phrase "even rate". If your teacher interprets it differently, for example as the same amount every year, then the value can go down to zero after a certain number of years. But it says "rate", which suggests a proportion.
Sorry, I cannot go further than this. Perhaps someone else may have other ideas.
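As a quick check of the $2.26 figure above (a small sketch, nothing more):

```python
# Value remaining after 100 years of 10% declining-balance depreciation.
value_after_100_years = 85_000 * 0.9 ** 100
print(f"${value_after_100_years:.2f}")   # roughly $2.26, still not zero
```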

I understand. Thank you so much.

I just reread the question, and it seems to me to be more of a tax-related problem than a math problem.

This may mean that the depreciation permitted (by the IRS) is uniformly 1/27.5 of its original value (the purchase cost) every year.

So the fixed annual depreciation amount is 85000/27.5 ≈ $3,091, and after 27.5 years the rental house has no residual value left.

During this period, the owner can claim about $3,091 every year as part of the investment expenses, so this amount is deductible from taxable income from other sources. However, he will have to pay dearly at the end when he sells the house, because the full sale price can be considered a capital gain, and is therefore taxable.

It's the same old saying: you pay now, or you pay later!

So, to make a long story short, if my interpretation is correct, then the fraction of the cost depreciated in the first year is 1/27.5, and the first year's (and each subsequent year's) amount is about $3,091.
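Under that straight-line interpretation, the arithmetic can be checked with a short sketch (assuming, as above, an $85,000 basis and a 27.5-year recovery period):

```python
# Straight-line depreciation over 27.5 years:
# the same fixed amount is written off every year until the basis is used up.
basis = 85_000
period_years = 27.5

annual_fraction = 1 / period_years    # fraction of the basis per full year
annual_amount = basis / period_years  # dollar amount per full year

print(f"fraction per year: {annual_fraction:.4f}")                  # about 0.0364
print(f"amount per year:   ${annual_amount:,.2f}")                  # about $3,090.91
print(f"total written off: ${annual_amount * period_years:,.2f}")   # $85,000.00 (up to float rounding)
```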

To find the fraction by which the value of the house depreciates in the first year, we need to divide the length of the first period (1 year) by the total depreciation period (27 1/2 years).

In this case, the total depreciation period is 27 1/2 years. Since we want to find the fraction for the first year, the length of the first period is 1 year.

To make the calculation easier, we can convert the mixed number to an improper fraction:

27 1/2 = (27 * 2 + 1) / 2
= 55 / 2

To divide by a fraction, we multiply by its reciprocal (or inverse). In this case, we have:

Fraction = Length of First Period / Total Depreciation Period
= 1 / (55/2)
= 2 / 55
≈ 0.0364

Therefore, the value of the house depreciates by a fraction of 2/55 (that is, 1/27.5, or about 3.64%) in the first year.

To find the value of the first year's depreciation, we need to multiply this fraction by the initial value of the house.

Value of First Year's Depreciation = Fraction of Depreciation * Initial Value of the House
= (2/55) * $85,000

To calculate this, we multiply $85,000 by 2/55:

Value of First Year's Depreciation ≈ $3,090.91

Therefore, the value of the first year's depreciation is about $3,091, which agrees with the answer above.
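For an exact version of the same arithmetic, here is a small sketch using Python's fractions module (the 2/55 fraction and the $85,000 value are taken from the working above):

```python
from fractions import Fraction

# 27 1/2 years written as an exact fraction.
period = Fraction(55, 2)

fraction_per_year = 1 / period           # Fraction(2, 55), about 0.0364
first_year_amount = fraction_per_year * 85_000

print(fraction_per_year)                 # 2/55
print(float(first_year_amount))          # 3090.9090..., i.e. about $3,090.91
```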