A computer depreciates (loses its value) at a rate of 50% of its current value every 2 years. If the computer now costs $75 after 6 years, how much did it cost when you bought it new?

Do it the same way as your post above this one.

now --- 75
2 yrs ago --- 150
etc.

To find out how much the computer originally cost, we can use the given depreciation rate and the current price.

Let's break down the problem step by step:

1. Let the original cost of the computer be "X" (in dollars).
2. After 2 years, the computer loses 50% of its value, so its price will be 50% of X, which is 0.5X.
3. After another 2 years (4 years in total), the computer loses 50% of its remaining value. So, the price will be 50% of 0.5X, which is 0.25X.
4. After a final 2 years (6 years in total), the price halves once more to 50% of 0.25X, which is 0.125X. This is the $75 price given in the problem.

Visually, the timeline would look like this:

Year 0: X (original cost)
Year 2: 0.5X
Year 4: 0.25X
Year 6: 0.125X = $75
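
As a quick sanity check (this snippet is my own illustration, not part of the original post), a few lines of Python confirm the fraction of the original value left after three halvings:

```python
# One 50% drop per 2-year period over 6 years = three halvings.
years, period, rate = 6, 2, 0.5
fraction = rate ** (years // period)
print(fraction)  # 0.125 -> the year-6 price is 0.125 * X
```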

Now, we need to find the relationship between the given price at year 6 ($75) and the original cost (X).

From Year 0 to Year 6, the price halves three times, once every 2 years. Therefore:

0.125X = $75

To solve for X, we need to isolate it. We can do this by dividing both sides of the equation by 0.125:

X = $75 / 0.125

Now, let's calculate X:

X = $600

Therefore, the computer originally cost $600. This matches the hint in the question: $75 now, $150 two years ago, and so on back to $600 at year 0.
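
As a final check (a minimal sketch, assuming the simple halve-every-2-years model above), walking $600 forward through the three 2-year periods lands exactly on the given price:

```python
price = 600.0            # proposed original cost
for year in (2, 4, 6):   # one 50% drop per 2-year period
    price *= 0.5
    print(f"Year {year}: ${price:.2f}")
# Year 2: $300.00
# Year 4: $150.00
# Year 6: $75.00  -- matches the price given in the problem
```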