
math


I read from my textbook:

If S is the infinite series 1 + x + x^2 + x^3 + ...
Then Sx = x + x^2 + x^3 + x^4 + ... = S - 1
So, S = 1/(1-x)

I follow that logic, but it still doesn't make sense to me.
The way I see it, if you plug in any real number greater than 1 for x, S will be infinite, which does not equal 1/(1-x) at all...

For example, if I plug in the constant 10 for x,
the infinite series "1 + x + x^2 + x^3 + ..." will be infinite,
yet 1/(1-x) will equal -1/9.
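
A quick Python sketch of that comparison (stopping after six terms is arbitrary):

    # Partial sums of 1 + x + x^2 + ... for x = 10, compared with 1/(1 - x)
    x = 10
    partial_sum = 0
    for n in range(6):
        partial_sum += x ** n                  # add the term x^n
        print(f"S_{n + 1} = {partial_sum}")
    print("1/(1 - x) =", 1 / (1 - x))          # prints -0.111..., i.e. -1/9

The partial sums come out as 1, 11, 111, 1111, ... and keep growing, while the formula gives -1/9.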

Can someone explain this?

  • math -

    So if x >= 1, the series does not converge; it diverges, and its partial sums grow without bound. But if |x| < 1, it converges.

    Now, if |x| < 1, does S = 1/(1-x)?

  • math -

    If S is the infinite series 1 + x + x^2 + x^3 + ...
    Then Sx = x + x^2 + x^3 + x^4 + ... = S - 1
    well, say you terminate at 4 terms as you did:
    xS = x + x^2 + x^3 + x^4
    S = 1 + x + x^2 + x^3
    then
    xS - S = x^4 - 1
    and
    S(x - 1) = x^4 - 1
    S = (x^4 - 1)/(x - 1)
    which is what you had, except for that extra x^4 term.
    No matter how many terms you take, xS will always carry that x^n term at the end, so the difference between xS and S is not just the -1.
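
    A quick numeric check of that finite-sum identity (a Python sketch using exact fractions; the sample values of x and n are arbitrary):

        from fractions import Fraction

        def partial_sum(x, n):
            # 1 + x + x^2 + ... + x^(n-1), i.e. the first n terms
            return sum(x ** k for k in range(n))

        # S*(x - 1) should equal x^n - 1 exactly, for any x != 1 and any n
        for x in (Fraction(10), Fraction(1, 2), Fraction(-3, 4)):
            for n in (4, 10, 25):
                assert partial_sum(x, n) * (x - 1) == x ** n - 1
        print("S*(x - 1) = x^n - 1 holds in every case checked")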

  • math -

    By the way, if |x| < 1, then that x^n at the end goes to zero as n grows, so in the limit it all works out.
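
    A short Python sketch of that limit (x = 0.5 is just an example; any x with |x| < 1 behaves the same way):

        x = 0.5
        target = 1 / (1 - x)                # the textbook formula, here 2.0
        partial_sum = 0.0
        for n in range(1, 31):
            partial_sum += x ** (n - 1)     # add the term x^(n-1)
            gap = x ** n / (1 - x)          # leftover: 1/(1 - x) - S_n = x^n/(1 - x)
            if n % 10 == 0:
                print(f"n = {n:2d}   S_n = {partial_sum:.10f}   gap = {gap:.3e}")
        print("1/(1 - x) =", target)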

  • math -

    Thanks, guys. I wrote a simple computer program that verifies that S = 1/(1-x) holds when 0 < x < 1 (where the series converges) but not when x > 1 (where it diverges). That wasn't clear in the textbook. Thanks for the help.
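
    A minimal Python sketch of that kind of check (the x values and the 200-term cutoff here are arbitrary):

        def partial_sum(x, n_terms=200):
            # Approximate S = 1 + x + x^2 + ... with a finite number of terms
            return sum(x ** k for k in range(n_terms))

        for x in (0.3, 0.5, 0.9, 2.0, 10.0):
            approx = partial_sum(x)
            formula = 1 / (1 - x)
            print(f"x = {x:5}   partial sum = {approx:.6g}   1/(1 - x) = {formula:.6g}")

    For 0 < x < 1 the two columns agree; for x > 1 the partial sum keeps growing while 1/(1 - x) stays at a fixed negative value.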
