Use the Integral Test to determine whether the series ∑[n=1, ∞] 1/(n^2 + n^3) is convergent. (First make sure to check that the conditions of the Integral Test are satisfied.)

To use the Integral Test to determine the convergence of a series, we need to check two conditions:

1. The terms of the series are positive: here a_n = 1/(n^2 + n^3), which is positive for all n ≥ 1.

2. The function f(x) = 1/(x^2 + x^3) is continuous, positive, and decreasing on [1, ∞): f is continuous and positive there because the denominator x^2 + x^3 > 0 for x ≥ 1. To check that f is decreasing, we take its derivative and show that it is negative on [1, ∞).

Let's calculate the derivative of f(x):

f'(x) = d/dx (1/(x^2 + x^3))
= -1/(x^2 + x^3)^2 * (2x + 3x^2)
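As a quick numerical sanity check of this closed form (a sketch using only the standard library; the names f and f_prime are introduced here for illustration), we can compare it against a central-difference approximation:

```python
# Check the closed-form derivative of f(x) = 1/(x^2 + x^3)
# against a central-difference approximation at a few points.
def f(x):
    return 1.0 / (x**2 + x**3)

def f_prime(x):
    # Closed form derived above: f'(x) = -(2x + 3x^2) / (x^2 + x^3)^2
    return -(2 * x + 3 * x**2) / (x**2 + x**3) ** 2

h = 1e-6
for x in [1.0, 2.0, 5.0]:
    numeric = (f(x + h) - f(x - h)) / (2 * h)
    assert abs(numeric - f_prime(x)) < 1e-5
```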

Now we find the critical points of f(x) by solving f'(x) = 0:

-1/(x^2 + x^3)^2 * (2x + 3x^2) = 0

This requires 2x + 3x^2 = x(2 + 3x) = 0, so the only critical points are x = 0 and x = -2/3, neither of which lies in [1, ∞). For x ≥ 1 the numerator 2x + 3x^2 is positive, so f'(x) < 0 and f is decreasing on [1, ∞). All the conditions of the Integral Test are therefore satisfied.
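The sign argument can also be illustrated numerically (a sketch, not part of the proof; the helper f is an illustrative name):

```python
# Sample f(x) = 1/(x^2 + x^3) at increasing integer points on [1, 10]
# to illustrate that it is positive and strictly decreasing there.
def f(x):
    return 1.0 / (x**2 + x**3)

values = [f(x) for x in range(1, 11)]
assert all(v > 0 for v in values)                      # positive
assert all(a > b for a, b in zip(values, values[1:]))  # strictly decreasing
```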

Now, we can apply the Integral Test:

∫[1, ∞] 1/(x^2 + x^3) dx

To evaluate this integral, we use partial fraction decomposition. The denominator factors as x^2 + x^3 = x^2(1 + x). Because of the repeated factor x^2, the decomposition needs three terms:

1/(x^2(1 + x)) = A/x + B/x^2 + C/(1 + x)

Multiplying both sides by x^2(1 + x), we get:

1 = A·x(1 + x) + B(1 + x) + C·x^2

Expanding and collecting powers of x:

1 = (A + C)x^2 + (A + B)x + B

Comparing coefficients:

A + C = 0 (coefficient of x^2)
A + B = 0 (coefficient of x)
B = 1 (constant term)

From these equations, we find B = 1, A = -1, and C = 1, so:

1/(x^2 + x^3) = -1/x + 1/x^2 + 1/(1 + x)

Thus, the original integral becomes:

∫[1, ∞] (-1/x + 1/x^2 + 1/(1 + x)) dx
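A partial fraction decomposition is easy to spot-check numerically by comparing both sides at sample points; here is a minimal sketch for the identity 1/(x^2 + x^3) = -1/x + 1/x^2 + 1/(1 + x) (lhs and rhs are illustrative names):

```python
# Spot-check the partial fraction decomposition
#   1/(x^2 + x^3) = -1/x + 1/x^2 + 1/(1 + x)
# by comparing both sides at several sample points.
def lhs(x):
    return 1.0 / (x**2 + x**3)

def rhs(x):
    return -1.0 / x + 1.0 / x**2 + 1.0 / (1.0 + x)

for x in [1.0, 2.5, 10.0, 100.0]:
    assert abs(lhs(x) - rhs(x)) < 1e-12
```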

Taking the antiderivative term by term:

∫ (-1/x + 1/x^2 + 1/(1 + x)) dx = -ln|x| - 1/x + ln|1 + x| = ln((1 + x)/x) - 1/x

Evaluating the improper integral as a limit:

∫[1, ∞] 1/(x^2 + x^3) dx = lim[t→∞] [ln((1 + t)/t) - 1/t] - [ln(2/1) - 1/1]
= (ln 1 - 0) - (ln 2 - 1)
= 1 - ln 2 ≈ 0.307

Since the improper integral converges (to 1 - ln 2), the series ∑[n=1, ∞] 1/(n^2 + n^3) is also convergent by the Integral Test.
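As a final sanity check (a sketch assuming only the standard library; a, s_small, and s_large are illustrative names), the partial sums can be compared against the standard Integral Test bounds, ∫[1, ∞] f(x) dx ≤ ∑ a_n ≤ a_1 + ∫[1, ∞] f(x) dx:

```python
import math

def a(n):
    return 1.0 / (n**2 + n**3)

integral = 1.0 - math.log(2.0)  # value of the improper integral, ~0.3069

# Partial sums of the series at two scales
s_small = sum(a(n) for n in range(1, 1001))
s_large = sum(a(n) for n in range(1, 100001))

# Integral Test bounds: integral <= sum <= a(1) + integral
assert integral <= s_small <= a(1) + integral
# Partial sums have essentially stabilized, consistent with convergence
assert abs(s_large - s_small) < 1e-3
```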