Using the integral test, how do you show whether ##sum 1/sqrt(n+1)## diverges or converges from n=1 to infinity?

The integral test works from the definition of the integral (quick version: an integral accumulates the values of a function over infinitely thin slices of width ##dx## across a specified interval from ##a## to ##b##).

A paraphrased version of the integral test is as follows:

Let ##a_n## be a sequence defined for ##n >= k##, and suppose there is a function ##f(x)## with ##f(n) = a_n## that is continuous, positive, and decreasing on ##[k,oo)##. Then ##sum_(n=k)^(oo)a_n## and ##int_k^(oo)f(x)dx## either both converge or both diverge, so the behavior of the integral determines the behavior of the series.
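
Before doing the integral, it can help to see the test in action numerically. Here is a short Python sketch (purely illustrative, not part of the proof; the helper names `partial_sum` and `integral_to` are my own) that compares partial sums of ##1/sqrt(n+1)## against the value of the corresponding integral from ##1## to ##N##, using the antiderivative ##2sqrt(x+1)## that we will work out below:

```python
# Numerical illustration (not a proof): the partial sums of
# a_n = 1/sqrt(n+1) and the integral of 1/sqrt(x+1) from 1 to N
# grow together without bound.
import math

def partial_sum(N):
    # sum_(n=1)^N 1/sqrt(n+1)
    return sum(1.0 / math.sqrt(n + 1) for n in range(1, N + 1))

def integral_to(b):
    # Antiderivative of 1/sqrt(x+1) is 2*sqrt(x+1); evaluate from 1 to b.
    return 2.0 * math.sqrt(b + 1) - 2.0 * math.sqrt(2.0)

for N in (10, 100, 1000, 10000):
    print(N, round(partial_sum(N), 3), round(integral_to(N), 3))
```

Running this shows both columns growing roughly like ##2sqrt(N)## with no upper bound, which is exactly the shared divergence that the integral test formalizes.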

So here we take ##f(x) = 1/sqrt(x+1)##, which is indeed continuous, positive, and decreasing on ##[1,oo)##, and evaluate the improper integral:

##int_1^(oo)1/sqrt(x+1)dx = lim_(b->oo)[2sqrt(x+1)]_1^b = lim_(b->oo)(2sqrt(b+1) - 2sqrt(2)) = oo##

The limit is infinite, so the integral diverges, and by the integral test the series ##sum_(n=1)^(oo)1/sqrt(n+1)## diverges as well.
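
If you want an independent check of that improper integral, SymPy can evaluate it symbolically (a quick sketch, assuming SymPy is installed; `sympy.integrate` accepts an infinite upper limit directly):

```python
# Symbolic check (requires sympy): the improper integral diverges.
import sympy

x = sympy.symbols('x')
result = sympy.integrate(1 / sympy.sqrt(x + 1), (x, 1, sympy.oo))
print(result)  # oo -- SymPy reports the integral is infinite
```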
