Cauchy condensation test

In mathematics, the Cauchy condensation test, named after Augustin-Louis Cauchy, is a standard convergence test for infinite series. For a non-increasing sequence $f(n)$ of non-negative real numbers, the series $\sum_{n=1}^{\infty} f(n)$ converges if and only if the "condensed" series $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$ converges. Moreover, if they converge, the sum of the condensed series is no more than twice as large as the sum of the original.

Estimate

The Cauchy condensation test follows from the stronger estimate

$$\sum_{n=1}^{\infty} f(n) \;\leq\; \sum_{n=0}^{\infty} 2^{n} f(2^{n}) \;\leq\; 2 \sum_{n=1}^{\infty} f(n),$$

which should be understood as an inequality of extended real numbers. The essential thrust of a proof follows, patterned after Oresme's proof of the divergence of the harmonic series.

To see the first inequality, the terms of the original series are rebracketed into runs whose lengths are powers of two, and then each run is bounded above by replacing each term by the largest term in that run. That term is always the first one, since by assumption the terms are non-increasing.
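One way to write out this rebracketing is

$$\begin{aligned}
\sum_{n=1}^{\infty} f(n) &= f(1) + \bigl(f(2) + f(3)\bigr) + \bigl(f(4) + f(5) + f(6) + f(7)\bigr) + \cdots \\
&\leq f(1) + \bigl(f(2) + f(2)\bigr) + \bigl(f(4) + f(4) + f(4) + f(4)\bigr) + \cdots \\
&= f(1) + 2 f(2) + 4 f(4) + \cdots = \sum_{n=0}^{\infty} 2^{n} f(2^{n}).
\end{aligned}$$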

To see the second inequality, the two series are again rebracketed into runs of power of two length, but "offset" as shown below, so that the run of $2 \sum_{n=1}^{\infty} f(n)$ which ends with $2 f(2^{n})$ lines up with the run of $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$ which consists of the $2^{n}$ copies of $f(2^{n})$, so that the former stays always "ahead" of the latter.
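One way to write out this comparison run by run is

$$\begin{aligned}
\sum_{n=0}^{\infty} 2^{n} f(2^{n}) &= f(1) + \bigl(f(2) + f(2)\bigr) + \bigl(f(4) + f(4) + f(4) + f(4)\bigr) + \cdots \\
&\leq \bigl(f(1) + f(1)\bigr) + \bigl(f(2) + f(2)\bigr) + \bigl(f(3) + f(3) + f(4) + f(4)\bigr) + \cdots \\
&= 2 \sum_{n=1}^{\infty} f(n),
\end{aligned}$$

where each bracketed run on the first line is bounded above by the corresponding run on the second line, because $f(2^{n}) \leq f(k)$ whenever $k \leq 2^{n}$.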

Figure: visualization of the above argument. Partial sums of the series $\sum_{n=1}^{\infty} f(n)$, $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$, and $2 \sum_{n=1}^{\infty} f(n)$ are shown overlaid from left to right.
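The estimate can also be checked numerically. The following is a minimal Python sketch, not part of the original article; the convergent example $f(n) = 1/n^{2}$ and the cutoff are arbitrary choices.

    # A rough numerical illustration of the estimate above; this sketch is not
    # part of the original article, and the example f(n) = 1/n^2 and the cutoff
    # M are arbitrary choices.
    def f(n):
        return 1.0 / (n * n)

    M = 15                   # condensed series summed for n = 0, ..., M
    N = 2 ** M               # original series summed for n = 1, ..., 2^M

    original = sum(f(n) for n in range(1, N + 1))
    condensed = sum(2 ** m * f(2 ** m) for m in range(M + 1))

    print(f"partial sum of f(n):          {original:.6f}")     # ~ 1.6449 (pi^2/6)
    print(f"partial condensed sum:        {condensed:.6f}")     # ~ 2.0
    print(f"twice the partial sum of f:   {2 * original:.6f}")  # ~ 3.2898
    # The three values respect  sum f(n) <= sum 2^n f(2^n) <= 2 sum f(n).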

Integral comparison

The "condensation" transformation recalls the integral variable substitution yielding .

Pursuing this idea, the integral test for convergence gives us, in the case of monotone $f$, that $\sum_{n=1}^{\infty} f(n)$ converges if and only if $\int_{1}^{\infty} f(x)\,\mathrm{d}x$ converges. The substitution $x \to 2^{x}$ turns this into the integral $\log 2 \int_{0}^{\infty} 2^{x} f(2^{x})\,\mathrm{d}x$, which, up to the constant factor $\log 2$, is exactly the integral obtained by applying the integral test to the condensed series $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$. Therefore, $\sum_{n=1}^{\infty} f(n)$ converges if and only if $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$ converges.
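In detail, writing $x = 2^{t}$, so that $\mathrm{d}x = 2^{t} \log 2\,\mathrm{d}t$ and the limits $x = 1$, $x \to \infty$ become $t = 0$, $t \to \infty$, the substitution step reads

$$\int_{1}^{\infty} f(x)\,\mathrm{d}x = \log 2 \int_{0}^{\infty} 2^{t} f(2^{t})\,\mathrm{d}t.$$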

Examples

The test can be useful for series where n appears in a denominator in f. For the most basic example of this sort, the harmonic series $\sum_{n=1}^{\infty} 1/n$ is transformed into the series $\sum_{n=0}^{\infty} 2^{n} \cdot \frac{1}{2^{n}} = \sum_{n=0}^{\infty} 1$, which clearly diverges.
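In the same spirit (a standard further example, not spelled out above), for a p-series $f(n) = n^{-a}$ with $a > 0$ the condensed series is geometric,

$$\sum_{n=0}^{\infty} 2^{n} \bigl(2^{n}\bigr)^{-a} = \sum_{n=0}^{\infty} \bigl(2^{1-a}\bigr)^{n},$$

which converges exactly when $2^{1-a} < 1$, that is, when $a > 1$.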

As a more complex example, take

$$f(n) = \frac{1}{n^{a} \,(\log n)^{b} \,(\log \log n)^{c}}.$$

Here the series definitely converges for a > 1, and diverges for a < 1. When a = 1, the condensation transformation gives, up to bounded factors, the series

$$\sum_{n} \frac{1}{n^{b} \,(\log n)^{c}}.$$

The logarithms "shift to the left". So when a = 1, we have convergence for b > 1 and divergence for b < 1. When b = 1 the value of c enters in the same way: another application of the test gives convergence for c > 1 and divergence for c ≤ 1.
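To see the shift explicitly (a sketch of the computation behind the statement above), note that with $a = 1$,

$$2^{n} f(2^{n}) = \frac{2^{n}}{2^{n} \,(\log 2^{n})^{b} \,(\log \log 2^{n})^{c}} = \frac{1}{(n \log 2)^{b} \,\bigl(\log(n \log 2)\bigr)^{c}},$$

which, up to constant factors, behaves like $1 / \bigl(n^{b} (\log n)^{c}\bigr)$: the power of $n$ has been used up, and each remaining logarithm has moved one place to the left.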

This result readily generalizes: the condensation test, applied repeatedly, can be used to show that for $k = 1, 2, 3, \ldots$, the generalized Bertrand series

$$\sum_{n \geq N} \frac{1}{n \cdot \log n \cdot \log \log n \cdots \log^{\circ (k-1)}(n) \cdot \bigl(\log^{\circ k}(n)\bigr)^{\alpha}}$$

converges for $\alpha > 1$ and diverges for $\alpha \leq 1$.[1] Here $\log^{\circ m}$ denotes the $m$th iterate of the logarithm, so that

$$\log^{\circ m}(n) = \begin{cases} \log n, & m = 1, \\ \log\bigl(\log^{\circ (m-1)}(n)\bigr), & m \geq 2. \end{cases}$$

The lower limit of the sum, $N$, was chosen so that all terms of the series are positive. Notably, these series provide examples of infinite sums that converge or diverge arbitrarily slowly. For instance, in the case of $k = 2$ and $\alpha = 1$, the partial sum exceeds 10 only after $10^{10^{100}}$ (a googolplex) terms; yet the series diverges nevertheless.
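The reason repeated condensation works (a brief elaboration, using $\log 2^{n} = n \log 2$) is that a single application lowers $k$ by one: for the general term $f(n)$ of the series with parameters $(k, \alpha)$,

$$2^{n} f(2^{n}) = \frac{1}{(n \log 2) \cdot \log(n \log 2) \cdots \log^{\circ (k-2)}(n \log 2) \cdot \bigl(\log^{\circ (k-1)}(n \log 2)\bigr)^{\alpha}},$$

which, up to constant factors and finitely many initial terms, is the general term of the series with parameters $(k-1, \alpha)$; after $k$ steps one is left with essentially $\sum n^{-\alpha}$.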

Schlömilch's generalization

A generalization of the condensation test was given by Oskar Schlömilch.[2] Let u(n) be a strictly increasing sequence of positive integers such that the ratio of successive differences is bounded: there is a positive real number N, for which

$$\frac{\Delta u(n+1)}{\Delta u(n)} = \frac{u(n+2) - u(n+1)}{u(n+1) - u(n)} < N \quad \text{for all } n.$$

Then, provided that $f(n)$ meets the same preconditions as in Cauchy's condensation test (non-negative and non-increasing), the convergence of the series $\sum_{n=1}^{\infty} f(n)$ is equivalent to the convergence of

$$\sum_{n=0}^{\infty} \Delta u(n)\, f(u(n)) = \sum_{n=0}^{\infty} \bigl(u(n+1) - u(n)\bigr)\, f(u(n)).$$

Taking $u(n) = 2^{n}$, so that $\Delta u(n) = u(n+1) - u(n) = 2^{n}$, the Cauchy condensation test emerges as a special case.
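As a further illustration (an instance not given in the text above), one may take $u(n) = n^{2}$ for $n \geq 1$: then $\Delta u(n) = 2n + 1$, the ratio of successive differences $(2n+3)/(2n+1)$ is bounded, and Schlömilch's test says that for non-negative, non-increasing $f$ the series $\sum_{n=1}^{\infty} f(n)$ converges if and only if

$$\sum_{n=1}^{\infty} (2n + 1)\, f(n^{2})$$

converges, which in turn is equivalent to the convergence of $\sum_{n=1}^{\infty} n\, f(n^{2})$.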

References

  1. ^ Rudin, Walter (1976). Principles of Mathematical Analysis. New York: McGraw-Hill. pp. 62–63. ISBN 0-07-054235-X.
  2. ^ Liflyand, Elijah; Tikhonov, Sergey; Zeltse, Maria (2012). Extending tests for convergence of number series. p. 7/28. Via Brandeis University.
  • Bonar, Khoury (2006). Real Infinite Series. Mathematical Association of America. ISBN 0-88385-745-6.