3.10 Power Series
Power series
A power series is a series of functions in which the partial sums are polynomials, that is \(\sum_{n=0}^{\infty}a_{n}x^{n}\)
where the partial sums are given by
\(S_0 = a_0,\)
\(S_1 = a_0 + a_1x,\)
\(\vdots\)
\(S_{n}=a_{0}+a_{1}x+\dots+a_{n}x^{n}\) with \(a_k \in \mathbb{R}\).
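For instance (a standard example that reappears later in these notes), taking \(a_{k}=1\) for every \(k\) gives the geometric series \(\sum_{n=0}^{\infty}x^{n}\), whose partial sums are the polynomials \(S_{n}=1+x+\dots+x^{n}\).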
We begin by proving a result which shows that the set of points where a power series converges is always an interval centered at \(x = 0\) (possibly reduced to \(\{0\}\) or equal to all of \(\mathbb{R}\)).
Theorem.
If a power series \(\sum_{n=0}^{\infty}a_{n}x^{n}\) converges at a point \(x_0 \in \mathbb{R}\), then it converges absolutely for every \(x \in \mathbb{R}\) with \(|x| < |x_0|\).
Proof
Main idea: Bound \(a_nx^n\) and use comparison test.
Trick: multiply and divide by \(x_{0}^{n}\).
NTP (need to prove): \(\sum_{n=0}^{\infty}a_{n}x^{n}\) is absolutely convergent for all \(x\in\mathbb{R}\) with \(|x|<|x_{0}|\). Note that \(x_{0}\neq0\) (otherwise there is nothing to prove), so \(|x|<|x_{0}|\) gives \(\frac{|x|}{|x_{0}|}<1\), i.e. \(\left|\frac{x}{x_{0}}\right|<1\).
Consider \(\left|a_{n}x^{n}\right|=\left|\frac{a_{n}x^{n}}{x_{0}^{n}}\cdot x_{0}^{n}\right|=\left|a_{n}x_{0}^{n}\right|\left|\frac{x}{x_{0}}\right|^{n}\); then we need to prove that the sequence \((a_{n}x_{0}^{n})\) is bounded.
Since \(\sum_{n=0}^{\infty}a_{n}x_{0}^{n}\) converges, \(a_{n}x_{0}^{n}\to0\) as \(n\to\infty\).
In particular, taking \(\varepsilon=1\), \(\exists N\in\mathbb{N}\) s.t. \(|a_{n}x_{0}^{n}|<1\) if \(n\geq N\); thus \(M=\max\{|a_{0}|,|a_{1}x_{0}|,\dots,|a_{N-1}x_{0}^{N-1}|,1\}\) satisfies \(|a_{n}x_{0}^{n}|\leq M,\forall n\in\mathbb{N}\).
Then \(\left|a_{n}x^{n}\right|\leq M\left|\frac{x}{x_{0}}\right|^{n}\), and \(\sum_{n=0}^{\infty}M\left|\frac{x}{x_{0}}\right|^{n}\) is a convergent geometric series since \(\left|\frac{x}{x_{0}}\right|<1\). By the comparison test,
the series \(\sum_{n=0}^{\infty}a_{n}x^{n}\) is absolutely convergent.
Example
\(\sum^\infty_{n=1}\frac{x^n}{n}\) converges at \(x=-1\) (alternating harmonic series), so by the theorem it converges absolutely for \(|x|<1\).
Radius of convergence
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series. The radius of convergence is a number \(R \geq 0\), possibly \(R=\infty\), such that:
- The series converges for all \(x \in \mathbb{R}\) with \(|x| < R\).
- The series diverges for all \(x \in \mathbb{R}\) with \(|x| > R\).
There are two steps to find the interval of convergence: 1. find \(R\); 2. determine whether the endpoints \(\pm R\) are included (closed or open at each end).
Theorem.
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series with \(a_n \neq 0\) for all \(n \geq 0\).
Suppose that the limit \(\lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| = \ell\) exists. If \(\ell \neq 0\), then \(R = \frac{1}{\ell} = \lim_{n \to \infty} \left| \frac{a_n}{a_{n+1}} \right|.\)
- If \(\ell = 0\), then the series converges for all \(x \in \mathbb{R}\). In this case, we say that \(R = \infty\).
- If \(\ell = \infty\), then the series converges only for \(x = 0\).
Proof
Fix \(x\in\mathbb{R}\). Then we consider \(\sum^\infty_{n=0}a_nx^n\) as a numerical series with terms \(b_n=a_nx^n\).
By the d'Alembert criterion (ratio test), the series converges if \(\lim_{n\to\infty}\frac{\left|b_{n+1}\right|}{\left|b_{n}\right|}<1\) and diverges if this limit is greater than \(1\).
But \(\lim_{n\to\infty}\left|\frac{a_{n+1}x^{n+1}}{a_{n}x^{n}}\right|=\left|x\right|\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_{n}}\right|=|x|\ell\), since by hypothesis that limit exists.
Thus the series converges when \(|x|\ell<1\) and diverges when \(|x|\ell>1\). If \(\ell\neq0\), this says precisely that \(\left|x\right|<\frac{1}{\ell}\), so by definition \(R=\frac{1}{\ell}\).
Hence \(R=\lim_{n\to\infty}\left|\frac{a_{n}}{a_{n+1}}\right|\). If \(\ell=0\), then \(|x|\ell=0<1\) for every \(x\), so \(R=\infty\). If \(\ell=\infty\), then \(|x|\ell<1\) only for \(x=0\), so \(R=0\) and the series converges only at \(x=0\).
Example
- \(\sum_{n=1}^{\infty}\frac{x^{n}}{n},\ a_{n}=\frac{1}{n}\)
\(R=\lim\frac{\frac{1}{n}}{\frac{1}{n+1}}=\lim\frac{n+1}{n}=1\), thus converges in \((-1,1)\)
If \(x=-1\), converges. If \(x=1\), diverges.
Thus converges in \([-1,1)\)
- \(\sum_{n=1}^{\infty}\frac{x^{n}}{n!}\)
D'Alembert criterion: \(\lim_{n\to\infty}\left|\frac{\frac{x^{n+1}}{(n+1)!}}{\frac{x^{n}}{n!}}\right|=\lim_{n\to\infty}\left|x\cdot\frac{1}{n+1}\right|=|x|\cdot\lim_{n\to\infty}\frac{1}{n+1}=0<1\) for every \(x\).
Then \(\sum_{n=1}^\infty \frac{x^n}{n!}\) converges for all \(x\in\mathbb{R}\), that is, \(R = \infty\).
- \(\sum_{n=1}^{\infty}x^{n}n!\)
\(R=\lim\frac{n!}{\left(n+1\right)!}=\lim\frac{1}{1+n}=0\), thus converges in \(x=0\)
Theorem
Let \(\sum_{n=0}^{\infty}a_{n} x^{n}\) be a power series and assume that the limit \(\lim_{n \to \infty}\sqrt[n]{|a_{n}|}= \ell\) exists.
Then \(R =\begin{cases} \frac{1}{\ell}, & \text{if } \ell \neq 0, \\ \infty, & \text{if } \ell = 0. \end{cases}\)
Moreover, if \(\ell = \infty\), then the series converges only for \(x = 0\).
Proof
By the Cauchy root test, the series converges when \(\lim_{n\to\infty}\sqrt[n]{|a_{n}x^{n}|}=|x|\lim_{n\to\infty}\sqrt[n]{|a_{n}|}=|x|\ell<1\).
If \(\ell=0\), then \(|x|\ell=0<1\) for every \(x\in\mathbb{R}\), so \(R=\infty\).
If \(\ell\neq0\), then the condition is \(|x|<\frac{1}{\ell}\), so \(R=\frac{1}{\ell}\).
If \(\ell=\infty\), then \(|x|\ell<1\) forces \(x=0\), so the series converges only at \(x=0\).
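A short worked example of the root test (added here; it is not in the notes above): for \(\sum_{n=0}^{\infty}\frac{x^{n}}{2^{n}}\) we have \(\ell=\lim_{n\to\infty}\sqrt[n]{\frac{1}{2^{n}}}=\frac{1}{2}\), so \(R=2\). At \(x=\pm2\) the terms have absolute value \(1\) and do not tend to \(0\), so the series converges exactly on \((-2,2)\).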
Now that we know a way to compute the (open) set where a power series converges, let us study the properties of the function that is its limit.
Theorem
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series. Assume that it converges absolutely at \(x = x_0\).
Then it converges uniformly on the closed interval \([-c, c]\), where \(c = |x_0|\).
Proof
Main idea: since \(\sum_{n=0}^{\infty}a_{n}x_{0}^{n}\) converges absolutely, \(\sum_{n=0}^{\infty}|a_{n}||x_{0}|^{n}\) can serve as the bounding series; then apply the Weierstrass M-test.
We know that \(\sum_{n=0}^{\infty}a_{n}x_{0}^{n}\) converges absolutely, which means that \(\sum_{n=0}^{\infty}\left|a_{n}\right|\left|x_{0}\right|^{n}\) converges.
Let \(M_{n}=|a_{n}||x_{0}|^{n}\geq0,\forall n\); then \(\sum_{n=0}^{\infty}M_{n}\) is convergent.
Moreover, for every \(x\in[-c,c]\) we have \(|x|\leq|x_{0}|\), hence \(|a_{n}x^{n}|=|a_{n}||x|^{n}\leq|a_{n}||x_{0}|^{n}=M_{n}\).
By Weierstrass M-test the series converges uniformly on \([-c,c]\) with \(c=|x_0|\)
Remark
By the theorem above, we know that if a series converges absolutely at \(x = x_0\), then the limit function is continuous in \([-c, c]\), with \(c = |x_0|\), by the Continuous Limit Theorem.
Example
The series \(\sum_{n=0}^{\infty} \frac{x^n}{n!}\) defines a continuous function for any real number \(x\).
This function is called the exponential and it is denoted by \(\exp(x) = e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}.\)
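A brief justification (a sketch filling in the step this example relies on): for any \(c>0\), the ratio-test example above shows that \(\sum_{n=0}^{\infty}\frac{c^{n}}{n!}\) converges absolutely, so by the previous theorem the series converges uniformly on \([-c,c]\) and its limit is continuous there; since \(c>0\) is arbitrary, \(\exp\) is continuous on all of \(\mathbb{R}\).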
Theorem (Abel's theorem)
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series that converges at a point \(x = R > 0\).
Then the series converges uniformly on the interval \([0, R]\).
A similar result holds if the series converges at \(x = -R < 0\).
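A standard illustration (not worked out in these notes): \(\sum_{n=1}^{\infty}(-1)^{n+1}\frac{x^{n}}{n}\) converges at \(x=1\) (alternating harmonic series), so by Abel's theorem it converges uniformly on \([0,1]\); in particular its sum, which equals \(\ln(1+x)\) for \(|x|<1\), extends continuously to \(x=1\), giving \(\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}=\ln 2\).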
Remark
We are establishing theorems that ensure the uniform convergence of power series on closed intervals.
Closed intervals are examples of compact sets in \(\mathbb{R}\).
- A finite union of closed bounded intervals is compact
- Any closed bounded interval \([a,b]\) is compact
- Any single point is compact
Definition
A compact set of \(\mathbb{R}\) is a subset that is closed and bounded.
Equivalently, a set \(K\) is compact if every sequence in \(K\) has a subsequence that converges to a limit that is also in \(K\).
Remark
As any compact set \(K \subseteq \mathbb{R}\) is bounded, there exist \(x_0, x_1 \in \mathbb{R}\) such that \(x_0 \leq x \leq x_1, \quad \forall x \in K.\)
Moreover, as any sequence in \(K\) has a subsequence converging to a limit that lies in \(K\), these bounds can be taken inside \(K\): \(K \subseteq [x_0, x_1]\) with \(x_0 = \min K \in K\) and \(x_1 = \max K \in K\).
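For example (an illustration added for concreteness): \(K=[0,1]\cup\{2\}\) is closed and bounded, hence compact, with \(\min K=0\) and \(\max K=2\), so \(K\subseteq[0,2]\) even though \(K\) itself is not an interval.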
Corollary
If a power series converges at \(x_0\) and \(x_1\) with \(x_{0} < x_{1}\), then it converges uniformly on \([x_0, x_1]\)
Proof
We have three cases
- \(0\leq x_0<x_1\)
Since the series converges at \(x_1>0\), by Abel's theorem it converges uniformly on \([0,x_1]\); in particular, it converges uniformly on \([x_0,x_1]\).
- \(x_{0}<0<x_{1}\)
Since the series converges at \(x_{0}<0\) and at \(x_{1}>0\), by Abel's theorem (and its analogue for negative endpoints) it converges uniformly on \([x_0,0]\) and on \([0,x_1]\).
Then it converges uniformly on \([x_0,x_1]=[x_{0},0]\cup[0,x_{1}]\).
- \(x_{0}<x_{1}\leq0\)
Since the series converges at \(x_{0}<0\), by the negative-endpoint version of Abel's theorem it converges uniformly on \([x_0,0]\); in particular, it converges uniformly on \([x_{0},x_{1}]\).
This completes the proof.
Corollary 1
If \(\sum_{n=0}^{\infty} a_n x^n\) converges pointwise on \(A \subseteq \mathbb{R}\), then it converges uniformly on any compact subset \(K \subseteq A\).
Proof
Since \(K\subseteq A\) is compact, it is closed and bounded and contains its minimum \(x_{0}=\min K\) and its maximum \(x_{1}=\max K\), so \(K\subseteq\left\lbrack x_{0},x_{1}\right\rbrack\).
As \(x_0,x_1\in K\subseteq A\) and the series converges pointwise on \(A\), it converges at \(x_0\) and at \(x_1\); hence, by the previous corollary, it converges uniformly on \([x_{0}, x_{1}]\supseteq K\), and in particular uniformly on \(K\).
Corollary 2
A power series is continuous at every point at which it converges.
Proof
Suppose \(\sum_{n=0}^{\infty}a_{n}x^{n}\) converges at \(x=x_0\). If \(x_0>0\), then by Abel's theorem the series converges uniformly on \([0,x_0]\).
Hence, by the continuous limit theorem, the limit function is continuous on \([0,x_0]\), in particular at \(x_0\). The case \(x_0<0\) is analogous, using the version of Abel's theorem for negative endpoints.
Example
\(\exp(x)=\sum^\infty_{n=0}\frac{x^n}{n!}\) is continuous on \(\mathbb{R}\), and \(\sum_{k=0}^{\infty}x^{k}=\frac{1}{1-x}\) is continuous on \((-1,1)\).
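A quick check of the geometric sum formula used here (a standard computation, added for completeness): for \(|x|<1\) the partial sums are \(S_{n}=1+x+\dots+x^{n}=\frac{1-x^{n+1}}{1-x}\), and \(x^{n+1}\to0\) as \(n\to\infty\), so \(\sum_{k=0}^{\infty}x^{k}=\lim_{n\to\infty}S_{n}=\frac{1}{1-x}\).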
Now that we have studied continuity of limit functions of power series, let us consider differentiability.
In order to use the theorem of differentiation term by term, we need to show that the differentiated series \(\sum_{n=1}^{\infty} n a_n x^{n-1}\) converges uniformly.
Theorem
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series and assume that \(\sum_{n=0}^{\infty} a_n x_0^n\) converges for some \(x_0\). Then the differentiated series \(\sum_{n=1}^{\infty}na_{n}x^{n-1}\) converges uniformly for all \(x \in [-a, a]\) with \(0 < a < |x_0|\).
Proof
Main idea: Theorem (Weierstrass M-test)
We know that \(\sum_{n=0}^{\infty}a_{n}x_{0}^{n}\) converges, so \(a_{n}x_{0}^{n}\to0\) and the sequence \((a_{n}x_{0}^{n})\) is bounded: \(\exists M>0:|a_{n}x_{0}^{n}|\leq M,\forall n\).
Now we need to bound \(|na_{n}x^{n-1}|\) by something independent of \(x\), so we use the bound on \(x\).
Since \(x\in[-a,a]\), we have \(|x|\leq a<|x_0|\). Note that the crude bound \(|na_{n}x^{n-1}|\leq|na_{n}x_{0}^{n-1}|\leq\frac{nM}{|x_{0}|}\) is not good enough, because \(\sum_{n}\frac{nM}{|x_{0}|}\) diverges; we must keep track of the factor \(a<|x_{0}|\).
So instead: \(|na_{n}x^{n-1}|\leq n|a_{n}|a^{n-1}=\left|a_{n}x_{0}^{n}\right|\cdot\frac{na^{n-1}}{\left|x_{0}\right|^{n}}=\left|a_{n}x_{0}^{n}\right|\cdot\left|\frac{a}{x_{0}}\right|^{n}\cdot\frac{n}{a}\leq M\cdot\left|\frac{a}{x_{0}}\right|^{n}\frac{n}{a}=:M_{n}\), so it remains to prove that \(\sum_{n}M_{n}\) converges.
By the d'Alembert criterion, \(\lim_{n\to\infty}\frac{M_{n+1}}{M_{n}}=\lim_{n\to\infty}\frac{M\left|\frac{a}{x_{0}}\right|^{n+1}\frac{n+1}{a}}{M\left|\frac{a}{x_{0}}\right|^{n}\frac{n}{a}}=\lim_{n\to\infty}\left|\frac{a}{x_{0}}\right|\frac{n+1}{n}=\left|\frac{a}{x_{0}}\right|<1\).
Thus \(\sum_{n}M_{n}\) is convergent, and by the Weierstrass M-test, \(\sum_{n=1}^{\infty}na_{n}x^{n-1}\) converges uniformly on \([-a,a]\).
Corollary
Let \(\sum_{n=0}^{\infty} a_n x^n\) be a power series that converges to a function \(f\) on an interval \(A\), and let \(R\) denote its radius of convergence. Then the function \(f\) is continuous on \(A\) and it is differentiable on any open interval \((-R, R) \subseteq A\).
The derivative is \(f'(x) = \sum_{n=1}^{\infty}n a_{n} x^{n-1}.\) Moreover, \(f\) is infinitely differentiable on \((-R, R)\).
Proof
By Corollary 2 above (a power series is continuous at every point at which it converges), we know \(f\) is continuous on \(A\).
By the theorem above, for any \(0<a<R\) the differentiated series \(\sum_{n=1}^{\infty}na_{n}x^{n-1}\) converges uniformly on \([-a,a]\); call its limit \(g\).
By the term-by-term differentiability theorem, \(f^{\prime}=g\) on \([-a,a]\), that is, \(f^{\prime}(x)=\sum_{n=1}^{\infty}na_{n}x^{n-1}\); since \(0<a<R\) was arbitrary, this holds on all of \((-R,R)\). The differentiated series is again a power series with the same radius of convergence, so iterating the argument shows that \(f\) is infinitely differentiable on \((-R,R)\).
Example
\(\sum_{n=0}^{\infty}x^{n}\) converges on \((-1,1)\), and \(f(x)=\sum_{n=0}^{\infty}x^{n}=\frac{1}{1-x}\) is continuous on \((-1,1)\); moreover \(f\) is differentiable on \((-1,1)\) with \(f^{\prime}(x)=\sum_{n=1}^{\infty}nx^{n-1}\).
Note that we cannot include \(-1\) in the interval of convergence: at \(x=-1\) the series \(\sum_{n=0}^{\infty}(-1)^{n}\) diverges.
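As a short worked follow-up (a standard computation, not carried out in the notes): differentiating the geometric sum gives \(f^{\prime}(x)=\sum_{n=1}^{\infty}nx^{n-1}=\frac{1}{(1-x)^{2}}\) for \(|x|<1\); for instance, at \(x=\frac{1}{2}\) this yields \(\sum_{n=1}^{\infty}\frac{n}{2^{n-1}}=4\). Similarly, differentiating \(\exp(x)=\sum_{n=0}^{\infty}\frac{x^{n}}{n!}\) term by term gives \(\sum_{n=1}^{\infty}\frac{x^{n-1}}{(n-1)!}=\exp(x)\), so \(\exp^{\prime}=\exp\) on \(\mathbb{R}\).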