
3.3 Sequences of functions

Recall definition

Convergence

A sequence \((a_n)\) converges to a real number \(a\in \mathbb{R}\) if \(\forall \varepsilon >0\), \(\exists N\in\mathbb{N},\forall n\geq N:|a_{n}-a|<\varepsilon\)

Cauchy Sequence

A sequence \((a_n)\) is called a Cauchy sequence if \(\forall\varepsilon>0,\exists N\in\mathbb{N},\forall m,n\geq N:|a_{m}-a_{n}|<\varepsilon\)

Sequences of Functions

Assume that for each \(n \in \mathbb{N}\) we have a function \(f_n: A \to \mathbb{R}\) where \(A\subseteq \R\) is a non-empty subset.

A sequence of functions \((f_n)_{n \in \mathbb{N}}=:(f_n)\) on \(A\) is a collection of functions \(f_n: A \to \mathbb{R}\).

Converges Pointwise

A sequence of functions \((f_n)\) converges pointwise to a function \(f\) on \(A\) if for each \(x\in A\), \(\lim_{n \to \infty} f_n(x) = f(x)\)

That is: \(\forall x\in A,\forall\varepsilon>0,\exists N=N(x,\varepsilon),\forall n\geq N:|f_{n} (x)-f(x)|<\varepsilon\) (here \(x\) is fixed first, so \(N\) may depend on \(x\))

Notation: \(\lim_{n \to \infty} f_n(x) = f(x)\) or \(f_n\xrightarrow[n\to\infty]{}f\) or \(f_n\to f\)
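To see how the threshold \(N=N(x)\) in the pointwise definition can depend on the point, here is a minimal numerical sketch in Python (assuming \(f_n(x)=x^{n}\), essentially Example 1 below with exponent \(n\) instead of \(n+1\); the helper name `smallest_N` is purely illustrative):

```python
import math

# Pointwise convergence of f_n(x) = x**n to 0 on (-1, 1):
# for a fixed x, find the smallest N with |x|**n < eps for all n >= N.
def smallest_N(x, eps=1e-3):
    # |x|**N < eps  <=>  N > log(eps) / log(|x|)   (valid for 0 < |x| < 1)
    if x == 0:
        return 1
    return math.ceil(math.log(eps) / math.log(abs(x)))

for x in (0.5, 0.9, 0.99, 0.999):
    print(f"x = {x}:  N(x) = {smallest_N(x)}")
# N(x) grows without bound as x approaches 1, so no single N works for all x:
# the convergence is pointwise on (-1, 1) but not uniform.
```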

Example

  1. (figure: graphs of \(f_n(x)=x^{n+1}\) on \([-1,1]\) for increasing \(n\))

    Here we get: \(\lim_{n\to\infty}x^{n+1}= \begin{cases} 0\quad \text{if } -1<x<1\\1\quad \text{if } x=1\\\text{diverges} \quad \text{otherwise} \end{cases}\)

  2. (figure: graphs of \(f_n\) together with the window \((g-\varepsilon,g+\varepsilon)\) around the limit \(g\))

    Because we cannot find an \(N\) large enough to ensure that the whole graph of \(f_n\) lies inside the window \((g-\varepsilon,g+\varepsilon)\)

Converges Uniformly

Let \((f_n)\) be a sequence of functions on a set \(A \subseteq \mathbb{R}\)

We say that \((f_n)\) converges uniformly to a function \(f\) on \(A\) if \(\forall \varepsilon > 0\), \(\exists N \in \mathbb{N}\) such that \(\forall n\geq N:|f_{n}(x)-f(x)|<\varepsilon,\forall x\in A\)

Notation: We write \(f_n \xrightarrow []{u} f\)
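A rough numerical way to test this definition is to approximate \(\sup_{x\in A}|f_n(x)-f(x)|\) on a finite grid: the sequence converges uniformly exactly when this supremum tends to \(0\). A minimal sketch, assuming \(f_n(x)=x^{n}\) (the names `sup_error` and the grids are illustrative):

```python
import numpy as np

# Approximate sup_{x in A} |f_n(x) - f(x)| on a finite grid: (f_n) converges
# uniformly on A exactly when this supremum tends to 0 as n grows.
def sup_error(f_n, f, grid):
    return np.max(np.abs(f_n(grid) - f(grid)))

grid_half = np.linspace(0.0, 0.5, 1001)    # A = [0, 1/2]
grid_open = np.linspace(0.0, 0.999, 1001)  # A = [0, 1), sampled up to 0.999

for n in (5, 20, 80):
    e_half = sup_error(lambda x: x**n, lambda x: 0.0 * x, grid_half)
    e_open = sup_error(lambda x: x**n, lambda x: 0.0 * x, grid_open)
    print(f"n = {n:2d}   sup on [0,1/2] = {e_half:.2e}   sup on [0,1) ~ {e_open:.3f}")
# On [0, 1/2] the supremum is (1/2)**n -> 0 (uniform convergence);
# on [0, 1) it stays close to 1, so x**n does not converge uniformly there.
```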

Construct uniformly convergent functions

The key is to bound \(|f_n(x)-f(x)|\) by a quantity that is independent of \(x\) and tends to \(0\). Such a bound can come from the domain (e.g. \(1<x<2\)) or from the form of the function (e.g. \(\frac{1}{1+x^{2}}\leq 1\))

Example

  1. \(f_n(x)=\frac{1}{n(1+x^2)}\)

    This is uniformly convergent since \(\left|\frac{1}{n(1+x^{2})}-0\right|=\frac{1}{n(1+x^{2})}\leq\frac{1}{n}<\varepsilon\) for all \(x\), once \(n\geq N\) with \(N>\frac{1}{\varepsilon}\) (a numerical check of this bound follows the list)

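Here is the numerical check promised in Example 1, approximating \(\sup_{x}\left|\frac{1}{n(1+x^{2})}-0\right|\) on a wide grid (the grid standing in for \(\mathbb{R}\) is of course only an approximation):

```python
import numpy as np

# Numerical check for Example 1: f_n(x) = 1 / (n * (1 + x**2)) -> 0 on R.
# The bound |f_n(x) - 0| <= 1/n is independent of x, which is exactly what
# uniform convergence requires.
grid = np.linspace(-50.0, 50.0, 200001)  # a wide grid standing in for R

for n in (1, 10, 100, 1000):
    sup = np.max(1.0 / (n * (1.0 + grid**2)))
    print(f"n = {n:4d}   sup |f_n - 0| ~ {sup:.2e}   1/n = {1.0 / n:.2e}")
# The approximate supremum equals 1/n (attained at x = 0) and tends to 0.
```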

Remark

Converges uniformly \(\Rightarrow\) Converges pointwise, but the converse is not true

Continuous Limit Theorem

If for \(f_{n},f:A\to\mathbb{R}\), we have \(f_n\) are continuous and \(f_n \xrightarrow{u} f\), then \(f\) is continuous.

Proof

Main idea: use the definition, combining the continuity of each \(f_n\) with the uniform convergence, to bound \(|f(x)-f(a)|\)

Let \(\varepsilon >0\) and fix \(a\in A\); we want to bound \(|f(x)-f(a)|\). By uniform convergence, \(\exists N \in \mathbb{N},\forall n\geq N\) such that \(|f_{n}(x)-f(x)|<\varepsilon,\forall x\in A\)

Applying this (with \(\frac{\varepsilon}{3}\) in place of \(\varepsilon\)) at the point \(a\) and at an arbitrary \(x\in A\), we get

  1. at \(x=a\): \(\forall \varepsilon > 0\), \(\exists N \in \mathbb{N},\forall n\geq N\) such that \(|f_{n}(a)-f(a)|<\frac{\varepsilon}{3}\)

  2. at an arbitrary \(x\in A\): \(\forall \varepsilon > 0\), \(\exists N \in \mathbb{N},\forall n\geq N\) such that \(|f_{n} (x)-f(x)|<\frac{\varepsilon}{3}\)

Then we use the \(+1-1\) trick here (add and subtract \(f_{n}(x)\) and \(f_{n}(a)\)):

\(|f(x)-f(a)|=|f(x)-f_{n}(x)+f_{n}(x)-f_{n}(a)+f_{n}(a)-f(a)|\)

\(\leq|f(x)-f_{n}(x)|+|f_{n}(x)-f_{n}(a)|+|f_{n}(a)-f(a)|<\varepsilon\)

For the first and third terms, we use (2) and (1) respectively

For the middle term, we use the continuity of \(f_n\): \(\forall \varepsilon > 0, \exists \delta > 0\) such that \(|x-a|<\delta:|f_{n}(x)-f_{n}(a)|<\frac{\varepsilon}{3}\).

Thus, choosing such an \(N\) and \(\delta\), we get \(\forall\varepsilon>0,\exists\delta>0\) such that \(|x-a|<\delta:|f(x)-f(a)|<\varepsilon\), so \(f\) is continuous at \(a\)

Example

  1. \(f_n:A\to \R\) and \(f_{n}(x)=\frac{x^{n}}{n},A=\left\lbrack0,1\right\rbrack\)

    Easily, \(\lim_{n\to\infty}f_{n}(x)=0\) (pointwise), and since \(|f_{n}(x)|=\frac{x^{n}}{n}\leq\frac{1}{n}\) on \([0,1]\), also \(f_{n}\xrightarrow{u}f=0\); the limit \(f=0\) is then continuous, consistent with the theorem

  2. This theorem is also a useful #Method tool: Determine uniform convergence#​:Continuous Limit Theorem

    If we cannot settle the question with this theorem, we fall back on the definition

    \(f_{n}:[0,\infty)\to\mathbb{R}\) with \(f_{n}(x)=\frac{x}{1+x^{n}}\) converges pointwise to \(g(x)=\lim_{n\to\infty}f_{n}(x)= \begin{cases} x , \text{if }x\in[0,1) \\ \frac{1}{2} ,\text{if }x=1 \\ 0 , \text{if }x\in(1,\infty) \end{cases}\)

    This is not uniformly convergent: if it were, then \(g\) would have to be continuous, but \(g\) is discontinuous at \(x=1\) (a numerical check follows this list)
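The numerical check for Example 2: since the pointwise limit \(g\) is discontinuous at \(x=1\), \(\sup_{x}|f_n(x)-g(x)|\) should not tend to \(0\). A minimal sketch (the grid and helper names are illustrative):

```python
import numpy as np

# Example 2: f_n(x) = x / (1 + x**n) on [0, inf).  The pointwise limit g is
# discontinuous at x = 1, so by the Continuous Limit Theorem the convergence
# cannot be uniform.  A grid near x = 1 shows sup |f_n - g| does not shrink.
def f_n(x, n):
    return x / (1.0 + x**n)

def g(x):
    return np.where(x < 1, x, np.where(x == 1, 0.5, 0.0))

grid = np.linspace(0.0, 3.0, 30001)

for n in (10, 100, 1000):
    sup = np.max(np.abs(f_n(grid, n) - g(grid)))
    print(f"n = {n:4d}   sup |f_n - g| ~ {sup:.3f}")
# The supremum stays near 1/2 (taken near x = 1) instead of tending to 0.
```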

Cauchy Criterion for Uniform Convergence

Let \((f_n)\) be a sequence of functions defined on \(A \subseteq \mathbb{R}\)

\((f_n)\) converges uniformly on \(A\) if and only if \(\forall \varepsilon > 0\) \(,\exists N\in\mathbb{N},\forall m,n\geq N:\) \(|f_{n}(x)-f_{m}(x)|<\varepsilon\quad\forall x\in A\)

Proof

\(\Rightarrow\)) Since \((f_n)\) is uniformly convergent on \(A\), say \(f_n\xrightarrow{u}g\), we have \(\forall \varepsilon > 0\) \(,\exists N\in\mathbb{N},\forall n\geq N:\) \(|f_{n}(x)-g(x)|<\frac{\varepsilon}{2}\quad\forall x\in A\)

Then for \(m,n\geq N\), \(|f_{n}(x)-f_{m}(x)|=|f_{n}(x)-g(x)+g(x)-f_{m}(x)|\leq|f_{n}(x)-g(x)|+|f_{m}(x)-g(x)|<\frac{\varepsilon}{2}+\frac{\varepsilon}{2}=\varepsilon\)

\(\Leftarrow\)) We know that for each fixed \(x\), \((f_n(x))\) converges if and only if \((f_n(x))\) is a Cauchy sequence in \(\mathbb{R}\)

Let \(x\in A\); by hypothesis \((f_n(x))\) is Cauchy, hence convergent, so \((f_n)\) converges pointwise to some function \(g\). It remains to prove that the convergence is uniform

That is, we need to prove \(f_n\xrightarrow{u}g\) on \(A\). Since the Cauchy bound in the hypothesis is independent of \(x\), we start from it and pass to the limit

Then for \(n\geq N\) and any \(x\in A\): \(|f_{n}(x)-g(x)|=|f_n(x)-\lim_{m\to\infty}f_m(x)|=\lim_{m\to\infty}|f_n(x)-f_m(x)|\leq\lim_{m\to\infty}\varepsilon=\varepsilon\)


We have used this trick three times: the first time to prove the equivalence between the two formulas for the exponential function, the second time to prove the Differentiable Limit Theorem below, and the third time to prove the Cauchy Criterion for Uniform Convergence above

​#Method tool: Eliminating variable with limit#​ If we want a statement in \(n\) alone but we have both \(n\) and \(m\): when the limit as \(m\to\infty\) exists and the bound is independent of \(m\), we can take the limit in \(m\) to get rid of \(m\)

Remark

This is also a #Method tool: Determine uniform convergence#​

And it is also a #Method tool: Cauchy Criterions to deal with unknown limit#​: Cauchy Criterion for Uniform Convergence
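A small numerical illustration of the Cauchy Criterion, approximating \(\sup_{x\in[0,1]}|f_n(x)-f_m(x)|\) for \(f_n(x)=\frac{x^{n}}{n}\) (the example from the Continuous Limit Theorem above), without ever using the limit function (grid and names are illustrative):

```python
import numpy as np

# Cauchy Criterion check: approximate sup_{x in A} |f_n(x) - f_m(x)| on a grid.
# For f_n(x) = x**n / n on A = [0, 1] this supremum is small for large m, n,
# without ever referring to the (possibly unknown) limit function.
grid = np.linspace(0.0, 1.0, 10001)

def f(n, x):
    return x**n / n

for (n, m) in ((5, 10), (50, 100), (500, 1000)):
    sup = np.max(np.abs(f(n, grid) - f(m, grid)))
    print(f"n = {n:4d}, m = {m:4d}   sup |f_n - f_m| ~ {sup:.2e}")
# The supremum is bounded by 1/n + 1/m and tends to 0, so (f_n) is uniformly
# Cauchy and hence uniformly convergent on [0, 1].
```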

Differentiable Limit Theorem

Let \((f_n)\) be a sequence of functions that converges pointwise, \(f_n\to f\), on a closed interval \([a, b]\)

Assume \(f_n\) is differentiable for all \(n \geq 1\). If \(f_{n}'\xrightarrow{u}g\) on \([a,b]\), then \(f\) is differentiable and \(f' = g\).

Remark: Note that we need the domain here to be a closed, bounded interval.

Main idea: reduce everything to bounds of the form \(|\cdots|<\varepsilon\)

Trick: \(+1-1\) and Mean Value Theorem and #Method tool: Eliminating variable with limit#​

Proof

Need to prove \(f^{\prime}\left(x_{0}\right)=\lim_{x\to x_0}\frac{f\left(x\right)-f\left(x_{0}\right)}{x-x_{0}}=g(x_0)\) exists \(\forall x_0\in [a,b]\)

This is equivalent to proving \(\forall \varepsilon>0,\exists\delta>0,\forall x\in\left(x_{0}-\delta,x_{0}+\delta\right),x\neq x_{0}:\) \(\left|\frac{f\left(x\right)-f\left(x_{0}\right)}{x-x_{0}}-g(x_{0})\right|<\varepsilon\) (Main idea)

Since \(f_n\to f\) converges pointwise (1), \(f_n\) is differentiable (2), and \(f_n'\xrightarrow{u} g\) on \([a,b]\) (3):

(1) \(\forall\varepsilon_{1}>0,x\in\left\lbrack a,b\right\rbrack,\exists N_{1},\forall n\geq N_{1}:|f_{n}(x)-f(x)|<\varepsilon_{1}\)

(2) \(\forall\varepsilon_{2}>0,\exists\delta>0,\forall x\in\left(x_{0}-\delta,x_{0}+\delta\right),x\neq x_{0}:\) \(\left|\frac{f_{n}\left(x\right)-f_{n}\left(x_{0}\right)}{x-x_{0}}-f_{n}^{\prime}\left(x_{0}\right)\right|<\varepsilon_{2}\)

(3) \(\forall\varepsilon_{3}>0,\exists N_{3},\forall n\geq N_{3}:|f_{n}^{\prime}(x)-g( x)|<\varepsilon_{3},\forall x\in\left\lbrack a,b\right\rbrack\)

Then consider \(\left|\frac{f\left(x\right)-f\left(x_{0}\right)}{x-x_{0}}-g(x_{0})\right|=\left|\frac{f\left(x\right)-f\left(x_{0}\right)}{x-x_{0}}-\frac{f_{n}\left(x\right)-f_{n}\left(x_{0}\right)}{x-x_{0}}+\frac{f_{n}\left(x\right)-f_{n}\left(x_{0}\right)}{x-x_{0}}-f_{n}^{\prime}\left(x_{0}\right)+f_{n}^{\prime}\left(x_{0}\right)-g(x_{0})\right|\)

\(\leq \left| \frac{f(x) - f(x_{0})}{x - x_{0}}- \frac{f_{n}(x) - f_{n}(x_{0})}{x - x_{0}}\right| + \left| \frac{f_{n}(x) - f_{n}(x_{0})}{x - x_{0}}- f_{n}'(x_{0}) \right| + | f_{n}'(x_{0}) - g(x_{0}) |\)

For the second term, use (2) with \(\varepsilon_2=\frac{\varepsilon}{3}\); for the third term, use (3) at \(x=x_0\) with \(\varepsilon_3=\frac{\varepsilon}{3}\)

For the first term, we apply the Cauchy Criterion for Uniform Convergence to (3), because the first term is a difference quotient of \(f-f_{n}\), which we can control with the Mean Value Theorem applied to \(f_{n}-f_{m}\):

(3)* \(\forall\varepsilon_{4}>0\) \(,\exists N_{4}\in\mathbb{N},\forall m,n\geq N_{4}:\) \(|f_{n}^{\prime}(x)-f_{m}^{\prime}(x)|<\varepsilon_{4}\quad\forall x\in\left\lbrack a,b\right\rbrack\)

Mean Value Theorem (applied to \(f_{n}-f_{m}\)): there exists an \(\alpha\) between \(x_0\) and \(x\) such that \(f_{n}^{\prime}(\alpha)-f_{m}^{\prime}(\alpha)=\frac{(f_{n}(x)-f_{m}(x))-(f_{n}(x_{0})-f_{m}(x_{0}))}{x-x_{0}}\)

Then choose \(N=\max\left(N_{1},N_{3},N_{4}\right)\) and \(\varepsilon_4=\frac{\varepsilon}{3}\); by (3)* and the Mean Value Theorem, for \(m,n\geq N\) we get \(\left|\frac{(f_{n}(x)-f_{m}(x))-(f_{n}(x_{0})-f_{m}(x_{0}))}{x-x_{0}}\right|=|f_{n}^{\prime}(\alpha)-f_{m}^{\prime}(\alpha)|<\frac{\varepsilon}{3}\)

We need to get rid of \(m\), so take the limit \(m\to \infty\); since \(f_m\to f\) converges pointwise, \(\left|\frac{(f_{n}(x)-f(x))-(f_{n}(x_{0})-f(x_{0}))}{x-x_{0}}\right|\leq\frac{\varepsilon}{3}\)

Using this bound for the first term, the three terms together give \(\left|\frac{f(x)-f(x_{0})}{x-x_{0}}-g(x_{0})\right|<\varepsilon\), which completes the proof

Note

We should notice that we cannot combine the two hypotheses \(f_n\) is differentiable and \(f_{n}'\xrightarrow{u}g\)

Because \(f'_n\) is itself a limit and \(f_{n}'\xrightarrow{u}g\) is another limit, combining the two would produce a bound \(|\cdots|<\varepsilon\) that still contains a limit inside it. In particular, \(\left|\frac{f_{n}(x)-f_{n}(x_{0})}{x-x_{0}}-g\left(x\right)\right|<\ldots\) is wrong: \(f'_n\) is the limit of the difference quotient \(\frac{f_{n}(x)-f_{n}(x_{0})}{x-x_{0}}\), not the quotient itself

Summary

Why do we need the hypothesis "\(f_n'\xrightarrow{u} g\) on \([a,b]\)"?

  1. Without it, we cannot conclude that \(f\) is differentiable, even though \(f_n\to f\) converges pointwise

    Counterexample:

    (figures: graphs of a differentiable sequence \(f_n\) and its non-differentiable pointwise limit \(f\); see the numerical sketch after this list)

  2. And this hypothesis ensures that \(f\) is differentiable, since the derivatives \(f'_n\) get closer and closer to \(g\), a single fixed function
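Since the original figures are missing, here is a minimal numerical sketch of one standard counterexample, \(f_n(x)=\sqrt{x^{2}+\frac{1}{n}}\) (a common textbook choice, not necessarily the one in the missing figures): each \(f_n\) is differentiable and \(f_n\to|x|\) pointwise (even uniformly), but the limit is not differentiable at \(0\), and \(f_n'\) does not converge uniformly.

```python
import numpy as np

# A standard counterexample (assumed; the original figures are missing):
# f_n(x) = sqrt(x**2 + 1/n) is differentiable for every n and converges
# (even uniformly) to f(x) = |x|, which is NOT differentiable at 0.
# The derivatives f_n'(x) = x / sqrt(x**2 + 1/n) do not converge uniformly:
# their pointwise limit sign(x) (with value 0 at x = 0) is discontinuous.
grid = np.linspace(-1.0, 1.0, 20001)

for n in (10, 100, 1000):
    f_n = np.sqrt(grid**2 + 1.0 / n)
    d_n = grid / np.sqrt(grid**2 + 1.0 / n)
    print(f"n = {n:4d}   sup |f_n - |x|| ~ {np.max(np.abs(f_n - np.abs(grid))):.3f}"
          f"   sup |f_n' - sign(x)| ~ {np.max(np.abs(d_n - np.sign(grid))):.3f}")
# sup |f_n - |x|| = 1/sqrt(n) -> 0, but sup |f_n' - sign(x)| stays close to 1.
```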

The second version of Differentiable Limit Theorem

Let \((f_n)\) be a sequence of differentiable functions on \([a, b]\) and assume that \((f_n')\) converges uniformly to a function \(g\) on \([a, b]\). If there exists \(x_0 \in [a, b]\) for which \((f_n(x_0))\) is a convergent sequence, then \((f_n)\) converges uniformly on \([a, b]\). Moreover, the limit \(f = \lim_{n \to \infty} f_n\) is differentiable and \(f' = g\).

Proof

First we prove that \((f_n)\) converges uniformly on \([a,b]\); combining this with the theorem above then easily gives the result

Proof

Since there exists \(x_0 \in [a, b]\) such that \((f_n(x_0))\) is a convergent sequence (1), and \(f_n'\xrightarrow{u} g\) on \([a,b]\) (2)

We use the Cauchy criterion because we do not know the limit, the same #Method tool: Cauchy Criterions to deal with unknown limit#​ as before

(1) \(\forall\varepsilon_{1}>0,x_{0}\in\left\lbrack a,b\right\rbrack,\exists N_{1}\in\mathbb{N} ,\forall n,m\geq N_{1}:|f_{n}(x_{0})-f_{m}\left(x_{0}\right)|<\varepsilon_{1}\)

(2) by the Cauchy Criterion for Uniform Convergence: \(\forall\varepsilon_{2}>0,\exists N_{2},\forall n,m\geq N_{2}:|f_{n}^{\prime}(x)-f_{m}^{\prime}(x)|<\varepsilon_{2},\forall x\in\left\lbrack a,b\right\rbrack\)

Then consider \(\left|f_{n}\left(x\right)-f_{m}(x)\right|=\left|f_{n}\left(x\right)-f_{m}\left(x\right)-\left(f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right)+\left(f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right)\right|\)

\(\leq\left|f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right|+\left|f_{n}\left(x\right)-f_{m}\left(x\right)-\left(f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right)\right|\)

And by the Mean Value Theorem applied to \(f_{n}-f_{m}\), \(\exists\alpha\) between \(x_{0}\) and \(x\) such that \(f_{n}^{\prime}(\alpha)-f_{m}^{\prime}(\alpha)=\frac{f_{n}\left(x\right)-f_{m}\left(x\right)-\left(f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right)}{x-x_{0}}\), which by (2) has absolute value \(<\varepsilon_{2}\)

Then \(\left|f_{n}\left(x\right)-f_{m}\left(x\right)-\left(f_{n}\left(x_{0}\right)-f_{m}\left(x_{0}\right)\right)\right|<\varepsilon_{2}\cdot|x-x_{0}|\leq\varepsilon_{2}(b-a)\), so \(|f_{n}(x)-f_{m}(x)|<\varepsilon_{1}+\varepsilon_{2}(b-a)\) for all \(x\in[a,b]\); choosing \(\varepsilon_{1},\varepsilon_{2}\) small enough, the Cauchy Criterion for Uniform Convergence gives that \((f_n)\) converges uniformly on \([a,b]\)
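A minimal numerical sketch of the second version, using the illustrative example \(f_n(x)=x+\frac{\sin(nx)}{n^{2}}\) on \([0,1]\) (chosen here for illustration, not taken from the notes): \(f_n'\xrightarrow{u}g=1\) and \(f_n(0)=0\) converges, so the theorem predicts \(f_n\xrightarrow{u}f\) with \(f(x)=x\) and \(f'=g\).

```python
import numpy as np

# Second version of the Differentiable Limit Theorem, illustrated with the
# (assumed, illustrative) example f_n(x) = x + sin(n*x)/n**2 on [0, 1]:
#   * f_n'(x) = 1 + cos(n*x)/n  converges uniformly to g(x) = 1,
#   * f_n(0) = 0 converges,
# so the theorem gives uniform convergence of f_n, here to f(x) = x with f' = g.
grid = np.linspace(0.0, 1.0, 10001)

for n in (5, 50, 500):
    f_n = grid + np.sin(n * grid) / n**2
    d_n = 1.0 + np.cos(n * grid) / n
    print(f"n = {n:3d}   sup |f_n' - 1| = {np.max(np.abs(d_n - 1.0)):.2e}"
          f"   sup |f_n - x| = {np.max(np.abs(f_n - grid)):.2e}")
# Both suprema tend to 0: uniform convergence of the derivatives (plus
# convergence at one point) forces uniform convergence of (f_n) itself.
```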