Analysis is the art of taking limits.


## Sequences

### Sequences of real numbers

This section deals with limits of sequences of real numbers, $\lim_{n \to \infty} a_n$

**Definition (Convergence)** A sequence $(a_n)^{\infty}_{n=N}$ is convergent to $L$ iff the sequence is eventually $\epsilon$-close to $L$ for every $\epsilon > 0$

$$ (\forall \epsilon > 0) (\exists N) (\forall n \geq N) ( |L - a_n| < \epsilon ) $$
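As a quick numerical companion to this definition (a sketch, not from the text; the helper `first_N` and the sample sequence $a_n = n/(n+1) \to 1$ are illustrative choices):

```python
# Numerical illustration of the epsilon-N definition of convergence.
# For a_n = n/(n+1) -> L = 1, find an N such that |L - a_n| < eps for
# all sampled n >= N (the helper name first_N is made up for this sketch).

def first_N(a, L, eps, limit=10**4):
    """Return one index past the last sampled violation of eps-closeness."""
    N = 1
    for n in range(1, limit):
        if abs(L - a(n)) >= eps:
            N = n + 1  # the eps-close tail must start after any violation
    return N

a = lambda n: n / (n + 1)
N = first_N(a, 1.0, 1e-3)
```

For this monotone sequence $|1 - a_n| = 1/(n+1)$, so for $\epsilon = 10^{-3}$ the computed threshold matches the hand computation $n+1 > 1000$.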

**Theorem (Arithmetic Limit Laws)**

If $\lim a_n$ and $\lim b_n$ exist, then

$$ \lim_{n \to \infty} (a_n + b_n) = \lim_{n \to \infty} a_n + \lim_{n \to \infty} b_n $$

$$ \lim_{n \to \infty} (a_n b_n) = (\lim_{n \to \infty} a_n) (\lim_{n \to \infty} b_n) $$

$$ \lim_{n \to \infty} (c a_n) = c \lim_{n \to \infty} a_n $$

$$ \lim_{n \to \infty} \frac{a_n}{b_n} = \frac{ \lim_{n \to \infty} a_n } { \lim_{n \to \infty} b_n } \quad \text{(provided } \lim_{n \to \infty} b_n \neq 0 \text{)} $$

**Theorem (Order Limit Laws)**

$$(\lim a_n = a) \land (\lim b_n=b) \land (a_n \leq b_n) \implies a \leq b$$

**Definition (sup/inf of sequences)** Let $(a_n)_{n=m}^\infty$ be a sequence of real numbers. Then we define $\sup(a_n)_{n=m}^{\infty} := \sup( \{ a_n : n \geq m \} )$, and similarly for $\inf$

**Proposition (property of sequence sup)**

- (sup is an upper bound) $ (\forall n \geq m) (a_n \leq \sup(a_n)_{n=m}^{\infty}) $
- (sup is the least upper bound) $(\forall y < \sup(a_n)_{n=m}^{\infty}) (\exists n \geq m) (y < a_n \leq \sup(a_n)_{n=m}^{\infty}) $

#### Limit Points

Summary:

- Limit points are a more general concept than limits: limits, limsup, and liminf are all instances of limit points.
- A limit requires **all** points to eventually be close to the limit value, whereas a limit point only asks for **infinitely many** points to be close.
- Unlike limits, limsup and liminf exist for every sequence. I think they are useful for estimating the boundaries of a sequence when it is oscillating.

**Definition (limit points, adherent points over sequence)** $x$ is a limit point (adherent point) of $(a_n)_{n=m}^{\infty}$ when

$$ (\forall \epsilon > 0) ( \forall N \geq m ) (\exists n \geq N) ( |x - a_n| \leq \epsilon ) $$

Intuitively, the sequence visits the $\epsilon$-neighborhood of a limit point infinitely often (in contrast with the eventual stay required by limits)

**Definition (limsup/liminf)** The limit superior is the infimum of the tail suprema; the limit inferior is the supremum of the tail infima. (Both are defined over the extended reals to handle the divergence case)

$$a_N^{+} := \sup(a_n)_{n=N}^{\infty}, \quad a_N^{-} := \inf(a_n)_{n=N}^{\infty}$$

$$\limsup a_n := \inf(a_N^{+})_{N=m}^{\infty}, \quad \liminf a_n := \sup(a_N^{-})_{N=m}^{\infty}$$

Note limsup and liminf always exist for any sequence (though they might be infinite)
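A minimal numerical sketch of these tail suprema and infima, using the oscillating example $a_n = (-1)^n (1 + 1/n)$ (my choice, not from the text); finite tails only approximate the true $a_N^{+}$ and $a_N^{-}$:

```python
# Approximate limsup/liminf of a_n = (-1)^n * (1 + 1/n) by taking
# sup/inf over long finite tails (a truncation of the true tail sup/inf).

def a(n):
    return (-1) ** n * (1 + 1 / n)

def tail_sup(N, length=10000):
    return max(a(n) for n in range(N, N + length))

def tail_inf(N, length=10000):
    return min(a(n) for n in range(N, N + length))

# a_N^+ decreases toward limsup = 1, a_N^- increases toward liminf = -1
approx_limsup = tail_sup(1000)
approx_liminf = tail_inf(1000)
```

The sequence has no limit, yet both limsup and liminf exist, bounding its oscillation.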

**Proposition (properties of limsup/liminf)** Let $L^+$ be limsup and $L^-$ be liminf of sequence $(a_n)_{n=m}^{\infty}$. Then

- $\forall x > L^+$, the sequence $(a_n)_{n=m}^{\infty}$ is eventually less than $x$
- $\forall x < L^+$, the sequence $(a_n)_{n=m}^{\infty}$ exceeds $x$ infinitely often
- $\inf(a_n)_{n=m}^{\infty} \leq L^- \leq L^+ \leq \sup(a_n)_{n=m}^{\infty}$
- If $c$ is any limit point, then $L^- \leq c \leq L^+$
- If $L^+$ is finite, then it is a limit point
- $\lim_{n \to \infty} a_n = c \Longleftrightarrow L^+ = L^- = c$

**Lemma (sup/limsup preserves order)** If $a_n \leq b_n$ for all $n \geq m$, then

$$ \sup (a_n)_{n=m}^{\infty} \leq \sup (b_n)_{n=m}^{\infty}$$

$$ \limsup (a_n)_{n=m}^{\infty} \leq \limsup (b_n)_{n=m}^{\infty}$$

Proof (of the first claim, by contradiction): suppose $\sup (a_n)_{n=m}^{\infty} > \sup (b_n)_{n=m}^{\infty}$. Take an $\epsilon$ smaller than the difference of the two suprema; then there exists some $a_n$ within $\epsilon$ of $\sup (a_n)_{n=m}^{\infty}$. This $a_n$ is larger than $\sup (b_n)_{n=m}^{\infty}$, hence larger than every $b_n$, contradicting $a_n \leq b_n$.

#### Convergence Criteria

The following are some criteria useful for deciding whether a sequence converges.

**Criterion (Cauchy)** A sequence converges iff it is a Cauchy sequence

**Criterion (Monotone bounded convergence)** Let $(a_n)_{n=m}^{\infty}$ be a sequence of real numbers which has some finite upper bound $M \in R$ and which is also increasing. Then $(a_n)_{n=m}^{\infty}$ is convergent and

$$ \lim_{n \to \infty} a_n = \sup(a_n)_{n=m}^{\infty} \leq M$$

**Criterion (squeeze test)**

$$ (a_n \leq b_n \leq c_n) \land (L = \lim a_n = \lim c_n) \Longrightarrow \lim b_n = L $$

**Criterion (Cesaro means)** If $x_n$ converges, then the sequence of means $y_n$ also converges to the same limit

$$y_n = \frac{1}{n}\sum_{i=1}^{n} x_i$$
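A small sketch of the Cesaro-mean claim, using the convergent example $x_n = 1 + (-1)^n/n \to 1$ (an illustrative choice, not from the text):

```python
# If x_n converges, the running averages y_n converge to the same limit.
# Here x_n = 1 + (-1)^n / n -> 1, so the means should also approach 1.

def x(n):
    return 1 + (-1) ** n / n

def cesaro_mean(N):
    """y_N = (x_1 + ... + x_N) / N"""
    return sum(x(i) for i in range(1, N + 1)) / N

y = cesaro_mean(100000)
```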

### Subsequences

**Definition (subsequence)** $(b_n)$ is a subsequence of $(a_n)$ iff there exists a function $f: \mathbb{N} \to \mathbb{N}$ that is strictly increasing such that $b_n = a_{f(n)}$

**Proposition (limits of subsequences)** The following statements are equivalent

- sequence $(a_n)$ converges to $L$
- every subsequence of $(a_n)$ converges to $L$

**Proposition (limit points of subsequences)** The following statements are equivalent

- $L$ is a limit point of sequence $(a_n)$
- There exists a subsequence of $(a_n)$ which converges to $L$

**Theorem (Bolzano-Weierstrass)** A bounded sequence has at least one convergent subsequence

Proof sketch: the sequence is bounded, so its limsup is finite; a finite limsup is a limit point, so a subsequence converging to it exists

### Series

**Definition (convergence of infinite series)** The convergence of the series $\sum_{k=0}^{\infty} a_k$ is defined in terms of the convergence of the partial sums $s_n = \sum_{k=0}^n a_k$

**Theorem (Algebraic limit theorem for series)** If $\sum_{k=0}^{\infty} a_k = A$ and $\sum_{k=0}^{\infty} b_k = B$, then

$$\sum_{k=0}^{\infty} c a_k = cA$$

$$\sum_{k=0}^{\infty} (a_k + b_k) = A + B$$

**Definition (absolute convergence)** If $\sum_{k=0}^{\infty} |a_k|$ converges, then $\sum_{k=0}^{\infty} a_k$ also converges and is said to *converge absolutely*. If $\sum_{k=0}^{\infty} |a_k|$ does not converge but $\sum_{k=0}^{\infty} a_k$ converges, then $\sum_{k=0}^{\infty} a_k$ is said to *converge conditionally*

**Theorem (rearrangement on absolute convergence)** If a series converges absolutely, then any rearrangement of this series converges to the same limit

**Theorem (rearrangement over double sums)** If $\sum_{i=1}^{\infty} \sum_{j=1}^{\infty} | a_{ij} |$ converges, then rearrangement is allowed and gives the same value

$$\sum_{i=1}^{\infty} \sum_{j=1}^{\infty} a_{ij} = \sum_{j=1}^{\infty} \sum_{i=1}^{\infty} a_{ij} = \lim_{n \to \infty} \sum_{i=1}^n \sum_{j=1}^n a_{ij} $$

#### Convergence Criteria

Generally speaking, it is much easier to determine whether a series converges than to compute the actual sum. For example, the series

$$\sum_{n=1}^{\infty} \frac{1}{n^2}$$

can easily be confirmed to converge to something less than 2, but it is much harder to compute its sum $\frac{\pi^2}{6}$, which, by the way, was discovered by Euler. The following are some useful criteria for determining whether a series converges
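A quick numerical check of this example (a sketch of my own): the partial sums stay below 2 and approach $\pi^2/6$ with error on the order of $1/N$.

```python
# Partial sums of sum 1/n^2: bounded above by 2, converging to pi^2/6.
import math

def partial_sum(N):
    return sum(1 / n ** 2 for n in range(1, N + 1))

s = partial_sum(100000)
# the tail sum_{n > N} 1/n^2 is roughly 1/N, so s is within ~1e-5 of pi^2/6
```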

**Criterion (Cauchy)** The series $\sum_{k=0}^{\infty} a_k$ converges iff

$$ (\forall \epsilon > 0) (\exists N) (\forall n > m \geq N)\; |a_{m+1} + \cdots + a_{n}| < \epsilon$$

**Criterion (comparison test)** Assume $a_k$ and $b_k$ are sequences satisfying $(\forall k \in N)\; 0 \leq a_k \leq b_k$

if $\sum_{k=1}^{\infty} b_k$ converges, then $\sum_{k=1}^{\infty} a_k$ converges

if $\sum_{k=1}^{\infty} a_k$ diverges, then $\sum_{k=1}^{\infty} b_k$ diverges

**Criterion (absolute convergence test)** If a series converges absolutely, then it converges as well

**Criterion (Leibniz, alternating series test)** Let $(a_n)$ be a sequence with $a_1 > a_2 > a_3 > \cdots$ and $a_n \to 0$. Then the alternating series $\sum_{n=1}^{\infty} (-1)^{n+1} a_n$ converges.

**Criterion (ratio test)** Given a series $\sum_{n=1}^{\infty} a_n$, it converges absolutely if

$$\lim | \frac{a_{n+1}}{a_n}| = r < 1 $$
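A sketch of the ratio test on the series $\sum_{n \geq 1} 1/n!$ (my example): the ratios $|a_{n+1}/a_n| = 1/(n+1)$ stay below 1 and tend to 0, and the partial sums approach $e - 1$.

```python
# Ratio test on a_n = 1/n!: ratios shrink toward 0, so the series
# converges absolutely; its sum from n = 1 is e - 1.
import math

def a(n):
    return 1 / math.factorial(n)

ratios = [a(n + 1) / a(n) for n in range(1, 50)]  # equals 1/(n+1)
total = sum(a(n) for n in range(1, 50))           # partial sum up to n = 49
```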

**Criterion (Dirichlet's test)** If the partial sums of $\sum_{k=1}^{\infty} x_k$ are bounded, and $y_k$ decreases monotonically to 0, then $\sum_{k=1}^{\infty} x_k y_k$ converges

## Continuity

continuously differentiable $\subset$ Lipschitz continuous $\subset$ uniformly continuous $\subset$ continuous (on a compact interval)

### Limits of Functions

Generally speaking, we want the value of $\lim_{x \to c} f(x)$ to be independent of how we approach $c$

**Definition (functional limit)** Suppose $x_0$ is a limit point of $E$. Then $\lim_{x \to x_0; x \in E} f(x) = L$ iff $(\forall \epsilon > 0)(\exists \delta > 0)(\forall x \in E: 0 < |x - x_0 | < \delta)\; |f(x) - L| \leq \epsilon$

Note: It is important to remember that limit points of $E$ do not necessarily belong to $E$ unless $E$ is closed. In fact, $x_0$ need not be in the domain of $f$.

**Criterion (sequences)** The following are equivalent

- $ \lim_{x \to x_0; x \in E} f(x) = L$
- $(a_n) \to x_0 \implies f(a_n) \to L$ where $a_n \in E$

**Criterion (divergence)** The functional limit $\lim_{x \to c} f(x)$ does not exist if there exist sequences $(x_n), (y_n)$ such that

$$\lim x_n = \lim y_n = c \quad \land \quad \lim f(x_n) \neq \lim f(y_n)$$

**Theorem (Algebraic Limit Theorem)** limit laws for functions

$$\lim_{x \to x_0} (f \pm g)(x) = \lim_{x \to x_0} f(x) \pm \lim_{x \to x_0} g(x)$$

$$\lim_{x \to x_0} \min(f, g)(x) = \min (\lim_{x \to x_0} f(x), \lim_{x \to x_0} g(x))$$

$$\lim_{x \to x_0} \max(f, g)(x) = \max(\lim_{x \to x_0} f(x), \lim_{x \to x_0} g(x))$$

$$\lim_{x \to x_0} (f g)(x) = \lim_{x \to x_0} f(x) \lim_{x \to x_0} g(x)$$

$$\lim_{x \to x_0} (f / g)(x) = \frac{\lim_{x \to x_0} f(x)}{\lim_{x \to x_0} g(x)} \quad \text{(provided } \lim_{x \to x_0} g(x) \neq 0\text{)}$$

**Proposition (limits are local)** The following are equivalent

- $\lim_{x \to x_0; x \in E} f(x) = L$
- $(\forall \delta > 0) \lim_{x \to x_0; x \in E \cap (x_0 - \delta, x_0+\delta)} f(x) = L$

### Continuity

**Definition (Continuity)** $f: X \to R$ is continuous at $x_0 \in X$ iff

$$ \lim_{x \to x_0; x \in X} f(x) = f(x_0)$$

Continuous functions preserve convergence (i.e. $x_n \to x_0 \implies f(x_n) \to f(x_0)$)

The most important point here is that the point $x_0$ has to be in the domain of $f$

**Definition (discontinuity)**

- If $\lim_{x \to c} f(x)$ exists but has a value different from $f(c)$, the discontinuity is called *removable*
- If $\lim_{x \to c+} f(x) \neq \lim_{x \to c-} f(x)$, then $f$ has a *jump discontinuity* at $c$
- If $\lim_{x \to c} f(x)$ does not exist for other reasons, the discontinuity is called *essential*

**Criterion (Arithmetic operations and composition)** If $f, g$ are continuous at $x_0$, then $f+g, f-g, f/g, fg, \min(f,g), \max(f,g), g \circ f$ are also continuous at $x_0$ (for $f/g$, provided $g(x_0) \neq 0$)

**Definition (left, right limits)**

$$f(x_0 +) := \lim_{x \to x_0; x \in X \cap (x_0, \infty)} f(x) \quad \text{for } x_0 \in \overline{X \cap (x_0, \infty)}$$

$$f(x_0 -) := \lim_{x \to x_0; x \in X \cap (-\infty, x_0)} f(x) \quad \text{for } x_0 \in \overline{X \cap (-\infty, x_0)}$$

**Proposition (left right limits and continuity)** If $f(x_0+)$ and $f(x_0-)$ both exist and are equal to $f(x_0)$, then $f$ is continuous at $x_0$

#### Continuity and Compactness

**Theorem (preservation of compact sets)** Let $f: A \to R$ be continuous on $A$. If $K \subseteq A$ is compact, then $f(K)$ is compact as well

**Proposition (extreme value theorem)** If $f: K \to R$ is continuous on a compact set $K \subseteq R$, then $f$ attains a maximum and minimum value.

#### Continuity and Connectedness

**Theorem (preservation of connected sets)** Let $f: G \to R$ be continuous. If $E \subseteq G$ is connected, then $f(E)$ is connected as well

**Theorem (Intermediate value theorem)** Let $f: [a,b] \to R$ be a continuous function on $[a,b]$. Let $y$ be a real number between $f(a), f(b)$. Then there exists $c \in [a,b]$ such that $f(c)=y$

Conversely, a function having the intermediate value property does not necessarily need to be continuous.

### Uniform Continuity

continuity is a local property, but uniform continuity is a global property

**Definition (uniform continuity)** Let $X \subset R$ and let $f: X \to R$. We say that $f$ is uniformly continuous if

$$(\forall \epsilon > 0) (\exists \delta > 0) (\forall x, x_0 \in X: |x-x_0| < \delta)\; |f(x)-f(x_0)|<\epsilon$$

**Definition (equivalent sequences)** Let $m$ be an integer, let $(a_n)_{n=m}^{\infty}, (b_n)_{n=m}^{\infty}$ be two sequences of real numbers, and let $\epsilon > 0$ be given.

- ($\epsilon$-close) We say that $(a_n)_{n=m}^{\infty}$ is $\epsilon$-close to $(b_n)_{n=m}^{\infty}$ iff $a_n$ is $\epsilon$-close to $b_n$ for each $n \geq m$.
- (eventually $\epsilon$-close) We say that $(a_n)_{n=m}^{\infty}$ is eventually $\epsilon$-close to $(b_n)_{n=m}^{\infty}$ iff there exists an $N \geq m$ such that the sequences $(a_n)_{n=N}^{\infty}$ and $(b_n)_{n=N}^{\infty}$ are $\epsilon$-close.
- (equivalent sequences) Two sequences $(a_n)_{n=m}^{\infty}, (b_n)_{n=m}^{\infty}$ are equivalent iff for each $\epsilon > 0$, the sequences are eventually $\epsilon$-close.

**Lemma** Real sequences $(a_n)_{n=1}^{\infty}, (b_n)_{n=1}^{\infty}$ are equivalent iff $\lim_{n \to \infty} (a_n - b_n) = 0$

**Proposition (uniform continuity and sequences)** The following are logically equivalent

- $f: X \to R$ is uniformly continuous on $X \subset R$
- Whenever $(x_n)_{n=1}^{\infty}, (y_n)_{n=1}^{\infty}$ are two equivalent sequences consisting of elements of $X$, the sequences $(f(x_n))_{n=1}^{\infty}, (f(y_n))_{n=1}^{\infty}$ are also equivalent

**Proposition (uniform continuity preserves Cauchy sequences)** Let $f$ be a uniformly continuous function and let $(x_n)_{n=1}^{\infty}$ be a Cauchy sequence. Then $(f(x_n))_{n=1}^{\infty}$ is also Cauchy

**Theorem (compact continuous function is uniformly continuous)** If $f: A \to R$ is continuous on a compact set $A$, then $f$ is also uniformly continuous

### Lipschitz continuity

**Definition (Lipschitz continuity)** A function $f: A \to R$ is called Lipschitz if there exists a bound $M > 0$ such that for all $x, y \in A, x \neq y$

$$|\frac{f(x)-f(y)}{x-y} | \leq M$$

**Lemma** Lipschitz continuous implies uniform continuous

**Example (contraction mapping)** A contractive function is a function satisfying

$$|f(x) - f(y)| \leq c |x-y|$$

where $0 < c < 1$. It is a Lipschitz continuous function
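A minimal sketch of iterating a contraction (my example, not from the text): $f(x) = \cos x$ is contractive on $[0, 1]$ since $|f'(x)| = |\sin x| \leq \sin 1 < 1$ there, so iteration converges to the unique fixed point $x = \cos x$.

```python
# Iterating the contraction f(x) = cos(x) on [0, 1]: each step shrinks
# the distance to the fixed point by a factor of at most sin(1) < 1.
import math

x = 1.0
for _ in range(100):
    x = math.cos(x)
# x now approximates the unique solution of x = cos(x)
```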

## Differentiation

### Real-value derivatives

**Definition (differentiability at a point)** Let $X$ be a subset of $R$, $x_0 \in X$ which is a limit point of $X$, $f: X \to R$. If the limit

$$\lim_{x \to x_0; x \in X \setminus \{ x_0 \}} \frac{f(x)-f(x_0)}{x - x_0}$$

converges to some real number $L$, we say that $f$ is differentiable at $x_0$ on $X$ with derivative $L$ and write $f'(x_0) := L$. Otherwise $f'(x_0)$ is undefined and $f$ is not differentiable at $x_0$.

Differentiability is a local property with respect to a point

**Proposition (Newton's approximation)** Let $X$ be a subset of $R$, let $x_0 \in X$ be a limit point of $X$, let $f: X \to R$ be a function and $L$ be a real number. The following statements are logically equivalent:

$f$ is differentiable at $x_0$ on $X$ with derivative $L$

for every $\epsilon > 0$ there exists a $\delta > 0$ such that $f(x)$ is $\epsilon |x-x_0|$-close to $f(x_0) + L(x-x_0)$ whenever $x \in X$ is $\delta$-close to $x_0$:

$$(\forall x \in X: |x-x_0| \leq \delta)\; |f(x) - (f(x_0) + L(x-x_0))| \leq \epsilon |x-x_0|$$

**Definition (differentiability on domain)** Let $X$ be a subset of $R$ and let $f: X \to R$ be a function. $f$ is differentiable on $X$ iff $f$ is differentiable at every limit point $x_0 \in X$

### Directional and Partial derivatives

**Definition (directional derivative)** Let $E$ be a subset of $R^n$, let $f: E \to R^m$ be a function, let $x_0$ be an interior point of $E$, and let $v$ be a vector in $R^n$. If the limit

$$\lim_{t \to 0; t>0; x_0+tv \in E} \frac{f(x_0+tv) - f(x_0)}{t}$$

exists, we say $f$ is differentiable in the direction $v$ at $x_0$, and we denote the above limit by $D_v f(x_0)$

$$(D_v) f (x_0) = \lim_{t \to 0; t>0}\frac{f(x_0+tv)-f(x_0)}{t}$$

**Lemma** Suppose $f(x)=(f_1(x), f_2(x), \ldots, f_m(x))$. Then $f$ is differentiable in direction $v$ iff each component $f_i$ is differentiable in direction $v$.

**Definition (partial derivative) **Let $E$ be a subset of $R^n$, $f: E \to R^m$ and $x_0$ is an interior point of $E$, $1\leq j \leq n$. Then the partial derivative of $f$ with respect to $x_j$ variable at $x_0$, denoted $\frac{\partial f}{\partial x_j}(x_0)$ is defined by

$$\frac{\partial f}{\partial x_j}(x_0) = \lim_{t \to 0; t \neq 0; x_0 + te_j \in E} \frac{f(x_0+te_j)-f(x_0)}{t}$$

The definition of differentiability for one variable cannot be directly extended to a multivariable function $f: R^n \to R^m$, because the numerator and denominator are vectors of different dimensions.

$$\lim_{x \to x_0} \frac{f(x)-f(x_0)}{x-x_0}$$

Instead, we interpret differentiability as $f$ being approximately linear near $x_0$, i.e. $f(x) \approx f(x_0) + L(x-x_0)$; then the previous definition can be restated as

$$\lim \frac{|f(x) - (f(x_0)+L(x-x_0))|}{|x-x_0|} = 0$$

where the numerator and denominator are both scalars. This definition applies to multivariable functions as follows:

**Definition (total differentiability)** Let $E$ be a subset of $R^n$, $f: E \to R^m$ be a function. $x_0 \in E$ be a point, let $L: R^n \to R^m$ be a linear transformation. We say $f$ is differentiable at $x_0$ with derivative $L$ if we have

$$\lim_{x \to x_0; x \in E \setminus \{ x_0 \}} \frac{|| f(x) - (f(x_0)+L(x-x_0))|| }{|| x-x_0||} = 0$$

where $L$ is called the total derivative; its matrix representation is the Jacobian matrix.

**Proposition (partial and total differentiability)** If all the partial derivatives $\frac{\partial f}{\partial x_j}$ exist and are continuous at $x_0$, then $f$ is differentiable at $x_0$ and the linear transformation $f'(x_0)$ is given by

$$f'(x_0)(v_j)_{1\leq j \leq n} = \sum_j v_j \frac{\partial f}{\partial x_j}(x_0)$$
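A numerical sketch of assembling the total derivative from partial derivatives, for the illustrative example $f(x, y) = (xy,\; x + y)$ at $(2, 3)$ (my choice), whose exact Jacobian has columns $(3, 1)$ and $(2, 1)$:

```python
# Finite differences approximate each partial derivative; the columns of
# the Jacobian at (2, 3) are the partials with respect to x and y.

def f(x, y):
    return (x * y, x + y)

def partial(j, x, y, h=1e-6):
    """Central-difference approximation of the j-th partial (j = 0 or 1)."""
    if j == 0:
        fp, fm = f(x + h, y), f(x - h, y)
    else:
        fp, fm = f(x, y + h), f(x, y - h)
    return [(p - m) / (2 * h) for p, m in zip(fp, fm)]

col_x = partial(0, 2.0, 3.0)  # approximately [3, 1]
col_y = partial(1, 2.0, 3.0)  # approximately [2, 1]
```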

### Local maxima, minima, derivatives

**Definition (local maxima, minima)** $f$ attains a local maximum at $x_0$ iff there exists a $\delta > 0$ such that the restriction $f|_{X \cap (x_0-\delta, x_0+\delta)}$ of $f$ attains a maximum at $x_0$

**Theorem (interior extremum theorem, Fermat)** If $x_0 \in (a,b)$, $f$ is differentiable at $x_0$, and $f$ attains either a local maximum or local minimum at $x_0$, then $f'(x_0) = 0$

Derivative functions might not be continuous, but they satisfy the intermediate value property, as follows

**Theorem (Darboux)** If $f$ is differentiable on an interval $[a,b]$, and if $\alpha$ satisfies $f'(a) < \alpha < f'(b)$, then there exists a point $c \in (a,b)$ where $f'(c) = \alpha$

### Mean Value Theorem

**Theorem (Rolle)** Let $a<b$ be real numbers, and let $g: [a,b] \to R$ be a continuous function which is differentiable on $(a,b)$. Suppose also that $g(a)=g(b)$. Then there exists an $x \in (a,b)$ such that $g'(x) = 0$

**Theorem (mean value theorem)** If $f: [a,b] \to R$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then there exists a point $c \in (a,b)$ where

$$f'(c) = \frac{f(b)-f(a)}{b-a}$$

**Theorem (generalized mean value theorem)** If $f, g$ are continuous on the closed interval $[a,b]$ and differentiable on the open interval $(a,b)$, then there exists a point $c \in (a,b)$ where

$$[f(b) - f(a)]g'(c) = [g(b) - g(a)]f'(c)$$

## Riemann Integral

continuous $\subset$ piecewise continuous $\subset$ continuous almost everywhere (discontinuity has measure zero)

### Partitions

**Definition (length of intervals)** If $I$ is a bounded interval, we define the length of $I$, denoted $|I|$, as follows. If $I$ is one of the intervals $[a,b], (a,b), [a,b), (a,b]$ for some real numbers $a<b$, then we define $|I| := b-a$. Otherwise, if $I$ is a point or the empty set, we define $|I|=0$

**Definition (Partitions)** Let $I$ be a bounded interval. A partition of $I$ is a finite set $P$ of bounded intervals contained in $I$, such that every $x \in I$ lies in exactly one of the bounded intervals $J \in P$

**Theorem (length is finitely additive)** Let $I$ be a bounded interval, let $n$ be a natural number, and let $P$ be a partition of $I$ of cardinality $n$. Then

$$|I| = \sum_{J \in P} |J|$$

**Definition (finer and coarser partitions)** Let $I$ be a bounded interval, $P, P'$ be two partitions of $I$. We say that $P'$ is finer than $P$ if for every $J \in P'$, there exists a $K \in P$ such that $J \subseteq K$

**Definition (common refinement)** Let $I$ be a bounded interval, $P, P'$ be two partitions of $I$. We define the common refinement $P \# P'$ to be the set

$$P \# P' := \{ K \cap J : K \in P \land J \in P' \}$$

**Definition (lower sum, upper sum)** The lower sum, upper sum of a bounded function $f$ with respect to partition $P$ is given by

$$L(f,P) := \sum_{k=1}^{n} m_k (x_k - x_{k-1})$$

$$U(f,P) := \sum_{k=1}^{n} M_k (x_k - x_{k-1})$$

where $m_k := \inf\{ f(x): x\in [x_{k-1}, x_k] \}$, $M_k := \sup\{ f(x): x\in [x_{k-1}, x_k] \}$
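A sketch of lower and upper sums on a uniform partition, for the example $f(x) = x^2$ on $[0, 1]$ (my choice); since $f$ is increasing there, $m_k = f(x_{k-1})$ and $M_k = f(x_k)$.

```python
# Lower and upper sums of f(x) = x^2 over a uniform n-piece partition of
# [0, 1]; for an increasing f the inf/sup on each piece are the endpoints.

def lower_upper(f, a, b, n):
    xs = [a + (b - a) * k / n for k in range(n + 1)]
    L = sum(f(xs[k - 1]) * (xs[k] - xs[k - 1]) for k in range(1, n + 1))
    U = sum(f(xs[k]) * (xs[k] - xs[k - 1]) for k in range(1, n + 1))
    return L, U

L, U = lower_upper(lambda x: x ** 2, 0.0, 1.0, 1000)
# L <= 1/3 <= U, and U - L = 1/n -> 0, witnessing integrability
```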

### Integrability

**Definition (upper integral, lower integral)** Let $\mathcal{P}$ be the collection of all possible partitions of the interval $[a,b]$. The upper integral and lower integral of $f$ are defined to be

$$U(f) := \inf \{ U(f,P) : P \in \mathcal{P} \}$$

$$L(f) := \sup \{ L(f,P) : P \in \mathcal{P} \}$$

**Definition (Riemann Integrability)** A bounded function $f$ defined on the interval $[a,b]$ is Riemann-integrable if $U(f) = L(f)$. In this case

$$\int_{a}^b f = U(f) = L(f)$$

**Criterion (Integrability Criterion)** A bounded function $f$ is integrable on $[a,b]$ iff for every $\epsilon > 0$ there exists a partition $P_{\epsilon}$ of $[a,b]$ such that

$$U(f, P_{\epsilon}) - L(f, P_{\epsilon}) < \epsilon$$

**Theorem (integrability over closed interval)** If $f$ is continuous on $[a,b]$, then it is integrable

**Theorem** If $f: [a,b] \to R$ is bounded and $f$ is integrable on $[c,b]$ for all $c \in (a,b)$, then $f$ is integrable on $[a,b]$

Any bounded function with a finite number of discontinuities is still integrable

**Theorem (Lebesgue Criterion)** Let $f: [a,b] \to R$ be a bounded function. $f$ is Riemann-integrable iff the set of its discontinuity points has measure zero

### Properties of the Integral

**Theorem** Assume $f: [a,b] \to R $ is bounded and $c \in (a,b)$. Then $f$ is integrable on $[a,b]$ iff $f$ is integrable on $[a,c]$ and $[c,b]$, in which case

$$\int_a^b f = \int_a^c f + \int_c^b f$$

**Theorem (Integral Limit Theorem)** Assume that $f_n \to f$ uniformly on $[a,b]$ and each $f_n$ is integrable. Then $f$ is integrable and

$$\lim_{n \to \infty} \int_{a}^{b} f_n = \int_{a}^{b} f $$

**Theorem (fundamental theorem of calculus)** If $f: [a,b] \to R$ is integrable and $F: [a,b] \to R$ is an antiderivative of $f$, then

$$\int_{a}^{b} f = F(b) - F(a)$$

Let $g: [a,b] \to R$ be integrable and for $x \in [a,b]$ define

$$G(x) = \int_{a}^{x} g$$

then $G$ is continuous on $[a,b]$. If $g$ is continuous at some point $c \in [a,b]$, then $G$ is differentiable at $c$ and $G'(c) = g(c)$
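A numerical sketch of the theorem for $g = \cos$ on $[0, 2]$ (an example of my own), comparing a midpoint Riemann sum against the antiderivative $F = \sin$:

```python
# The fundamental theorem says integral_0^2 cos = sin(2) - sin(0);
# a midpoint Riemann sum should approximate the same value.
import math

def midpoint_integral(g, a, b, n=10000):
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

approx = midpoint_integral(math.cos, 0.0, 2.0)
exact = math.sin(2.0) - math.sin(0.0)
```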

## Function Series

Infinite series representations of functions are both useful and elegant. Manipulations such as differentiation and taking antiderivatives can lead to remarkable conclusions when handled properly; however, they are not always justified.

### Uniform Convergence

pointwise convergence is a local property, while uniform convergence is a global property.

**Definition (pointwise convergence)** The sequence $(f_n)$ of functions converges pointwise on $A$ to a function $f$ if for all $x \in A$, the sequence of real numbers $f_n(x)$ converges to $f(x)$

The problem with pointwise convergence is that it might not preserve good properties such as continuity and differentiability. To preserve those good properties, we need a more restrictive notion of convergence.

**Definition (uniform convergence)** The sequence $(f_n)$ converges uniformly on $A$ to a limit function $f$ defined on $A$ if for every $\epsilon > 0$, there exists an $N$ such that $\forall (n \geq N) \forall (x \in A) |f_n(x) – f(x)| < \epsilon$
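A numerical sketch of the pointwise-vs-uniform distinction, using the classic example $f_n(x) = x^n$ (not from the text): on $[0, 1)$ the pointwise limit is $0$ but the sup of the error stays near $1$, while on $[0, 1/2]$ the sup decays like $2^{-n}$. The grids are finite samples, so the suprema are approximate.

```python
# sup |f_n - 0| over a grid, for f_n(x) = x^n: near 1 on [0, 1)
# (convergence is not uniform there), but tiny on [0, 1/2].

def sup_error(n, xs):
    return max(x ** n for x in xs)

grid_full = [k / 1000 for k in range(1000)]  # samples of [0, 1)
grid_half = [k / 1000 for k in range(501)]   # samples of [0, 1/2]

e_full = sup_error(50, grid_full)  # about 0.999^50, close to 1
e_half = sup_error(50, grid_half)  # (1/2)^50, essentially 0
```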

Uniform convergence has a couple of good properties, as follows.

**Theorem (continuity)** Let $f_n$ be a sequence of functions defined on $A \subset R$ that converges uniformly on $A$ to a function $f$. If each $f_n$ is continuous at $c \in A$, then $f$ is continuous at $c$

**Theorem (differentiability)** Let $f_n$ be a sequence of differentiable functions defined on $[a,b]$ such that $f'_{n}$ converges uniformly to a function $g$ on $[a,b]$. If there exists a point $x_0 \in [a,b]$ for which $f_n(x_0)$ is convergent, then $f_n$ converges uniformly on $[a,b]$. Moreover, the limit function $f=\lim f_n$ is differentiable and satisfies $f' = g$

Note: uniform convergence of $f_n$ does not even guarantee convergence of $f'_n$ (e.g. $f_n(x) = \frac{\sin(nx)}{\sqrt{n}}$)

**Theorem (integrability)** Assume that $f_n \to f$ uniformly on $[a, b]$, and each $f_n$ is integrable. Then $f$ is integrable and

$$\lim_{n \to \infty} \int_a^b f_n = \int_a^b f$$

### Series of Function

**Criterion (Cauchy)** A series $\sum_{n} f_n$ converges uniformly on $A \subseteq R$ iff

$$ (\forall \epsilon > 0) (\exists N) (\forall n > m \geq N)\; | f_{m+1} + \cdots + f_{n} | < \epsilon$$

**Criterion (Weierstrass M-test)** For each $n \in N$, let $f_n$ be a function defined on a set $A \subset R$ and let $M_n > 0$ be a real number satisfying, for all $x \in A$,

$$|f_n(x)| \leq M_n$$

If $\sum_{n=1}^{\infty} M_n$ converges, then $\sum_{n=1}^{\infty} f_n$ converges uniformly on $A$

#### Power Series

**Definition (power series)** A power series centered at $a \in R$ is any series of the form

$$\sum_{n=0}^{\infty} c_n (x-a)^n$$

where $c_n$ is a sequence of real numbers not depending on $x$

**Definition (radius of convergence)** Let $\sum_{n=0}^{\infty} c_n (x-a)^n$ be a formal power series. We define the radius of convergence $R$ of this series to be

$$R := \frac{1}{\limsup_{n \to \infty} |c_n|^{1/n}} $$
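A sketch of this root formula, approximating the limsup by a maximum over a long tail (my example): for $c_n = 2^n$ we have $|c_n|^{1/n} = 2$ for every $n$, so $R = 1/2$.

```python
# Approximate limsup |c_n|^(1/n) by the max over a long tail, then
# invert to get the radius of convergence.

def approx_limsup_root(c, N=200, tail=100):
    return max(abs(c(n)) ** (1 / n) for n in range(N, N + tail))

R_geometric = 1 / approx_limsup_root(lambda n: 2 ** n)  # exact R is 1/2
```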

**Theorem** If a power series $\sum_{n=0}^{\infty} a_n x^n$ has the radius of convergence $R$, then it converges absolutely for any $x$ such that $|x| < R$. It diverges for any $x$ when $|x| > R$

The main implication here is that the set of convergence has to be $\{0\}$, all of $R$, or a bounded interval.

**Theorem (Abel)** Let $g(x) = \sum_{n=0}^{\infty} a_n x^n$ be a power series that converges at $x=R > 0$. Then it converges uniformly on the interval $[0, R]$.

**Theorem** If a power series converges pointwise on the set $A \subseteq R$, then it converges uniformly on any compact set $K \subseteq A$

**Theorem** Assume $f(x)=\sum_{n=0}^{\infty} a_n x^n$ converges on an interval $A \subseteq R$. The function $f$ is continuous on $A$ and differentiable on any open interval $(-R,R) \subseteq A$. The derivative is given by $f'(x) = \sum_{n=1}^{\infty} na_n x^{n-1}$.

#### Taylor Series

**Theorem (Lagrange's Remainder Theorem)** Let $f$ be differentiable $N+1$ times on $(-R, R)$, define $a_n = f^{(n)}(0)/n!$, and let $S_N(x) = \sum_{n=0}^{N} a_n x^n$. Given $x \neq 0$ in $(-R,R)$, there exists a point $c$ with $|c| < |x|$ such that the error $E_N(x) = f(x) - S_N(x)$ satisfies

$$E_N(x) = \frac{f^{(N+1)}(c)}{(N+1)!}x^{N+1}$$

**Theorem (Weierstrass approximation)** Let $f: [a,b] \to R$ be continuous. Given $\epsilon > 0$, there exists a polynomial $p(x)$ satisfying, for all $x \in [a,b]$,

$$| f(x) – p(x) | < \epsilon $$

Note the requirement here is only continuity, not infinite differentiability.

Note that not every infinitely differentiable function can be represented by its Taylor series; there are cases where the Taylor series does not converge to the target function (e.g. $f(x) = e^{-1/x^2}$ for $x \neq 0$, with $f(0) = 0$).

## Reference

[1] Tao, Terence. *Analysis*. Vol. 1. Hindustan Book Agency, 2006.

[2] Tao, Terence. *Analysis*. Vol. 2. Hindustan Book Agency, 2006.

[3] Abbott, Stephen. *Understanding Analysis*. New York: Springer, 2001.

[4] Lax, Peter D., and Maria Shea Terrell. *Multivariable Calculus with Applications*. Springer, 2017.

[5] Sugiura, Mitsuo (杉浦光夫). *解析入門 I* (Introduction to Analysis I). University of Tokyo Press.

[6] Sugiura, Mitsuo (杉浦光夫). *解析入門 II* (Introduction to Analysis II). University of Tokyo Press.