Notes while learning about functional analysis.
Metric Spaces: A metric space is a set of points, say X, together with a distance function, d : X × X → [0, ∞), where the distance function obeys three rules:
- d(x, y) >= 0, with d(x, y) = 0 if and only if x = y
- d(x, y) = d(y, x)
- d(x, y) <= d(z, x) + d(z, y) (triangle inequality)
- Examples: ℝn with the Euclidean distance, and the complex plane with the modulus distance d(x, y) = |x − y|. (A quick numeric check of these rules follows below.)
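As a sanity check, here is a minimal numeric sketch (the helper euclidean and the sample points are made up for illustration) that tests the three rules for the Euclidean distance on ℝ2:

```python
import itertools
import math

def euclidean(x, y):
    # d(x, y) = sqrt(sum_i (x_i - y_i)^2), the Euclidean distance on R^n
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

points = [(0.0, 0.0), (3.0, 4.0), (-1.0, 2.0)]
for x, y, z in itertools.product(points, repeat=3):
    assert euclidean(x, y) >= 0                      # distances are non-negative
    assert (euclidean(x, y) == 0) == (x == y)        # zero distance iff same point
    assert euclidean(x, y) == euclidean(y, x)        # symmetry
    # triangle inequality, with a tiny tolerance for float rounding
    assert euclidean(x, y) <= euclidean(z, x) + euclidean(z, y) + 1e-12
print("all metric axioms hold on the sample points")
```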
ε-ball: The ε-ball around a point x ∈ X, also denoted by Bε(x), is defined as {y ∈ X | d(x, y) < ε}.
Open sets: If A is a set, and A ⊂ X, it is open if for every x ∈ A there exists some ε > 0 such that Bε(x) ⊂ A, i.e. every point of A has a small ball around it that stays inside A.
Boundary Points: If every Bε(x) has points both inside and outside of A, no matter how small the ε value, then x is a boundary point of the set A. Here x ∈ X. In that case Bε(x) is said to have points from both A and its complement Ac. The set of all boundary points of A is denoted by ∂A.
Closed Sets: A set, A, is defined to be closed if its complement is open.
Closure: The smallest closed set containing A is simply A ∪ ∂A. Remember:
- A is open if A ∩ ∂A = ∅.
- A is closed if ∂A ⊂ A (equivalently, A ∩ ∂A = ∂A).
In the example where X = (1, 3] ∪ (4, ∞) (viewed as a metric space in its own right), the set A = (1, 3] is both open AND closed. It is open because, inside X, every small ball around a point of A, even around 3, stays within A; and since Ac = (4, ∞) is also open, by definition A is closed.
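A hypothetical numeric illustration of this example (the helpers in_X and in_A are made up for this sketch): sampling a small ball around the point 3 inside X shows it never leaves A, which is why A is open in X.

```python
def in_X(t):
    # membership in X = (1, 3] ∪ (4, ∞)
    return 1 < t <= 3 or t > 4

def in_A(t):
    # membership in A = (1, 3]
    return 1 < t <= 3

x, eps = 3.0, 0.5
# sample B_eps(3) taken inside X: points of X within distance eps of 3
samples = [x - eps + 2 * eps * i / 10_000 for i in range(10_001)]
ball = [t for t in samples if abs(t - x) < eps and in_X(t)]
print(all(in_A(t) for t in ball))  # True: the ball around 3 stays inside A
```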
Convergence: A sequence is a list of points indexed by the natural numbers, x1, x2, x3, …, xn, …, also denoted by {xn}. A sequence is said to be convergent when, in the limit, it approaches a single value x̃. Given a metric space (X, d), convergence to a point x̃ ∈ X is defined via the ε-ball Bε(x̃), such that,
- ∀ε > 0, ∃N ∈ ℕ, ∀n >= N : d(xn, x̃) < ε
which means that for every ε > 0, the ε-ball around x̃ contains all the sequence points from some index N onward: as ε shrinks, N may have to grow, but only finitely many points are ever left outside the ball (a small sketch follows below).
- x̃ = limn→∞ xn
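A small sketch of the ε-N definition (the choice xn = 1/n and the index formula are just an illustrative example): for each ε we exhibit an N past which every point sits inside Bε(0).

```python
import math

def x(n):
    # the sequence x_n = 1/n, which converges to 0
    return 1.0 / n

for eps in (0.5, 0.1, 0.001):
    N = math.ceil(1 / eps) + 1  # any N > 1/eps works for this particular sequence
    assert all(abs(x(n) - 0.0) < eps for n in range(N, N + 10_000))
    print(f"eps={eps}: every point from index N={N} onward lies in the eps-ball")
```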
Proposition: A characteristic of a closed set: A is closed if and only if the limit of every convergent sequence of points of A lies inside A itself.
Proof (by contraposition: assuming A is not closed, we find a convergent sequence in A whose limit lies outside A)
- Let's assume that A is not closed.
- This implies that the complement of A, Ac, is not open.
- So there exists a point ã ∈ Ac such that every ball around it, no matter how small the ε, overlaps with A: Bε(ã) ∩ A ≠ ∅ for all ε > 0.
- Taking ε = 1/n and picking a point an ∈ B1/n(ã) ∩ A for each n gives a sequence {an} which lies in A, but its limit does not (since ã ∈ Ac).
- Hence limn→∞ an = ã ∉ A.
Cauchy Sequence: A sequence which is defined by the condition:
- ∀ε > 0, ∃N ∈ ℕ, ∀ n, m >= N : d(xn, xm) < ε
- It is a generalization of the convergence condition: instead of requiring the distance between the limit and the points to become small, we only require the distance between the points themselves to become small.
- Which means that after a certain index, N, the distance between any two points of the sequence is less than ε, for any ε greater than zero.
Complete Metric Space: A metric space is said to be complete when all of its Cauchy sequences converge. This simply means that for a complete metric space, Cauchy sequences = convergent sequences.
- Example: (0, 3), where d := |x − y|, is incomplete: for the sequence xn = 1/n, the distance d(xn, xm) → 0 as n, m → ∞, so it is Cauchy, but its limit zero is excluded from this set, hence it's incomplete (sketched numerically below, after this list).
- [0, 3] is complete: it is closed in ℝ, so limits such as zero are included in the set.
- (0, 3) under the discrete metric, d := {1 if x ≠ y and 0 if x = y}, is complete. Take a Cauchy sequence {xn}n∈ℕ, where by definition d(xn, xm) < ε. For any ε < 1, the only way this is possible is if xn = xm (which makes d = 0), hence after the index N all the points must be the same, and that constant value is the limit. So every Cauchy sequence is eventually constant and therefore convergent, which makes this metric space complete.
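To make the first example concrete, a rough sketch (the sequence 1/n and the cut-off index are illustrative choices): the pairwise distances in the tail shrink below ε, yet the limit 0 is not a member of (0, 3).

```python
xs = [1.0 / n for n in range(1, 10_001)]  # a Cauchy sequence inside (0, 3)
eps = 1e-3
N = 2001                                  # past this index, 1/n < eps/2
tail = xs[N:]
# Cauchy: distances between tail points stay below eps
print(max(abs(a - b) for a in tail[:50] for b in tail[:50]) < eps)  # True
# ...but the limit 0 is excluded from the set, so no convergence inside (0, 3)
print(all(0 < t < 3 for t in xs), 0.0 in xs)  # True False
```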
Norm: Let X be an 𝔽-vector space (where 𝔽 ∈ {ℝ, ℂ}). The norm, denoted by ||.||, is a map that transforms a vector to [0, ∞). The norm fulfils three properties that mirror those of the distance function:
- ||x|| = 0 if and only if x = 0 (positive definite)
- ||λx|| = |λ| ||x|| (absolutely homogeneous)
- ||x+y|| <= ||x|| + ||y|| where x, y ∈ X (triangle inequality) (since you cannot pull the addition out of the norm and always get equality, only an inequality, the norm is not a linear map)
Hence the space defined by (X, ||.||) is called a normed space. A normed space is a special case of a metric space: the norm of the difference of two vectors measures the distance between their endpoints, giving the induced distance function d||.||(x, y) := ||x − y||.
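A minimal sketch of both facts at once, assuming the Euclidean norm on ℝ2 (the helpers norm and d are made up for illustration): the three norm axioms hold on sample vectors, and the induced d||.|| is exactly the distance between the endpoints.

```python
import math

def norm(v):
    # Euclidean norm ||v|| = sqrt(sum_i v_i^2)
    return math.sqrt(sum(c * c for c in v))

def d(x, y):
    # the induced metric d_{||.||}(x, y) := ||x - y||
    return norm([a - b for a, b in zip(x, y)])

x, y, lam = [3.0, 4.0], [-1.0, 2.0], -2.5
assert norm([0.0, 0.0]) == 0 and norm(x) > 0                         # positive definite
assert math.isclose(norm([lam * c for c in x]), abs(lam) * norm(x))  # ||λx|| = |λ|·||x||
assert norm([a + b for a, b in zip(x, y)]) <= norm(x) + norm(y)      # triangle inequality
print(d(x, y))  # distance between endpoints: sqrt(20) ≈ 4.47
```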
Banach Space: If the metric space induced by the norm, (X, d||.||), is also a complete one, then the underlying normed space, (X, ||.||), is called a Banach space. All the properties of a metric space apply as well. The norm function is what ties the real or complex vector space to a complete metric space, hence also defining a possible Banach space.
ℓp(ℕ, 𝔽): It is defined as the collection of all the sequences in 𝔽 which satisfy the condition ∑n=1∞ |xn|p < ∞. Here x = {xn} is a sequence in 𝔽. (The collection of all sequences, with the summation condition ignored, already forms an 𝔽-vector space; ℓp is the subset carved out by the condition.) We define a norm over this vector space, ||.||p : ℓp → [0, ∞), given by the formula ||x||p = (∑n=1∞ |xn|p)1/p, where again x is a sequence.
Assumption: ℓp is an 𝔽-vector space, and ||.||p is a norm on it.
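A rough numeric sketch of the summation condition (truncating at a finite N is an approximation; the sequence 1/n is an illustrative choice): xn = 1/n belongs to ℓ2, where the partial sums of |xn|2 stabilise near π2/6, but not to ℓ1, where they keep growing.

```python
def lp_norm(p, N):
    # truncated ℓp norm of the sequence x_n = 1/n: (sum_{n=1}^{N} (1/n)^p)^(1/p)
    return sum((1.0 / n) ** p for n in range(1, N + 1)) ** (1.0 / p)

for N in (100, 10_000, 1_000_000):
    # the p=2 column settles near sqrt(pi^2/6) ≈ 1.2825; the p=1 column diverges
    print(N, round(lp_norm(2, N), 6), round(lp_norm(1, N), 6))
```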
Proof: (ℓp, ||.||p) is a Banach space. Let {x(k)} be a Cauchy sequence in ℓp. Since we are considering sequences of sequences, write x(k)n for the n-th entry of the k-th sequence:
x(1) = (x(1)1, x(1)2, x(1)3, x(1)4, x(1)5, …)
x(2) = (x(2)1, x(2)2, x(2)3, x(2)4, x(2)5, …)
x(3) = (x(3)1, x(3)2, x(3)3, x(3)4, x(3)5, …)
⋮
The goal here is to prove that the sequence on the left-hand side, {x(k)}, has a limit which is also in ℓp.
We start by picking one column, say the 4th column, i.e. the scalar sequence {x(k)4}k of 4th entries. These entries live in 𝔽, which is by itself a Banach space (𝔽 is complete under the absolute value |.|, which is a norm).
The distance between two elements of this column, raised to the p-th power, is |x(k)4 − x(l)4|p.
This is at most the combined such distance over all the columns, hence it follows that:
|x(k)4 − x(l)4|p <= ∑n=1∞ |x(k)n − x(l)n|p = (||x(k) − x(l)||p)p < εp,
and therefore |x(k)4 − x(l)4| < ε.
But since the latter, {x(k)}, is a Cauchy sequence, the former, that is, each individual column, is also a Cauchy sequence. Since these columns are Cauchy sequences AND live in a Banach space, they each have a limit as well. Let's call the limit of the n-th column x'n.
Hence, each column has a limit. We can collect every column's limit into a sequence: x' = (x'1, x'2, x'3, x'4, x'5, …)
Now, we have to show that our original Cauchy sequence {x(k)} approaches its candidate limit x' (which means it is indeed a convergent sequence), i.e. ||x(k) − x'||p < ε for all large enough k, and also that x' itself belongs to ℓp.
We start by keeping the infinite sum at bay: restrict to the first N columns,
∑n=1N |x(k)n − x'n|p.
Each x'n is itself a limit over the second index, l. We pull that out as well:
∑n=1N |x(k)n − x'n|p = liml→∞ ∑n=1N |x(k)n − x(l)n|p.
Now the inner sum should look familiar: it is part of the full ℓp distance between two members of the Cauchy sequence, which past some index can be pushed below any arbitrary upper bound (ε')p that we select:
∑n=1N |x(k)n − x(l)n|p <= (||x(k) − x(l)||p)p < (ε')p.
This bound does not depend on N, so we can finally let N → ∞:
∑n=1∞ |x(k)n − x'n|p <= (ε')p, i.e. ||x(k) − x'||p <= ε'. Setting ε' = ε/2 gives ||x(k) − x'||p < ε.
(The triangle inequality also puts x' inside ℓp: ||x'||p <= ||x' − x(k)||p + ||x(k)||p < ∞.)
Hence we have proved that any Cauchy sequence {x(k)} converges to a limit x' ∈ ℓp within any arbitrary ε. This means that all Cauchy sequences are convergent, hence (ℓp, ||.||p) is a Banach space. (Every statement above about a Cauchy sequence and its upper bound ε is true after some index, k, l >= K, for some K > 0.)
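A hypothetical illustration of the column picture used in this proof (the entries x(k)n = (1 + 1/k)/n are an invented example of ℓ2 members): each fixed column converges in 𝔽, and collecting the column limits yields the candidate limit x' = (1/n)n.

```python
def entry(k, n):
    # the n-th entry of the k-th sequence: x(k)_n = (1 + 1/k) / n, each x(k) ∈ ℓ2
    return (1.0 + 1.0 / k) / n

for n in (1, 4, 10):  # watch a few columns as k grows
    column = [entry(k, n) for k in (1, 10, 100, 1000)]
    print(f"column {n}:", column, "-> limit", 1.0 / n)
```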
Inner Product: It measures the distance, length and angle between two vectors in an 𝔽-vector space. It is denoted by < x, y > and satisfies the following:
- < x, x > >= 0, and < x, x > = 0 if and only if x is the zero vector.
- < x, y > = < y, x > if 𝔽 = ℝ, AND < x, y > = conj(< y, x >) (the complex conjugate) if 𝔽 = ℂ
- < x, y1 + y2 > = < x, y1 > + < x, y2 > and < x, λy > = λ< x, y > (linearity in the second argument; combined with the conjugate symmetry above, this forces < λx, y > = conj(λ)< x, y > when 𝔽 = ℂ; checked numerically below)
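A small sketch checking these axioms for the complex dot product < x, y > := ∑i conj(xi)·yi on sample vectors in ℂ2 (the helper ip and the vectors are made up; note the conjugate sits on the first slot, matching the linearity-in-the-second-argument convention above):

```python
import cmath

def ip(x, y):
    # <x, y> = sum_i conj(x_i) * y_i: conjugate-linear in x, linear in y
    return sum(a.conjugate() * b for a, b in zip(x, y))

x, y1, y2, lam = [1 + 2j, 3j], [2 - 1j, 1 + 1j], [0.5j, 4.0], 2 - 3j
assert ip(x, x).real >= 0 and abs(ip(x, x).imag) < 1e-12            # <x,x> real, >= 0
assert cmath.isclose(ip(x, y1), ip(y1, x).conjugate())              # conjugate symmetry
assert cmath.isclose(ip(x, [a + b for a, b in zip(y1, y2)]),
                     ip(x, y1) + ip(x, y2))                         # additivity in y
assert cmath.isclose(ip(x, [lam * b for b in y1]), lam * ip(x, y1)) # <x, λy> = λ<x, y>
print("inner-product axioms hold on the sample vectors")
```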
Hilbert Space: If the norm induced by the inner product, ||x|| := sqrt(< x, x >), makes (X, ||.||) a Banach space, then (X, <., .>) is considered to be a Hilbert space, that is, a space which has an inner product measuring distance, length and angle while also being a complete metric space.
Examples of Hilbert Spaces. Here are some important examples of Hilbert spaces:
- ℝn and ℂn, where the inner product is < x, y > := ∑i=1n conj(xi) yi (the conjugate only matters in the complex case).
- The space ℓ2(ℕ, 𝔽), where the inner product is < x, y > := ∑i=1∞ conj(xi) yi (again, conjugation matters only in the complex case).
A quick check that ℓ2 is a candidate Hilbert space: verify the inner-product conditions.
- < x, x > := ∑i=1∞ conj(xi) xi = ∑i=1∞ |xi|2, which is >= 0.
- If < x, x > = 0, then every |xi|2 = 0, so x has to be the zero vector.
- conj(< y, x >) = conj(∑i=1∞ conj(yi) xi) = ∑i=1∞ yi conj(xi) = < x, y >, which is the conjugate symmetry.
- < x, λy > = ∑i=1∞ conj(xi) λyi = λ ∑i=1∞ conj(xi) yi = λ< x, y >.
(Here conj(.) denotes the complex conjugate.)
Cauchy-Schwarz Inequality: This inequality states that the absolute value of the inner product of two vectors is always less than or equal to the product of the norms of the two vectors. Given below is the proof of the inequality, for an 𝔽-vector space (X, < ., . >), where:
- ||x|| = sqrt(< x, x >)
- |< x, y >| <= ||x|| . ||y||
Proof
If x = 0, both sides are zero and there is nothing to prove, so assume x ≠ 0.
Let x^ = x / ||x||, which is the normalized form of the vector x.
Let y∥ = < x^, y > x^, which is the y vector projected onto x.
Let y⊥ = y − y∥, which is the component of y perpendicular to x.
According to the norm conditions:
0 <= ||y⊥|| = ||y − y∥|| = ||y − < x^, y > x^||
Squaring everything:
0 <= ||y⊥||2 = ||y − < x^, y > x^||2 = < y − < x^, y > x^, y − < x^, y > x^ >
Expanding:
= < y, y > − < y, < x^, y > x^ > − < < x^, y > x^, y > + < < x^, y > x^, < x^, y > x^ >
The norm of y emerges; the two middle terms are conjugates of each other, so they combine into a real part; and the last term collects |< x^, y >|2 ||x^||2:
= ||y||2 − 2·Re(< < x^, y > x^, y >) + |< x^, y >|2 ||x^||2
Since the inner product is conjugate-linear in its first slot, < < x^, y > x^, y > = conj(< x^, y >) < x^, y > = |< x^, y >|2, and ||x^||2 = 1. So:
0 <= ||y||2 − 2|< x^, y >|2 + |< x^, y >|2 = ||y||2 − |< x^, y >|2
which gives: ||y||2 >= |< x^, y >|2. Substituting x^ = x / ||x||:
||y||2 >= |< x, y >|2 / ||x||2
Removing the squares:
||x|| · ||y|| >= |< x, y >|.
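A quick randomized check of the inequality just proved, reusing the same < x, y > = ∑ conj(xi)·yi convention (the test vectors are random; this is a sketch, not a proof):

```python
import math
import random

def ip(x, y):
    return sum(a.conjugate() * b for a, b in zip(x, y))

def norm(x):
    # ||x|| = sqrt(<x, x>); the inner product of x with itself is real
    return math.sqrt(ip(x, x).real)

random.seed(0)
for _ in range(1000):
    x = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(5)]
    y = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(5)]
    assert abs(ip(x, y)) <= norm(x) * norm(y) + 1e-9   # |<x, y>| <= ||x||·||y||
print("Cauchy-Schwarz held for 1000 random pairs")
```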
Proof of the triangle inequality
||x + y|| <= ||x|| + ||y||
||x + y||2 = < x + y, x + y > = ||x||2 + 2·Re(< x, y >) + ||y||2
||x + y||2 <= ||x||2 + 2|< x, y >| + ||y||2
||x + y||2 <= ||x||2 + 2||x|| · ||y|| + ||y||2 (Cauchy-Schwarz inequality)
||x + y||2 <= (||x|| + ||y||)2
Hence ||x + y|| <= ||x|| + ||y||.
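And the same randomized sketch for the triangle inequality derived above (again with the ∑ conj(xi)·yi inner product and invented test vectors):

```python
import math
import random

def ip(x, y):
    return sum(a.conjugate() * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(ip(x, x).real)

random.seed(1)
for _ in range(1000):
    x = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(4)]
    y = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(4)]
    s = [a + b for a, b in zip(x, y)]
    assert norm(s) <= norm(x) + norm(y) + 1e-9   # ||x + y|| <= ||x|| + ||y||
print("triangle inequality held for 1000 random pairs")
```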