Image and Kernel (subspace)
Definition. The image of a function consists of all the values the function assumes. If f is a function from X to Y, then im(f) = { f(x) : x ∈ X }.
Notice that im(f) is a subset of Y.
Definition. The kernel of a function whose range is ℝⁿ consists of all the values in its domain at which the function assumes the value 0. If f is a function from X to ℝⁿ, then ker(f) = { x ∈ X : f(x) = 0 }.
Notice that ker(f) is a subset of X. Also, if T(x) = Ax is a linear transformation from ℝᵐ to ℝⁿ, then ker(T) (also denoted ker(A)) is the set of solutions to the equation Ax = 0.
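As a sketch of the definition above, a basis for ker(A) can be computed numerically from the SVD: the right singular vectors belonging to (near-)zero singular values span the null space. The matrix and tolerance below are illustrative choices, not from the notes.

```python
import numpy as np

# Example: compute a basis for ker(A) = {x : A x = 0} via the SVD.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, so ker(A) is 2-dimensional

_, s, Vt = np.linalg.svd(A)
tol = 1e-10
rank = int(np.sum(s > tol))
null_basis = Vt[rank:].T           # columns span ker(A)

print(null_basis.shape[1])         # dimension of the kernel: 2
print(np.allclose(A @ null_basis, 0))  # every basis vector solves A x = 0: True
```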
Theorem. Let A be an n × n matrix. Then the following statements are equivalent.
- A is invertible
- The linear system Ax = b has a unique solution x for every b ∈ ℝⁿ
- rref(A) = Iₙ
- rank(A) = n
- im(A) = ℝⁿ
- ker(A) = {0}
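The equivalences above can be checked numerically on a concrete matrix. A minimal sketch (the 3 × 3 matrix is an illustrative choice, not from the notes):

```python
import numpy as np

# For an invertible n x n matrix, the equivalent conditions of the
# theorem all hold at once.
n = 3
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # det(A) = 3, so A is invertible

assert np.linalg.matrix_rank(A) == n        # rank(A) = n
assert abs(np.linalg.det(A)) > 1e-12        # A is invertible

b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)                   # unique solution of A x = b
assert np.allclose(A @ x, b)

# ker(A) = {0}: the only solution of A x = 0 is the zero vector
assert np.allclose(np.linalg.solve(A, np.zeros(n)), 0)
print("all equivalent conditions hold")
```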
Consider a linear mapping T.
An invariant subspace W of T has the property that every vector v ∈ W is transformed by T into a vector also contained in W. This can be stated as T(v) ∈ W for all v ∈ W, i.e. T(W) ⊆ W.
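One way to test the condition T(W) ⊆ W numerically: if the columns of a matrix W form a basis of the subspace, then W is invariant exactly when appending the columns of T·W does not increase the rank. The matrices below are illustrative choices.

```python
import numpy as np

# Check that a subspace W is invariant under T: the span of the columns
# of W must already contain T w for every basis vector w, i.e.
# rank([W | T W]) == rank(W).
T = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])       # W = span(e1, e2)

augmented = np.hstack([W, T @ W])
is_invariant = np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(W)
print(is_invariant)  # True: T maps e1 -> 2 e1 and e2 -> 3 e2, both in W
```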
For V to be a vector space, these two operations must satisfy the eight axioms listed in the table below, where the equations must be satisfied for every u, v, and w in V, and every a and b in F.
Associativity of vector addition
u + (v + w) = (u + v) + w
Commutativity of vector addition
u + v = v + u
Identity element of vector addition
There exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.
Inverse elements of vector addition
For every v ∈ V, there exists an element −v ∈ V, called the additive inverse of v, such that v + (−v) = 0.
Compatibility of scalar multiplication with field multiplication
a(bv) = (ab)v
Identity element of scalar multiplication
1v = v, where 1 denotes the multiplicative identity in F.
Distributivity of scalar multiplication with respect to vector addition
a(u + v) = au + av
Distributivity of scalar multiplication with respect to field addition
(a + b)v = av + bv
Exercise. Let V be the set of positive real numbers, that is, ℝ⁺ = { x ∈ ℝ : x > 0 }, where vector addition is defined as x ⊕ y = xy and scalar multiplication is defined as λ ⊙ x = x^λ. Prove that the above is a vector space over ℝ.
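The exercise requires a proof, but the eight axioms can first be spot-checked numerically on sample values; this sketch (with arbitrarily chosen numbers) is not a substitute for the proof:

```python
import math

# Numerical spot-check of the vector space axioms for the positive reals
# with x (+) y := x*y and lam (.) x := x**lam.
add = lambda x, y: x * y        # vector addition on R+
smul = lambda lam, x: x ** lam  # scalar multiplication on R+

u, v, w = 2.0, 3.0, 5.0
a, b = 1.5, -2.0

assert math.isclose(add(u, add(v, w)), add(add(u, v), w))  # associativity
assert math.isclose(add(u, v), add(v, u))                  # commutativity
assert math.isclose(add(v, 1.0), v)        # the zero vector is the number 1
assert math.isclose(add(v, 1.0 / v), 1.0)  # additive inverse of v is 1/v
assert math.isclose(smul(a, smul(b, v)), smul(a * b, v))   # compatibility
assert math.isclose(smul(1.0, v), v)                       # scalar identity
assert math.isclose(smul(a, add(u, v)), add(smul(a, u), smul(a, v)))
assert math.isclose(smul(a + b, v), add(smul(a, v), smul(b, v)))
print("all eight axioms hold for the sample values")
```

The check mirrors the actual proof: multiplication of positive reals is associative and commutative, 1 plays the role of the zero vector, 1/v is the additive inverse, and the exponent rules (x^a)^b = x^(ab), (xy)^a = x^a y^a, x^(a+b) = x^a x^b give the remaining axioms.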
Lambda matrices and the Jordan form
Smith normal form
where dᵢ(A) (called the i-th determinant divisor) equals the greatest common divisor of all i × i minors of the matrix A.
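The determinant divisors can be computed directly from their definition. A small sketch for integer matrices (the helper names `det` and `determinant_divisor` and the example matrix are mine, not from the notes):

```python
from itertools import combinations
from math import gcd

def det(M):
    """Determinant of a small integer matrix via cofactor expansion."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def determinant_divisor(A, i):
    """d_i(A): gcd of all i x i minors of the integer matrix A."""
    rows, cols = len(A), len(A[0])
    minors = [det([[A[r][c] for c in cs] for r in rs])
              for rs in combinations(range(rows), i)
              for cs in combinations(range(cols), i)]
    g = 0
    for m in minors:
        g = gcd(g, m)   # gcd ignores signs; gcd(0, m) = |m|
    return g

A = [[2, 4], [6, 8]]
print(determinant_divisor(A, 1))  # gcd of the entries: 2
print(determinant_divisor(A, 2))  # |det A| = |16 - 24| = 8
```

The diagonal entries of the Smith normal form are recovered from these divisors as the invariant factors sᵢ = dᵢ / dᵢ₋₁ (with d₀ = 1).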
Jordan normal form
In general, a square complex matrix A is similar to a block diagonal matrix J = diag(J₁, …, J_p), where each Jordan block Jᵢ is an upper triangular matrix with a single eigenvalue λᵢ on its diagonal and ones on the superdiagonal.
Conjugate symmetry or Hermitian symmetry: ⟨x, y⟩ = ⟨y, x⟩*, where * denotes complex conjugation.
Positive definiteness: ⟨x, x⟩ > 0 for every x ≠ 0.
Norm in unitary spaces: ‖x‖ = √⟨x, x⟩.
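These axioms can be illustrated with the standard complex inner product ⟨x, y⟩ = Σ conj(xₖ) yₖ; the sample vectors below are arbitrary choices:

```python
import numpy as np

# Numerical illustration of the inner product axioms for the standard
# complex inner product.
def inner(x, y):
    return np.vdot(x, y)  # np.vdot conjugates its first argument

x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 1j])

# Conjugate (Hermitian) symmetry: <x, y> = conj(<y, x>)
assert np.isclose(inner(x, y), np.conj(inner(y, x)))
# Positive definiteness: <x, x> is real and > 0 for x != 0
assert inner(x, x).real > 0 and np.isclose(inner(x, x).imag, 0)
# Induced norm on a unitary space: ||x|| = sqrt(<x, x>)
assert np.isclose(np.linalg.norm(x), np.sqrt(inner(x, x).real))
print("inner product axioms verified on sample vectors")
```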
Working in geometric spaces
Any real square matrix A may be decomposed as A = QR,
where Q is an orthogonal matrix (its columns are orthogonal unit vectors, so QᵀQ = I) and R is an upper triangular matrix (also called a right triangular matrix). If A is invertible, then the factorization is unique if we require the diagonal elements of R to be positive. For a complex matrix A, the same decomposition holds with Q a unitary matrix.
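A short sketch of the QR factorization and its uniqueness convention using NumPy (the example matrix is an arbitrary invertible choice):

```python
import numpy as np

# QR factorization; the sign fix below enforces the uniqueness
# convention from the text (positive diagonal of R) for invertible A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

Q, R = np.linalg.qr(A)
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R   # flip signs so diag(R) > 0

assert np.allclose(Q @ R, A)             # A = Q R
assert np.allclose(Q.T @ Q, np.eye(2))   # Q is orthogonal
assert np.all(np.diag(R) > 0)            # uniqueness convention
print(np.allclose(np.triu(R), R))        # R is upper triangular: True
```

Flipping the sign of column j of Q and row j of R leaves the product QR unchanged, which is why the convention diag(R) > 0 can always be enforced when A is invertible.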
The Schur decomposition reads as follows: if A is an n × n square matrix with complex entries, then A can be expressed as A = QUQ⁻¹ = QUQ*,
where Q is a unitary matrix, and U is an upper triangular matrix, which is called a Schur form of A. Since U is similar to A, it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of U.
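The Schur form can be computed with SciPy (assuming `scipy` is available; the rotation matrix below is an illustrative choice whose eigenvalues are ±i):

```python
import numpy as np
from scipy.linalg import schur  # assumes SciPy is installed

# Complex Schur decomposition A = Q U Q*, with Q unitary and U upper
# triangular; the eigenvalues of A appear on the diagonal of U.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])     # 90-degree rotation; eigenvalues +-i

U, Q = schur(A, output='complex')
assert np.allclose(Q @ U @ Q.conj().T, A)  # A = Q U Q*
assert np.allclose(np.triu(U), U)          # U is upper triangular

eig = np.diag(U)
assert np.allclose(sorted(eig.imag), [-1.0, 1.0])
assert np.allclose(eig.real, 0.0)
print("Schur form recovered the eigenvalues +-i on the diagonal of U")
```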
- The eigenvalues of A are real numbers.
Necessary and sufficient condition:
From A we can obtain the eigenvectors (the columns of V) and the eigenvalues.
Applications: linear mappings, decoupling.
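The decoupling application can be sketched as follows: if A = V diag(λ) V⁻¹, the substitution y = V⁻¹x turns a coupled system x′ = Ax into the decoupled scalar equations yₖ′ = λₖyₖ. The example matrix below is an arbitrary diagonalizable choice:

```python
import numpy as np

# Eigendecomposition A = V diag(lam) V^{-1}; in the eigenbasis the
# coupled system x' = A x decouples into y_k' = lam_k y_k.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)
assert np.allclose(V @ np.diag(lam) @ np.linalg.inv(V), A)
print(sorted(np.round(lam, 6).tolist()))  # eigenvalues of A: [2.0, 5.0]
```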