19 Matrix eigenvalue problem
TABLE OF CONTENTS
1. Introduction
2. Complex vector spaces
2.1 The inner product
3. The characteristic equation
4. Symmetric, antisymmetric and orthogonal matrices
4.1 Orthogonal transformations
4.2 Properties of orthogonal matrices
4.3 Eigenbasis
5. Similar matrices
5.1 Diagonalization of a matrix
6. Unitary, hermitian and antihermitian matrices
6.1 Eigenvalues of unitary, hermitian and antihermitian matrices
6.2 Eigenvectors of a hermitian matrix
6.3 Invariance of inner product
LEARNING OBJECTIVES
1. The idea of the eigenvalue of a matrix is introduced and its importance discussed.
2. A complex vector space is introduced as a prelude to eigenvalue problem.
3. The characteristic equation of a matrix is obtained and its relation to the eigenvalues and eigenvectors is established.
4. Symmetric, antisymmetric and orthogonal matrices are defined. Orthogonal transformations are introduced and their properties discussed.
5. Similar matrices are defined and their importance in the diagonalization of a matrix is discussed.
6. Unitary, hermitian and antihermitian matrices are defined and properties of their eigenvalues proved.
Matrix eigenvalue problem
1. Introduction
In this unit we will be dealing exclusively with square matrices. Let A be a square matrix of order n and x a column vector of order n. Then the “operation” of A on x will yield some other column vector y:
y = Ax
We can look upon the matrix as a transformation in the n-dimensional vector space of column vectors. The operation of A on x “transforms” it into, in general, a different vector y in that vector space. However, there exist certain special nonzero vectors which have the property that the matrix A operating on them yields simply a multiple of the same vector. If x is one such special vector, then
Ax = λx    (1)
for some scalar λ.
The problem is to be viewed as follows: given a matrix A, we have to find an unknown column vector x and an unknown scalar λ so that equation (1) is satisfied. One solution is x = 0; this is the trivial solution and is of no interest to us. The problem of systematically finding such special nonzero vectors for a given square matrix, and the corresponding scalars λ, is referred to as the eigenvalue problem, and this is the problem that we will address in this unit.
The nonzero vectors xᵢ that satisfy equation (1) are called the eigenvectors or characteristic vectors of A, and the corresponding scalars λᵢ are called the eigenvalues or characteristic values of A. The set of all eigenvalues of a matrix A is called its spectrum. The largest of the absolute values of the eigenvalues of A is called the spectral radius of A.
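As a concrete illustration (the matrix and numbers below are assumed for illustration, not taken from this unit), the following sketch verifies the eigenvalue relation Ax = λx numerically:

```python
import numpy as np

# Illustrative 2x2 symmetric matrix (assumed example)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# x = (1, 1) is a special vector: A x is simply a multiple of x
x = np.array([1.0, 1.0])
print(A @ x)   # [3. 3.]  -> A x = 3 x, so the eigenvalue is 3
print(3 * x)   # [3. 3.]

# A generic vector is carried into a different direction
y = np.array([1.0, 0.0])
print(A @ y)   # [2. 1.]  -> not a multiple of y
```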
This rather innocent looking matrix equation leads to a theory with applications in such diverse fields as engineering, physics, geometry, mathematics, biology, environmental science, urban planning, economics, psychology, and many others.
It will soon become clear that even if the matrix A is real, the eigenvalues as well as the eigenvectors are in general complex. So perforce we have to deal with complex numbers and complex vector spaces. Hence, before we take up the problem of finding eigenvalues and eigenvectors, we will introduce complex vector spaces. They have the same properties as real vector spaces, except for the differences that crop up because the scalars are complex numbers. We will also introduce certain special matrices which are of particular interest in the study of the eigenvalue problem.
2. Complex vector spaces
A complex vector space is a vector space in which the scalars are complex numbers. It has all the properties of real vector spaces dealt with in the last unit, including linear independence, dimensionality and basis. Thus, for example, the space ℂⁿ, consisting of n-tuples of complex numbers, is n-dimensional. In ℂ³, the vectors
(i, 0, 0), (0, i, i), (0, 0, i)
are linearly independent and therefore form a basis. As another example, consider the set of complex-valued functions of the form f = f₁ + i f₂, where f₁ and f₂ are real functions of a real variable. The scalars consist of the set of all complex numbers. Then it is easy to verify that these functions form a complex vector space.
2.1 The inner product
The inner product of two vectors u = (u₁, …, uₙ) and v = (v₁, …, vₙ) in a complex vector space is defined as ⟨u, v⟩ = ū₁v₁ + ū₂v₂ + … + ūₙvₙ, where the bar denotes complex conjugation. In terms of the components of the vectors, the norm and the distance are respectively
‖u‖ = ⟨u, u⟩^(1/2) = (|u₁|² + |u₂|² + … + |uₙ|²)^(1/2) and d(u, v) = ‖u − v‖.
If two vectors u and v are such that their inner product is zero, the vectors are said to be orthogonal. If further both the vectors have been “normalized”, that is, have unit norm, then they are said to be orthonormal. In an n-dimensional vector space we can always choose n vectors which have unit norm and are mutually orthogonal. That is, we can always choose an orthonormal basis.
Example
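The worked example referred to above is not reproduced in this text. As an illustrative stand-in (values assumed), the vectors u = (1, i)/√2 and v = (1, −i)/√2 in ℂ² are orthonormal under the complex inner product:

```python
import numpy as np

def inner(u, v):
    """Complex inner product <u, v> = sum of conj(u_i) * v_i."""
    return np.vdot(u, v)   # np.vdot conjugates its first argument

u = np.array([1, 1j]) / np.sqrt(2)
v = np.array([1, -1j]) / np.sqrt(2)

print(inner(u, u))   # (1+0j)  -> unit norm
print(inner(v, v))   # (1+0j)  -> unit norm
print(inner(u, v))   # 0j      -> orthogonal
```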
3. The characteristic equation
Our objective now is to find the eigenvalues and the eigenvectors of a given matrix A = (aᵢⱼ). By taking the λx term to the left-hand side in equation (1), it can be rewritten in the form
(A − λI)x = 0,
where I is the unit matrix of order n.
By Cramer’s rule of the last unit, this homogeneous system of equations in the xᵢ has a nontrivial solution if, and only if, the determinant of the coefficient matrix is zero. That is,
D(λ) = det(A − λI) = 0.    (3)
The matrix A − λI is called the characteristic matrix and the determinant D(λ) is called the characteristic determinant of the matrix A. When the determinant D(λ) is written in expanded form, it is a sum of n! terms, each consisting of exactly n factors. Only one of these terms consists of the product of the diagonal factors
(a₁₁ − λ)(a₂₂ − λ) ⋯ (aₙₙ − λ),
and it is this term alone that contributes the highest power λⁿ.
Thus the determinant is a polynomial of degree n in λ. Equation (3) is called the characteristic equation and the polynomial is called the characteristic polynomial of A. Thus we have the important theorem that:
Eigenvalues of a matrix A are the roots of the characteristic equation of A.
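A minimal numerical sketch of this theorem (the 2×2 matrix below is an assumed example, not one from the text): the roots of det(A − λI) coincide with the eigenvalues returned by a library routine.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial det(A - lambda*I);
# for this A it is lambda^2 - 7*lambda + 10
coeffs = np.poly(A)          # [ 1. -7. 10.]
print(np.roots(coeffs))      # roots 5 and 2 (order may vary)

print(np.linalg.eigvals(A))  # same values from the eigenvalue routine
```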
Example-1
Example-2
In this example the matrix is real but the eigenvalues and eigenvectors are complex.
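The details of Example-2 are not reproduced above. A representative case (an assumed matrix) of a real matrix with complex eigenvalues is the 90° rotation matrix:

```python
import numpy as np

# Real matrix with no real eigenvalue: rotation by 90 degrees in the plane
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals, vecs = np.linalg.eig(A)
print(vals)        # eigenvalues +i and -i
print(vecs[:, 0])  # complex eigenvector corresponding to the first eigenvalue
```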
4. Symmetric, antisymmetric and orthogonal matrices
Though we have introduced complex spaces and matrices, for the moment we continue with real matrices and consider three classes of real square matrices that, because of their remarkable properties, occur quite frequently in applications.
A (real) square matrix A is called symmetric if Aᵀ = A, and antisymmetric if Aᵀ = −A.
A (real) square matrix is orthogonal if, and only if, Aᵀ = A⁻¹.
Equivalently, an orthogonal matrix can be defined by the requirement AAᵀ = AᵀA = I, where I is the unit matrix.
Examples
For the eigenvalues of symmetric and skew-symmetric matrices we have the following results which we will prove in the context of complex matrices.
(i) The eigenvalues of a symmetric matrix are real.
(ii) The eigenvalues of a skew-symmetric matrix are pure imaginary or zero.
4.1 Orthogonal transformations
We have already remarked that the matrix A can be viewed as effecting a transformation in the space ℝⁿ which transforms a vector x into a vector y:
y = Ax
If a matrix A is orthogonal, the corresponding transformation is called an orthogonal transformation. It can be shown that every rotation in a plane, in three-dimensional space, or indeed in n-dimensional space, is an orthogonal transformation. For example, a rotation by an angle θ in a plane is given by the orthogonal matrix
R(θ) = [ cos θ   −sin θ
         sin θ    cos θ ]
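A quick numerical check (the angle is arbitrary) that R(θ) is orthogonal and preserves lengths:

```python
import numpy as np

theta = 0.7  # arbitrary angle in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonality: R^T R = I
print(np.allclose(R.T @ R, np.eye(2)))          # True

# The transformation preserves the norm of any vector
x = np.array([3.0, 4.0])
print(np.linalg.norm(x), np.linalg.norm(R @ x))  # 5.0  5.0
```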
4.2 Properties of orthogonal matrices
Orthogonal transformations, and therefore orthogonal matrices, have some very important and useful properties.
1. An orthogonal transformation preserves the inner product in ℝⁿ.
As a corollary it follows that the norm of a vector is also preserved.
Proof
Using the rule (AB)ᵀ = BᵀAᵀ and the orthogonality condition AᵀA = I, we have, for any vectors u and v in ℝⁿ,
(Au)·(Av) = (Au)ᵀ(Av) = uᵀAᵀAv = uᵀv = u·v.
2. A real square matrix is orthogonal if and only if its column vectors c₁, c₂, …, cₙ (and also its row vectors) form an orthonormal system. That is,
cᵢ·cⱼ = 0 if i ≠ j, and cᵢ·cᵢ = 1.
Proof
3. The determinant of an orthogonal matrix has the value +1 or -1.
Proof
Determinants have the properties det(AB) = det(A) det(B) and det(A) = det(Aᵀ). From these two properties the result follows:
1 = det(I) = det(AAᵀ) = det(A) det(Aᵀ) = det(A)², so that det(A) = +1 or −1.
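Both signs actually occur; as a sketch (matrices chosen for illustration), a rotation has determinant +1 while a reflection has determinant −1:

```python
import numpy as np

rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])     # rotation by 90 degrees
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])   # reflection about the x-axis

for Q in (rotation, reflection):
    # Each is orthogonal; the determinants are +1 and -1 respectively
    print(np.allclose(Q.T @ Q, np.eye(2)), np.linalg.det(Q))
```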
4.3 Eigenbasis
If the n eigenvectors x₁, x₂, …, xₙ of an n×n matrix A form a basis of ℝⁿ (or ℂⁿ), we call it an eigenbasis. An arbitrary vector v can then be expanded as v = c₁x₁ + c₂x₂ + … + cₙxₙ, so that Av = c₁λ₁x₁ + c₂λ₂x₂ + … + cₙλₙxₙ. Thus, in terms of the eigenbasis, the effect of the matrix A on an arbitrary vector is simply to multiply the coefficient of each eigenvector by the corresponding eigenvalue. That is why it is important to know whether an eigenbasis exists.
Theorem
The following theorem provides a criterion for the existence of an eigenbasis: if all the n eigenvalues of an n×n matrix A are distinct, then the corresponding eigenvectors form a basis of ℝⁿ (or ℂⁿ).
Proof
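The proof is not reproduced above. A numerical illustration (assumed 3×3 matrix) that distinct eigenvalues give linearly independent eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])   # distinct eigenvalues 2, 3, 5

vals, X = np.linalg.eig(A)       # columns of X are the eigenvectors
print(vals)                      # 2, 3 and 5
print(np.linalg.matrix_rank(X))  # 3 -> the eigenvectors are linearly
                                 #      independent and form a basis of R^3
```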
5. Similar matrices
An n×n matrix B is said to be similar to an n×n matrix A if
B = P⁻¹AP
for some nonsingular n×n matrix P. Similarity is important because similar matrices have the same eigenvalues; moreover, if x is an eigenvector of A, then P⁻¹x is an eigenvector of B corresponding to the same eigenvalue.
Proof
Example
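The example itself is not reproduced above; as an illustrative numerical check (matrices assumed), a similarity transformation leaves the eigenvalues unchanged:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])          # any nonsingular matrix

B = np.linalg.inv(P) @ A @ P        # B is similar to A

print(np.sort(np.linalg.eigvals(A)))  # [2. 5.]
print(np.sort(np.linalg.eigvals(B)))  # [2. 5.]  -> same eigenvalues
```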
5.1 Diagonalization of a matrix
Given an n×n matrix A, if there exists a matrix similar to A which is diagonal in form, we say that A has been diagonalized. It may not always be possible to diagonalize a given matrix. However, the following theorem gives a criterion under which a matrix can be diagonalized:
(i) If the eigenvectors of an n×n matrix A form a basis, then
D = X⁻¹AX
is diagonal, with the eigenvalues of A as the entries on the main diagonal. Here X is the matrix whose columns are these eigenvectors.
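A sketch of the theorem in action (assumed matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # eigenvalues 5 and 2

vals, X = np.linalg.eig(A)        # columns of X: eigenvectors of A
D = np.linalg.inv(X) @ A @ X      # similarity transformation X^{-1} A X

print(np.round(D, 10))            # diagonal matrix with 5 and 2 on the diagonal
print(vals)                       # the same eigenvalues
```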
6. Unitary, hermitian and antihermitian matrices
The hermitian conjugate (or transpose conjugate) of a square matrix A, denoted by A*, is the matrix which is the transpose as well as the complex conjugate of A; that is, its elements are (A*)ᵢⱼ = āⱼᵢ.
A matrix is said to be
hermitian if A* = A,
antihermitian (or skew-hermitian) if A* = −A,
unitary if A* = A⁻¹.
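A small sketch (matrices assumed for illustration) constructing one matrix of each kind and checking the defining property:

```python
import numpy as np

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])           # hermitian: H* = H
S = 1j * H                               # antihermitian: S* = -S
U = np.array([[1.0, 1j],
              [1j, 1.0]]) / np.sqrt(2)   # unitary: U* U = I

def star(M):
    """Hermitian conjugate A* = (conjugate of A) transposed."""
    return M.conj().T

print(np.allclose(star(H), H))               # True
print(np.allclose(star(S), -S))              # True
print(np.allclose(star(U) @ U, np.eye(2)))   # True
```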
6.1 Eigenvalues of unitary, hermitian and antihermitian matrices
We have a remarkable theorem about the eigenvalues of unitary, hermitian and antihermitian matrices:
1. The eigenvalues of a hermitian matrix (and thus of a real symmetric matrix) are real.
2. The eigenvalues of a skew-hermitian matrix (and thus of a real skew-symmetric matrix) are pure imaginary or zero.
3. The eigenvalues of a unitary matrix (and thus of a real orthogonal matrix) have absolute value 1.
Proof
1. Let λ be an eigenvalue and x a corresponding eigenvector of A: Ax = λx. Multiply this equation by x* from the left, so that
x*Ax = λ x*x.    (16)
Now x*x = |x₁|² + |x₂|² + … + |xₙ|² is real and positive, since x, being an eigenvector, cannot be the null vector. Therefore, dividing equation (16) by x*x, we have
λ = x*Ax / (x*x).
If A is hermitian, (x*Ax)* = x*A*x = x*Ax, so the number x*Ax is real, and hence λ is real.
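A numerical spot-check of all three statements (matrices assumed for illustration):

```python
import numpy as np

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])           # hermitian
S = 1j * H                               # skew-hermitian
U = np.array([[1.0, 1j],
              [1j, 1.0]]) / np.sqrt(2)   # unitary

print(np.linalg.eigvals(H))              # real eigenvalues (up to roundoff)
print(np.linalg.eigvals(S))              # pure imaginary eigenvalues
print(np.abs(np.linalg.eigvals(U)))      # [1. 1.] -> absolute value 1
```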
6.2 Eigenvectors of a hermitian matrix
For the eigenvectors of a hermitian matrix we have the important theorem: If A is hermitian, its eigenvectors corresponding to distinct eigenvalues are orthogonal.
Proof
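The proof is not reproduced above; a numerical check of the statement (hermitian matrix assumed):

```python
import numpy as np

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])     # hermitian, with distinct eigenvalues 1 and 4

vals, vecs = np.linalg.eigh(H)    # eigh: eigenvalue routine for hermitian matrices
x1, x2 = vecs[:, 0], vecs[:, 1]

print(vals)                       # [1. 4.]
print(np.vdot(x1, x2))            # ~0 -> eigenvectors of distinct eigenvalues
                                  #       are orthogonal
```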
6.3 Invariance of inner product
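The body of this subsection is not reproduced above. The result it refers to, by analogy with the orthogonal case of Section 4.2, is that a unitary transformation leaves the complex inner product (and hence the norm) unchanged. A minimal numerical sketch (matrix and vectors assumed):

```python
import numpy as np

U = np.array([[1.0, 1j],
              [1j, 1.0]]) / np.sqrt(2)   # unitary matrix

u = np.array([1.0 + 2j, 3.0])
v = np.array([0.5, -1j])

# <Uu, Uv> = <u, v>: the inner product is invariant under U
print(np.vdot(U @ u, U @ v))
print(np.vdot(u, v))
```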
SUMMARY
- We introduce the concept of eigenvalue of a matrix and discuss its importance in various fields of learning.
- Since the eigenvalues of even a real matrix can be complex, we introduce complex vector spaces as a prelude to the study of the eigenvalue problem.
- Next we introduce the characteristic equation of a matrix and obtain its relation to eigenvalues and eigenvectors of the matrix.
- Next we define the very important classes of matrices: symmetric, antisymmetric and orthogonal matrices. Then we introduce orthogonal transformations and describe the properties of orthogonal transformations and matrices.
- We define similar matrices and discuss their importance in diagonalization of a matrix.
- Finally, we define unitary, hermitian and antihermitian matrices and prove various properties of their eigenvalues and eigenvectors.