A square matrix $A = (a_{i,j})$ is said to be in upper Hessenberg form or to be an upper Hessenberg matrix if $a_{i,j} = 0$ for all $i, j$ with $i > j + 1$.
An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if $a_{i+1,i} \neq 0$ for all $i \in \{1, \ldots, n-1\}$.[3]
Lower Hessenberg matrix
A square matrix $A = (a_{i,j})$ is said to be in lower Hessenberg form or to be a lower Hessenberg matrix if its transpose is an upper Hessenberg matrix or, equivalently, if $a_{i,j} = 0$ for all $i, j$ with $j > i + 1$.
A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if $a_{i,i+1} \neq 0$ for all $i \in \{1, \ldots, n-1\}$.
Examples
Consider the following matrices:

$$A = \begin{pmatrix} 1 & 4 & 2 & 3 \\ 3 & 4 & 1 & 7 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 3 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{pmatrix}, \qquad C = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 0 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{pmatrix}.$$

The matrix $A$ is an upper unreduced Hessenberg matrix, $B$ is a lower unreduced Hessenberg matrix and $C$ is a lower Hessenberg matrix but is not unreduced (its superdiagonal entry $c_{2,3}$ is zero).
Computer programming
Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps (for example, through Householder transformations, which are unitary similarity transforms). Subsequent reduction of a Hessenberg matrix to a triangular matrix can then be achieved through iterative procedures such as shifted QR factorization, combined in eigenvalue algorithms with deflation steps. Reducing a general matrix to a Hessenberg matrix and then reducing it further to a triangular matrix, instead of directly reducing a general matrix to a triangular matrix, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
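As an illustration of this workflow, the short sketch below uses SciPy's scipy.linalg.hessenberg routine to compute a unitary similarity reduction of an arbitrary random matrix to upper Hessenberg form and verify its defining properties; the matrix size and random data are choices made here, not taken from the source.

import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))            # arbitrary general matrix

# Reduce A to upper Hessenberg form H with a unitary Q such that A = Q H Q*.
H, Q = hessenberg(A, calc_q=True)

print(np.allclose(np.tril(H, k=-2), 0))    # True: entries below the subdiagonal vanish
print(np.allclose(Q @ H @ Q.conj().T, A))  # True: H is unitarily similar to A

Because the reduction is a similarity transformation, H has the same eigenvalues as A, which is what the shifted QR iteration subsequently exploits.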
Reduction to Hessenberg matrix
Householder transformations
Any $n \times n$ matrix can be transformed into a Hessenberg matrix by a similarity transformation using Householder transformations. The following procedure for such a transformation is adapted from A Second Course In Linear Algebra by Garcia & Horn.[4]
Let $A$ be any real or complex $n \times n$ matrix, let $A'$ be the $(n-1) \times n$ submatrix of $A$ constructed by removing the first row of $A$, and let $\mathbf{a}'_1$ be the first column of $A'$. Construct the $(n-1) \times (n-1)$ Householder matrix $V_1 = I_{n-1} - 2\frac{ww^*}{\|w\|^2}$, where $w = \mathbf{a}'_1 - \|\mathbf{a}'_1\|\, e_1$ (in the complex case, $\|\mathbf{a}'_1\|$ is multiplied by a phase factor chosen so that the first entry of $w$ does not suffer cancellation).
This Householder matrix will map $\mathbf{a}'_1$ to $\|\mathbf{a}'_1\|\, e_1$ and, as such, the block matrix $U_1 = \begin{pmatrix} 1 & \mathbf{0} \\ \mathbf{0} & V_1 \end{pmatrix}$ will map the matrix $A$ to the matrix $U_1 A U_1^*$, which has only zeros below the second entry of the first column. Now construct the $(n-2) \times (n-2)$ Householder matrix $V_2$ in a similar manner, such that $V_2$ maps the first column of $A''$ to $\|\mathbf{a}''_1\|\, e_1$, where $A''$ is the submatrix of $U_1 A U_1^*$ constructed by removing its first two rows and its first column. Then let $U_2 = \begin{pmatrix} I_2 & \mathbf{0} \\ \mathbf{0} & V_2 \end{pmatrix}$, which maps $U_1 A U_1^*$ to the matrix $U_2 U_1 A U_1^* U_2^*$, which has only zeros below the first and second entries of the subdiagonal. Now construct $V_3$ and then $U_3$ in a similar manner, but for the submatrix of $U_2 U_1 A U_1^* U_2^*$ obtained by removing its first three rows and its first two columns, and proceed as in the previous steps. Continue like this for a total of $n-2$ steps.
By construction of $U_k$, the first $k$ columns of any $n \times n$ matrix are invariant under multiplication by $U_k^*$ from the right. Hence, any matrix can be transformed to an upper Hessenberg matrix by a similarity transformation of the form $U_{n-2} \cdots U_2 U_1 A U_1^* U_2^* \cdots U_{n-2}^*$.
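The procedure above translates directly into a short program. The following is a minimal NumPy sketch (not the cited authors' code): the function name, the phase convention for the target scalar, and the explicit formation of the full matrices $U_k$ (rather than the compact updates used in production libraries) are choices made here for clarity.

import numpy as np

def hessenberg_by_householder(A):
    # Reduce A to upper Hessenberg form H = Q A Q* by Householder similarity transforms.
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    H = A.copy()
    Q = np.eye(n, dtype=complex)
    for k in range(n - 2):
        x = H[k + 1:, k].copy()            # column entries that must be zeroed below the subdiagonal
        norm_x = np.linalg.norm(x)
        if norm_x == 0:
            continue                       # nothing to eliminate in this column
        # alpha has |alpha| = ||x||; its phase is chosen to avoid cancellation in w.
        alpha = -np.exp(1j * np.angle(x[0])) * norm_x
        w = x
        w[0] -= alpha
        w = w / np.linalg.norm(w)
        V = np.eye(n - k - 1) - 2.0 * np.outer(w, w.conj())   # Householder reflector V = I - 2ww*
        U = np.eye(n, dtype=complex)
        U[k + 1:, k + 1:] = V
        H = U @ H @ U.conj().T             # unitary similarity: eigenvalues unchanged
        Q = U @ Q
    return H, Q

Up to rounding error the returned pair satisfies $H = Q A Q^*$, so np.allclose(H, Q @ A @ Q.conj().T) should hold for any square input.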
Jacobi (Givens) rotations
A Jacobi rotation (also called Givens rotation) is an orthogonal matrix transformation in the form

$$A \rightarrow A' = J(p, q, \theta)^T \, A \, J(p, q, \theta),$$

where $J(p, q, \theta)$, $p < q$, is the Jacobi rotation matrix with all matrix elements equal to zero except for

$$\begin{aligned}
J(p, q, \theta)_{ii} &= 1 \quad \text{for } i \neq p, q, \\
J(p, q, \theta)_{pp} &= J(p, q, \theta)_{qq} = \cos\theta, \\
J(p, q, \theta)_{pq} &= -J(p, q, \theta)_{qp} = \sin\theta.
\end{aligned}$$

One can zero the matrix element $A'_{p-1,\,q}$ by choosing
the rotation angle $\theta$ to satisfy the equation

$$A_{p-1,\,p} \sin\theta + A_{p-1,\,q} \cos\theta = 0.$$

Now, the sequence of such Jacobi rotations with the following pairs $(p, q)$,

$$(p, q) = (2, 3), (2, 4), \ldots, (2, n), (3, 4), (3, 5), \ldots, (3, n), \ldots, (n-1, n),$$

reduces the matrix $A$ to the lower Hessenberg form.[5]
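A minimal sketch of this rotation sequence for a real matrix is given below; the indices are zero-based, and the function name and the use of explicit full-size rotation matrices (rather than in-place row and column updates) are illustrative choices, not part of the source.

import numpy as np

def lower_hessenberg_by_givens(A):
    # Reduce a real square matrix to lower Hessenberg form by the sequence of
    # Jacobi (Givens) rotations described above: A -> J^T A J.
    H = np.array(A, dtype=float)
    n = H.shape[0]
    for p in range(1, n - 1):          # zero-based p, corresponding to p = 2, ..., n-1 above
        for q in range(p + 1, n):      # zero-based q, corresponding to q = p+1, ..., n above
            a, b = H[p - 1, p], H[p - 1, q]
            if b == 0:
                continue               # entry already zero
            # Choose theta so that H[p-1, p]*sin(theta) + H[p-1, q]*cos(theta) = 0.
            theta = np.arctan2(-b, a)
            c, s = np.cos(theta), np.sin(theta)
            J = np.eye(n)
            J[p, p] = J[q, q] = c
            J[p, q] = s
            J[q, p] = -s
            H = J.T @ H @ J            # orthogonal similarity; zeros the entry H[p-1, q]
    return H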
Properties
For $n \in \{1, 2\}$, it is vacuously true that every $n \times n$ matrix is both upper Hessenberg and lower Hessenberg.[6]
The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if $A$ is upper Hessenberg and $T$ is upper triangular, then $AT$ and $TA$ are upper Hessenberg.
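A quick numerical spot-check of this closure property (an illustration with random matrices, not a proof; the names and sizes below are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
n = 6
H = np.triu(rng.standard_normal((n, n)), k=-1)   # random upper Hessenberg matrix
T = np.triu(rng.standard_normal((n, n)))         # random upper triangular matrix

def is_upper_hessenberg(M):
    return np.allclose(np.tril(M, k=-2), 0)      # nothing below the subdiagonal

print(is_upper_hessenberg(H @ T), is_upper_hessenberg(T @ H))   # True True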
A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which the Jacobi matrix is an important example. This includes the symmetric or Hermitian Hessenberg matrices. A Hermitian matrix can be reduced to a tridiagonal real symmetric matrix.[7]
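For example, reducing a real symmetric matrix to Hessenberg form by an orthogonal similarity preserves symmetry, so the result is tridiagonal; the small SciPy check below (with an arbitrary random symmetric matrix) illustrates this.

import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(1)
S = rng.standard_normal((5, 5))
S = S + S.T                          # real symmetric matrix
T = hessenberg(S)                    # orthogonal similarity keeps T symmetric
# T is both upper and lower Hessenberg, i.e. tridiagonal (up to rounding):
print(np.allclose(np.tril(T, k=-2), 0) and np.allclose(np.triu(T, k=2), 0))   # True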
Biswa Nath Datta (2010). Numerical Linear Algebra and Applications (2nd ed.). Society for Industrial and Applied Mathematics (SIAM). ISBN 978-0-89871-685-6. p. 307.
Ramon Garcia, Stephan; Horn, Roger (2017). A Second Course In Linear Algebra. Cambridge University Press. ISBN 978-1-107-10381-8.
Bini, Dario A.; Robol, Leonardo (2016). "Quasiseparable Hessenberg reduction of real diagonal plus low rank matrices and applications". Linear Algebra and Its Applications. 502: 186–213. arXiv:1501.07812. doi:10.1016/j.laa.2015.08.026.