The reflection hyperplane can be defined by its normal vector, a unit vector $v$ (a vector with length $1$) that is orthogonal to the hyperplane. The reflection of a point $x$ about this hyperplane is the linear transformation:

$$x - 2\langle x, v\rangle v = x - 2v(v^* x),$$

where $v^*$ denotes the conjugate transpose of $v$. The matrix constructed from this transformation, $P = I - 2vv^*$, is known as the Householder matrix.
A Householder matrix has eigenvalues $\pm 1$. To see this, notice that if $u$ is orthogonal to the vector $v$ which was used to create the reflector, then $Pu = u$, i.e., $1$ is an eigenvalue of multiplicity $n - 1$, since there are $n - 1$ independent vectors orthogonal to $v$. Also, notice $Pv = -v$ (since $v$ is by definition a unit vector), and so $-1$ is an eigenvalue with multiplicity $1$.
The determinant of a Householder reflector is $-1$, since the determinant of a matrix is the product of its eigenvalues, in this case one of which is $-1$ with the remainder being $1$ (as in the previous point). Alternatively, this follows from the matrix determinant lemma: $\det(I - 2vv^*) = 1 - 2v^*v = -1$.
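These two properties are easy to check numerically. The following sketch (using NumPy, with an arbitrary random unit vector) builds a reflector and inspects its spectrum and determinant:

```python
import numpy as np

# Build a Householder reflector P = I - 2 v v^T from a random unit vector v
# and check its spectrum and determinant numerically.
rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                  # make v a unit vector
P = np.eye(n) - 2.0 * np.outer(v, v)

eigs = np.sort(np.linalg.eigvalsh(P))   # P is symmetric, so eigvalsh applies
det = np.linalg.det(P)
# eigs contains -1 once and +1 with multiplicity n - 1; det is their product, -1.
```

Since $P$ is its own inverse ($P^2 = I$), the same snippet also confirms that applying the reflection twice returns every vector to where it started.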
Householder transformations are widely used in numerical linear algebra, for example, to annihilate the entries below the main diagonal of a matrix,[2] to perform QR decompositions, and in the first step of the QR algorithm. They are also widely used for transforming a matrix to Hessenberg form. For symmetric or Hermitian matrices, the symmetry can be preserved, resulting in tridiagonalization.[3]
Householder transformations can be used to calculate a QR decomposition. Consider a matrix already triangularized up to column $i$; our goal is then to construct Householder matrices $Q_i$ that act upon the trailing principal submatrices of the partially reduced matrix, via the block matrix

$$Q_i' = \begin{pmatrix} I_{i-1} & 0 \\ 0 & Q_i \end{pmatrix}.$$

(Note that we already established before that Householder transformations are unitary matrices, and since the product of unitary matrices is itself unitary, this gives us the unitary matrix of the QR decomposition.)
If we can find a unit vector $v$ so that

$$Qx = x - 2\langle x, v\rangle v = \alpha e_1,$$

we could accomplish this. Geometrically speaking, we are looking for a hyperplane so that the reflection of $x$ about it lands directly on the basis vector $e_1$. In other words,

$$x - 2\langle x, v\rangle v = \alpha e_1 \qquad (1)$$

for some constant $\alpha$. However, this also shows that

$$2\langle x, v\rangle v = x - \alpha e_1.$$

And since $v$ is a unit vector, this means that

$$v = \frac{x - \alpha e_1}{\lVert x - \alpha e_1 \rVert}. \qquad (2)$$

Now if we apply equation (2) back into equation (1), we obtain

$$x - 2\,\frac{\langle x, x - \alpha e_1\rangle}{\lVert x - \alpha e_1\rVert^2}\,(x - \alpha e_1) = \alpha e_1.$$

Or, in other words, by comparing the scalars in front of the vector $x - \alpha e_1$ we must have

$$2\langle x, x - \alpha e_1\rangle = \lVert x - \alpha e_1\rVert^2.$$

Or, expanding both sides (for real $x$),

$$2\left(\lVert x\rVert^2 - \alpha x_1\right) = \lVert x\rVert^2 - 2\alpha x_1 + \alpha^2,$$

which means that we can solve for $\alpha$ as

$$\alpha = \pm\lVert x\rVert.$$

This completes the construction; however, in practice we want to avoid catastrophic cancellation in equation (2). To do so, we choose the sign of $\alpha$ opposite to that of $x_1$:

$$\alpha = -\operatorname{sgn}(x_1)\,\lVert x\rVert.$$
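The construction above translates almost line-for-line into code. The following is a minimal NumPy sketch of Householder QR; the function name `householder_qr` and the tolerance are our own illustrative choices, not taken from the sources cited here:

```python
import numpy as np

def householder_qr(A):
    """QR decomposition of a real m x n matrix via Householder reflections."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        # alpha = -sgn(x_1) * ||x|| avoids catastrophic cancellation.
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        u = x.copy()
        u[0] -= alpha                 # u = x - alpha * e_1
        norm_u = np.linalg.norm(u)
        if norm_u < 1e-14:            # column already has the desired form
            continue
        v = u / norm_u                # the unit normal from equation (2)
        # Apply H = I - 2 v v^T to the trailing rows of R ...
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        # ... and accumulate Q = H_1 H_2 ... (each H_k is its own inverse).
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.array([[4.0, 1.0], [3.0, 2.0], [0.0, 5.0]])
Q, R = householder_qr(A)
```

Note that the reflectors are applied as rank-one updates rather than by forming each $H_k$ explicitly, which is how production QR routines keep the cost at $O(mn)$ per column.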
This procedure is presented in Numerical Analysis by Burden and Faires. It uses a slightly altered sign function with $\operatorname{sgn}(0) = 1$.[5]

In the first step, to form the Householder matrix $P^1$ we need to determine $\alpha$ and $r$, which are:

$$\alpha = -\operatorname{sgn}(a_{21})\sqrt{\sum_{j=2}^{n} a_{j1}^2}\,, \qquad r = \sqrt{\frac{\alpha^2 - a_{21}\alpha}{2}}.$$

From $\alpha$ and $r$, construct the vector $v^{(1)} = (v_1, v_2, \dots, v_n)^T$:

where $v_1 = 0$, $v_2 = \dfrac{a_{21} - \alpha}{2r}$, and

$$v_k = \frac{a_{k1}}{2r}$$

for each $k = 3, 4, \dots, n$.

Then compute:

$$P^1 = I - 2\,v^{(1)}\left(v^{(1)}\right)^T, \qquad A^{(2)} = P^1 A P^1.$$

Having found $P^1$ and computed $A^{(2)}$, the process is repeated for $k = 2, 3, \dots, n - 2$ as follows:

$$\alpha = -\operatorname{sgn}\!\left(a_{k+1,k}^{(k)}\right)\sqrt{\sum_{j=k+1}^{n}\left(a_{jk}^{(k)}\right)^2}\,, \qquad r = \sqrt{\frac{\alpha^2 - a_{k+1,k}^{(k)}\,\alpha}{2}}\,,$$

$$v_1^{(k)} = \dots = v_k^{(k)} = 0, \qquad v_{k+1}^{(k)} = \frac{a_{k+1,k}^{(k)} - \alpha}{2r}, \qquad v_j^{(k)} = \frac{a_{jk}^{(k)}}{2r} \ \ \text{for } j = k + 2, \dots, n,$$

$$P^k = I - 2\,v^{(k)}\left(v^{(k)}\right)^T, \qquad A^{(k+1)} = P^k A^{(k)} P^k.$$

Continuing in this manner, the tridiagonal and symmetric matrix is formed.
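As a sketch, the per-step formulas above can be implemented with the normalized vector $v = (x - \alpha e_1)/\lVert x - \alpha e_1\rVert$ in place of the explicit $r$ bookkeeping; the two forms are algebraically equivalent. The function name, tolerance, and sample matrix below are our own illustrative choices:

```python
import numpy as np

def householder_tridiagonalize(A):
    """Reduce a real symmetric matrix to tridiagonal form by similarity transforms."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for k in range(n - 2):
        x = A[k + 1:, k]
        # alpha = -sgn(a_{k+1,k}) * sqrt(sum of squares below the subdiagonal entry)
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        u = x.copy()
        u[0] -= alpha
        norm_u = np.linalg.norm(u)
        if norm_u < 1e-14:                      # column already tridiagonal
            continue
        v = np.zeros(n)
        v[k + 1:] = u / norm_u
        P = np.eye(n) - 2.0 * np.outer(v, v)    # P^k = I - 2 v v^T
        A = P @ A @ P                           # A^(k+1) = P^k A^(k) P^k
    return A

B = np.array([[ 4.0, 1.0, -2.0,  2.0],
              [ 1.0, 2.0,  0.0,  1.0],
              [-2.0, 0.0,  3.0, -2.0],
              [ 2.0, 1.0, -2.0, -1.0]])
T = householder_tridiagonalize(B)
```

Because each step is a similarity transform by an orthogonal matrix, the result shares the eigenvalues of the input, which is the point of the reduction.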
Examples
In this example, also from Burden and Faires,[5] a given symmetric matrix $A$ is transformed to the similar tridiagonal matrix $A^{(3)}$ by using the Householder method.

Following the steps of the Householder method above, the first Householder matrix $P^1$ is used to form $A^{(2)} = P^1 A P^1$, and a second reflector $P^2$ then yields $A^{(3)} = P^2 A^{(2)} P^2$. The final result is a tridiagonal symmetric matrix which is similar to the original one, and the process is finished after two steps.
As unitary matrices are useful in quantum computation, and Householder transformations are unitary, they are very useful in quantum computing. One of the central algorithms where they are useful is Grover's algorithm, where we are trying to solve for a representation of an oracle function, represented by what turns out to be a Householder transformation:

$$U_\omega = I - 2|\omega\rangle\langle\omega|$$

(here the $|\omega\rangle$ is part of the bra–ket notation and is analogous to the $v$ which we were using previously).

This is done via an algorithm that iterates via the oracle function $U_\omega$ and another operator $U_s$, known as the Grover diffusion operator, defined by

$$|s\rangle = \frac{1}{\sqrt{N}}\sum_{x=0}^{N-1}|x\rangle \quad \text{and} \quad U_s = 2|s\rangle\langle s| - I.$$
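Both Grover operators are reflections, so a few lines of NumPy suffice to simulate the iteration on a small state space; the size $N$ and the marked index below are arbitrary choices for illustration:

```python
import numpy as np

# Simulate Grover's algorithm on N = 8 basis states with marked state |5>.
N, omega = 8, 5

# Oracle U_omega = I - 2|omega><omega| : a Householder reflection.
e_omega = np.zeros(N)
e_omega[omega] = 1.0
U_omega = np.eye(N) - 2.0 * np.outer(e_omega, e_omega)

# Uniform superposition |s> and the diffusion operator U_s = 2|s><s| - I,
# itself a reflection about |s> (the negative of a Householder matrix).
s = np.full(N, 1.0 / np.sqrt(N))
U_s = 2.0 * np.outer(s, s) - np.eye(N)

# Each Grover iteration applies both reflections; roughly (pi/4) sqrt(N)
# iterations concentrate the amplitude on the marked state.
psi = s.copy()
for _ in range(2):
    psi = U_s @ (U_omega @ psi)
```

After two iterations the squared amplitude at the marked index dominates, illustrating how two composed reflections rotate the state toward $|\omega\rangle$.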
Computational and theoretical relationship to other unitary transformations
The Householder transformation is a reflection about a hyperplane with unit normal vector $v$, as stated earlier. An $n$-by-$n$ unitary transformation $U$ satisfies $UU^* = I$. Taking the determinant ($n$-th power of the geometric mean) and trace (proportional to the arithmetic mean) of a unitary matrix reveals that its eigenvalues $\lambda_j$ have unit modulus. This can be seen directly and swiftly:

$$\frac{\operatorname{trace}(UU^*)}{n} = \frac{\sum_{j=1}^{n}|\lambda_j|^2}{n} = 1, \qquad \det(UU^*) = \prod_{j=1}^{n}|\lambda_j|^2 = 1.$$

Since the arithmetic and geometric means agree only if all the variables are equal (see the inequality of arithmetic and geometric means), every $|\lambda_j|$ equals $1$.
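A quick numerical sanity check of the unit-modulus claim, using a random unitary drawn from the QR factorization of a complex Gaussian matrix (one arbitrary way to generate one):

```python
import numpy as np

# Eigenvalues of a unitary matrix all lie on the unit circle.
rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(Z)          # the Q factor of an invertible matrix is unitary
lams = np.linalg.eigvals(U)
# |lambda_j| = 1 for every eigenvalue, up to floating-point error.
```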
For the case of real-valued unitary matrices we obtain orthogonal matrices, $UU^T = I$. It follows rather readily (see orthogonal matrix) that any orthogonal matrix can be decomposed into a product of 2-by-2 rotations, called Givens rotations, and Householder reflections. This is appealing intuitively, since multiplication of a vector by an orthogonal matrix preserves the length of that vector, and rotations and reflections exhaust the set of (real-valued) geometric operations that leave a vector's length invariant.
The Householder transformation was shown to have a one-to-one relationship with the canonical coset decomposition of unitary matrices defined in group theory, which can be used to parametrize unitary operators in a very efficient manner.[6]
Finally, we note that a single Householder transform, unlike a solitary Givens transform, can act on all columns of a matrix, and as such exhibits the lowest computational cost for QR decomposition and tridiagonalization. The penalty for this "computational optimality" is, of course, that Householder operations cannot be as deeply or as efficiently parallelized. As such, Householder is preferred for dense matrices on sequential machines, whilst Givens is preferred for sparse matrices and/or parallel machines.
Press, WH; Teukolsky, SA; Vetterling, WT; Flannery, BP (2007). "Section 11.3.2. Householder Method". Numerical Recipes: The Art of Scientific Computing (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-88068-8. Archived from the original on 2011-08-11. Retrieved 2011-08-13.