In linear algebra, a circulant matrix is a square matrix in which all rows are composed of the same elements and each row is rotated one element to the right relative to the preceding row. It is a particular kind of Toeplitz matrix.
An $n \times n$ circulant matrix $C$ takes the form
$$C = \begin{pmatrix} c_0 & c_{n-1} & \cdots & c_2 & c_1 \\ c_1 & c_0 & c_{n-1} & & c_2 \\ \vdots & c_1 & c_0 & \ddots & \vdots \\ c_{n-2} & & \ddots & \ddots & c_{n-1} \\ c_{n-1} & c_{n-2} & \cdots & c_1 & c_0 \end{pmatrix}$$
or the transpose of this form (by choice of notation). If each $c_i$ is itself a $p \times p$ square matrix, then the $np \times np$ matrix $C$ is called a block-circulant matrix.
A circulant matrix is fully specified by one vector, $c = (c_0, c_1, \ldots, c_{n-1})^{\rm T}$, which appears as the first column (or row) of $C$. The remaining columns (and rows, resp.) of $C$ are each cyclic permutations of the vector $c$ with offset equal to the column (or row, resp.) index, if lines are indexed from $0$ to $n-1$. (Cyclic permutation of rows has the same effect as cyclic permutation of columns.) The last row of $C$ is the vector $c$ shifted by one in reverse.
Different sources define the circulant matrix in different ways, for example as above, or with the vector $c$ corresponding to the first row rather than the first column of the matrix; and possibly with a different direction of shift (which is sometimes called an anticirculant matrix).
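The first-column convention above can be sketched in a few lines of NumPy; the helper `circulant` and the numeric values are illustrative assumptions, not part of any particular library's API.

```python
import numpy as np

def circulant(c):
    """Build the n x n circulant matrix whose first column is c.

    Column k is the vector c cyclically shifted down by k places,
    matching the first-column convention described above.
    """
    c = np.asarray(c)
    return np.column_stack([np.roll(c, k) for k in range(len(c))])

C = circulant([1, 2, 3, 4])
# C[i, k] == c[(i - k) % n], so the first row reads (c0, c3, c2, c1).
```

With the row convention instead, one would transpose the result; the two choices give transposes of each other, as noted above.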
The polynomial
$$f(x) = c_0 + c_1 x + c_2 x^2 + \cdots + c_{n-1} x^{n-1}$$
is called the associated polynomial of the matrix $C$.
The normalized eigenvectors of a circulant matrix are the Fourier modes, namely
$$v_j = \frac{1}{\sqrt{n}}\left(1, \omega_j, \omega_j^2, \ldots, \omega_j^{n-1}\right)^{\rm T}, \qquad j = 0, 1, \ldots, n-1,$$
where $\omega_j = \exp\!\left(\tfrac{2\pi i j}{n}\right)$ are the $n$-th roots of unity, and the corresponding eigenvalues are
$$\lambda_j = c_0 + c_{n-1}\omega_j + c_{n-2}\omega_j^2 + \cdots + c_1\omega_j^{n-1}.$$
(This can be understood by realizing that multiplication with a circulant matrix implements a convolution. In Fourier space, convolutions become multiplication. Hence the product of a circulant matrix with a Fourier mode yields a multiple of that Fourier mode, i.e. it is an eigenvector.)
As a consequence of the explicit formula for the eigenvalues above, the determinant of a circulant matrix can be computed as:
$$\det C = \prod_{j=0}^{n-1}\left(c_0 + c_{n-1}\omega_j + c_{n-2}\omega_j^2 + \cdots + c_1\omega_j^{n-1}\right), \qquad \omega_j = e^{2\pi i j/n}.$$
Since taking the transpose does not change the eigenvalues of a matrix, an equivalent formulation is
$$\det C = \prod_{j=0}^{n-1}\left(c_0 + c_1\omega_j + c_2\omega_j^2 + \cdots + c_{n-1}\omega_j^{n-1}\right) = \prod_{j=0}^{n-1} f(\omega_j).$$
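The determinant formula can be checked numerically; this is a minimal sketch in which the entries of the first column are arbitrary example values.

```python
import numpy as np

c = np.array([2.0, 5.0, 1.0, 3.0])               # first column of C (example values)
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])

w = np.exp(2j * np.pi * np.arange(n) / n)        # the n-th roots of unity w_j
# det C = prod_j f(w_j), where f is the associated polynomial of C
det_poly = np.prod([np.sum(c * wj ** np.arange(n)) for wj in w])

# Agrees with the dense determinant (up to floating-point round-off).
assert np.isclose(det_poly, np.linalg.det(C))
```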
Rank
The rank of a circulant matrix $C$ is equal to $n - d$, where $d$ is the degree of the polynomial $\gcd\!\left(f(x), x^n - 1\right)$.[2]
Other properties
Any circulant is a matrix polynomial (namely, the associated polynomial) in the cyclic permutation matrix $P$:
$$C = c_0 I + c_1 P + c_2 P^2 + \cdots + c_{n-1} P^{n-1} = f(P),$$
where $P$ is given by the companion matrix
$$P = \begin{pmatrix} 0 & 0 & \cdots & 0 & 1 \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}.$$
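The identity $C = f(P)$ can be verified directly; the coefficient values below are arbitrary example data.

```python
import numpy as np

c = np.array([5.0, 1.0, 2.0, 3.0])               # coefficients of f (example values)
n = len(c)

# Cyclic permutation matrix P: cyclically shifts the standard basis vectors.
P = np.roll(np.eye(n), 1, axis=0)

# C = f(P) = c0*I + c1*P + ... + c_{n-1}*P^{n-1}
C_poly = sum(c[k] * np.linalg.matrix_power(P, k) for k in range(n))

# Same matrix as the circulant with first column c.
C = np.column_stack([np.roll(c, k) for k in range(n)])
assert np.allclose(C_poly, C)
```

Note that $P^n = I$, which is why powers of $P$ beyond $n-1$ are never needed.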
There are important connections between circulant matrices and the DFT matrices. In fact, it can be shown that
$$C = F_n^{-1}\,\operatorname{diag}(F_n c)\,F_n,$$
where $c$ is the first column of $C$ and $F_n$ is the $n \times n$ DFT matrix with entries $(F_n)_{jk} = e^{-2\pi i jk/n}$. The eigenvalues of $C$ are given by the product $F_n c$. This product can be readily calculated by a fast Fourier transform.[3]
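The diagonalization above says the eigenvalues of $C$ are the DFT of its first column, and the Fourier modes are the eigenvectors. A minimal NumPy check, with arbitrary example values for $c$:

```python
import numpy as np

c = np.array([2.0, -1.0, 0.0, 4.0])        # first column of C (example values)
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])

# C = F^{-1} diag(F c) F: the eigenvalues of C are the DFT of its first column.
lam = np.fft.fft(c)                        # (F_n c)_j, computed by an FFT

# Each Fourier mode v_j = (1, w^j, w^{2j}, ...), with w = exp(2*pi*i/n),
# is an eigenvector of C with eigenvalue lam[j].
for j in range(n):
    v = np.exp(2 * np.pi * 1j * j * np.arange(n) / n)
    assert np.allclose(C @ v, lam[j] * v)
```

As sanity checks, the eigenvalues sum to $\operatorname{tr} C = n c_0$ and their product is $\det C$.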
Circulant matrices can be interpreted geometrically, which explains the connection with the discrete Fourier transform.
Consider vectors in $\mathbb{R}^n$ as functions on the integers with period $n$ (i.e., as periodic bi-infinite sequences $\ldots, a_{-1}, a_0, a_1, \ldots$ with $a_{i+n} = a_i$) or, equivalently, as functions on the cyclic group of order $n$ (denoted $\mathbb{Z}/n\mathbb{Z}$ or $\mathbb{Z}_n$), geometrically on (the vertices of) the regular $n$-gon: this is a discrete analog to periodic functions on the real line or circle.
The circular convolution of two such vectors $x$ and $y$ is then given by
$$(x * y)_k = \sum_{i=0}^{n-1} x_i\, y_{k-i}$$
(recall that the sequences are periodic), which is the product of the vector $x$ by the circulant matrix for $y$.
The discrete Fourier transform then converts convolution into multiplication, which in the matrix setting corresponds to diagonalization.
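The three equivalent views — periodic-sequence convolution, circulant matrix multiplication, and pointwise multiplication in Fourier space — can be compared directly; the vectors below are arbitrary example data.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])               # example periodic sequences
y = np.array([0.5, 0.0, -1.0, 2.0])
n = len(x)

# (x * y)_k = sum_i x_i y_{k-i}, with indices taken mod n (periodicity)
conv = np.array([sum(x[i] * y[(k - i) % n] for i in range(n)) for k in range(n)])

# ...which is multiplication of x by the circulant matrix built from y...
Cy = np.column_stack([np.roll(y, k) for k in range(n)])
assert np.allclose(Cy @ x, conv)

# ...and the DFT converts it into component-wise multiplication.
assert np.allclose(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)).real, conv)
```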
The $C^*$-algebra of all circulant matrices with complex entries is isomorphic to the group $C^*$-algebra of $\mathbb{Z}/n\mathbb{Z}$.
Symmetric circulant matrices
For a symmetric circulant matrix $C$ one has the extra condition that $c_{n-i} = c_i$.
Thus it is determined by $\lfloor n/2 \rfloor + 1$ elements.
The eigenvalues of any real symmetric matrix are real.
The corresponding eigenvalues become:
$$\lambda_j = c_0 + 2c_1 \Re(\omega_j) + 2c_2 \Re\!\left(\omega_j^2\right) + \cdots + 2c_{n/2-1} \Re\!\left(\omega_j^{n/2-1}\right) + c_{n/2} \Re\!\left(\omega_j^{n/2}\right)$$
for $n$ even, and
$$\lambda_j = c_0 + 2c_1 \Re(\omega_j) + 2c_2 \Re\!\left(\omega_j^2\right) + \cdots + 2c_{(n-1)/2} \Re\!\left(\omega_j^{(n-1)/2}\right)$$
for $n$ odd, where $\Re(z)$ denotes the real part of $z$.
This can be further simplified by using the fact that $\Re\!\left(\omega_j^k\right) = \cos(2\pi jk/n)$ and that $\omega_j^{n/2} = (-1)^j$ for $n$ even, depending on $j$ even or odd.
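A quick numerical illustration of the symmetric case, using arbitrary example values for the $\lfloor n/2 \rfloor + 1$ free entries:

```python
import numpy as np

# Symmetric circulant: c_{n-i} = c_i. Here n = 5, so only
# floor(n/2) + 1 = 3 values (c0, c1, c2) are free (example values).
c0, c1, c2 = 2.0, -1.0, 0.5
c = np.array([c0, c1, c2, c2, c1])
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])
assert np.allclose(C, C.T)               # the condition makes C symmetric

# Eigenvalues via the DFT of the first column: all real, as expected.
lam = np.fft.fft(c)
assert np.allclose(lam.imag, 0.0)
```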
The complex version of the circulant matrix, ubiquitous in communications theory, is usually Hermitian. In this case $c_{n-i} = c_i^*$ for $i \le n/2$, and its determinant and all eigenvalues are real.
If $n$ is even the first two rows necessarily take the form (illustrated here for $n = 6$)
$$\begin{pmatrix} r_0 & z_1 & z_2 & r_3 & z_2^* & z_1^* \\ z_1^* & r_0 & z_1 & z_2 & r_3 & z_2^* \end{pmatrix}$$
in which the first element $r_3$ in the top second half-row is real.
If $n$ is odd we get (here for $n = 5$)
$$\begin{pmatrix} r_0 & z_1 & z_2 & z_2^* & z_1^* \\ z_1^* & r_0 & z_1 & z_2 & z_2^* \end{pmatrix}.$$
Tee[5] has discussed constraints on the eigenvalues for the Hermitian condition.
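The Hermitian condition and its consequence (real eigenvalues and determinant) can be checked numerically; the complex entries below are arbitrary example values satisfying $c_{n-i} = c_i^*$.

```python
import numpy as np

# Hermitian circulant: first column with c_{n-i} = conj(c_i) (example values;
# n = 4 is even, so c_2 must be real).
c = np.array([1.0, 2.0 + 1.0j, 3.0, 2.0 - 1.0j])
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])
assert np.allclose(C, C.conj().T)              # C is Hermitian

lam = np.fft.fft(c)                            # eigenvalues via the DFT
assert np.allclose(lam.imag, 0.0)              # all eigenvalues are real
assert np.isclose(np.linalg.det(C).imag, 0.0)  # so the determinant is real too
```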
Applications
In linear equations
Given a matrix equation
$$C\mathbf{x} = \mathbf{b},$$
where $C$ is a circulant matrix of size $n$, we can write the equation as the circular convolution
$$\mathbf{c} * \mathbf{x} = \mathbf{b},$$
where $\mathbf{c}$ is the first column of $C$, and the vectors $\mathbf{c}$, $\mathbf{x}$ and $\mathbf{b}$ are cyclically extended in each direction. Using the circular convolution theorem, we can use the discrete Fourier transform to transform the cyclic convolution into component-wise multiplication
$$\mathcal{F}_n(\mathbf{c} * \mathbf{x}) = \mathcal{F}_n(\mathbf{c})\,\mathcal{F}_n(\mathbf{x}) = \mathcal{F}_n(\mathbf{b})$$
so that
$$\mathbf{x} = \mathcal{F}_n^{-1}\!\left[\left(\frac{(\mathcal{F}_n \mathbf{b})_\nu}{(\mathcal{F}_n \mathbf{c})_\nu}\right)_{\!\nu}\right].$$
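The FFT-based solve can be sketched in a few lines; the vectors $\mathbf{c}$ and $\mathbf{b}$ are arbitrary example values, chosen so that no DFT coefficient of $\mathbf{c}$ vanishes (otherwise the division is undefined and $C$ is singular).

```python
import numpy as np

c = np.array([4.0, 1.0, 0.0, 1.0])   # first column of C (example; DFT has no zeros)
b = np.array([1.0, 2.0, 3.0, 4.0])   # right-hand side (example values)
n = len(c)

# x = F^{-1}[ (F b) / (F c) ], valid when every DFT coefficient of c is nonzero
x = np.fft.ifft(np.fft.fft(b) / np.fft.fft(c)).real

# Check against the dense system C x = b.
C = np.column_stack([np.roll(c, k) for k in range(n)])
assert np.allclose(C @ x, b)
```

This costs $O(n \log n)$ via the FFT, versus $O(n^3)$ for a generic dense solve.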