for every permutation σ of the symbols {1, 2, ..., r}. Alternatively, a symmetric tensor of order r represented in coordinates as a quantity with r indices satisfies

$T_{i_1 i_2 \cdots i_r} = T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(r)}}.$
The space of all symmetric tensors of order k defined on V is often denoted by S^k(V) or Sym^k(V). It is itself a vector space, and if V has dimension N then the dimension of Sym^k(V) is the binomial coefficient

$\dim \operatorname{Sym}^k(V) = \binom{N + k - 1}{k}.$

We then construct Sym(V) as the direct sum of Sym^k(V) for k = 0, 1, 2, ...:

$\operatorname{Sym}(V) = \bigoplus_{k=0}^{\infty} \operatorname{Sym}^k(V).$
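As a small illustration, the following Python sketch (the helper name sym_dim is ours, purely for illustration) tabulates these dimensions directly from the binomial formula above:

```python
from math import comb

def sym_dim(N: int, k: int) -> int:
    """Dimension of Sym^k(V) when dim V = N: the number of
    monomials of degree k in N variables."""
    return comb(N + k - 1, k)

# For N = 3 (say V = R^3), k = 0..4 gives 1, 3, 6, 10, 15.
print([sym_dim(3, k) for k in range(5)])
```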
Given a Riemannian manifold $(M, g)$ equipped with its Levi-Civita connection $\nabla$, the covariant curvature tensor is a symmetric order 2 tensor over the vector space $\Omega^2(M) = \Lambda^2(T^*M)$ of differential 2-forms. This corresponds to the fact that, viewing $R_{ijkl} \in (\Lambda^2(T^*M)) \otimes (\Lambda^2(T^*M))$, we have the symmetry between the first and second pairs of arguments in addition to antisymmetry within each pair: $R_{ijkl} = R_{klij}$ and $R_{ijkl} = -R_{jikl} = -R_{ijlk}$.[1]
Symmetric part of a tensor
Suppose $V$ is a vector space over a field of characteristic 0. If $T \in V^{\otimes k}$ is a tensor of order $k$, then the symmetric part of $T$ is the symmetric tensor defined by

$\operatorname{Sym}\, T = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T^{\sigma},$

the summation extending over the symmetric group on $k$ symbols. In terms of a basis, writing $T$ with components $T_{i_1 i_2 \cdots i_k}$, the components of the symmetric part are often denoted by

$T_{(i_1 i_2 \cdots i_k)} = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}},$
with parentheses () around the indices being symmetrized. Square brackets [] are used to indicate anti-symmetrization.
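As a concrete sketch (assuming NumPy; the function name sym is our own, not a library routine), the symmetrization operator amounts to averaging a tensor over all permutations of its axes:

```python
import itertools
import numpy as np

def sym(T: np.ndarray) -> np.ndarray:
    """Symmetric part of an order-k tensor: the average of T over
    all k! permutations of its indices."""
    perms = list(itertools.permutations(range(T.ndim)))
    return sum(np.transpose(T, p) for p in perms) / len(perms)

T = np.random.rand(3, 3, 3)
S = sym(T)
# S is unchanged by any permutation of its indices:
assert np.allclose(S, np.transpose(S, (1, 0, 2)))
assert np.allclose(S, np.transpose(S, (2, 1, 0)))
```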
Symmetric product
If T is a simple tensor, given as a pure tensor product

$T = v_1 \otimes v_2 \otimes \cdots \otimes v_r,$

then the symmetric part of T is the symmetric product of the factors:

$v_1 \odot v_2 \odot \cdots \odot v_r = \frac{1}{r!} \sum_{\sigma \in \mathfrak{S}_r} v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(r)}.$
In general we can turn Sym(V) into an algebra by defining the commutative and associative product ⊙.[2] Given two tensors $T_1 \in \operatorname{Sym}^{k_1}(V)$ and $T_2 \in \operatorname{Sym}^{k_2}(V)$, we use the symmetrization operator to define:

$T_1 \odot T_2 = \operatorname{Sym}(T_1 \otimes T_2) \in \operatorname{Sym}^{k_1 + k_2}(V).$
It can be verified (as is done by Kostrikin and Manin[2]) that the resulting product is in fact commutative and associative. In some cases the ⊙ symbol is omitted: $T_1 T_2 = T_1 \odot T_2$.
In some cases an exponential notation is used:

$v^{\odot k} = \underbrace{v \odot v \odot \cdots \odot v}_{k\ \text{times}} = \underbrace{v \otimes v \otimes \cdots \otimes v}_{k\ \text{times}} = v^{\otimes k},$

where $v$ is a vector. Again, in some cases the ⊙ is left out:

$v^k = \underbrace{v\, v \cdots v}_{k\ \text{times}} = v^{\odot k}.$
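Continuing the NumPy sketch above (sym_product is again our own illustrative name), the symmetric product can be realized as symmetrization of the tensor product:

```python
def sym_product(T1: np.ndarray, T2: np.ndarray) -> np.ndarray:
    """T1 (.) T2 = Sym(T1 (x) T2), reusing sym() from the earlier sketch."""
    return sym(np.tensordot(T1, T2, axes=0))  # axes=0: outer (tensor) product

u, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
# u (.) v = (u (x) v + v (x) u) / 2, and the product is commutative:
assert np.allclose(sym_product(u, v), (np.outer(u, v) + np.outer(v, u)) / 2)
assert np.allclose(sym_product(u, v), sym_product(v, u))
```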
Decomposition
In analogy with the theory of symmetric matrices, a (real) symmetric tensor of order 2 can be "diagonalized". More precisely, for any tensor $T \in \operatorname{Sym}^2(V)$, there exist an integer r, non-zero unit vectors $v_1, \ldots, v_r \in V$ and weights $\lambda_1, \ldots, \lambda_r$ such that

$T = \sum_{i=1}^{r} \lambda_i \, v_i \otimes v_i.$
The minimum number r for which such a decomposition is possible is the (symmetric) rank of T. The vectors appearing in this minimal expression are the principal axes of the tensor, and generally have an important physical meaning. For example, the principal axes of the inertia tensor define Poinsot's ellipsoid representing the moment of inertia. Also see Sylvester's law of inertia.
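For order 2 this decomposition is exactly the spectral decomposition of the matrix representing the tensor; a short NumPy check (an illustrative sketch, not a canonical recipe):

```python
import numpy as np

T = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
lam, V = np.linalg.eigh(T)   # eigenvalues and orthonormal eigenvectors
# T = sum_i lambda_i * v_i (x) v_i, with unit vectors v_i:
recon = sum(l * np.outer(v, v) for l, v in zip(lam, V.T))
assert np.allclose(T, recon)
print(int(np.sum(~np.isclose(lam, 0.0))))  # symmetric rank; here 3
```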
For symmetric tensors of arbitrary order k, decompositions

$T = \sum_{i=1}^{r} \lambda_i \, v_i^{\otimes k}$

are also possible.
The minimum number r for which such a decomposition is possible is the symmetric rank of T.[3] This minimal decomposition is called a Waring decomposition; it is a symmetric form of the tensor rank decomposition. For second-order tensors this corresponds to the rank of the matrix representing the tensor in any basis, and it is well known that the maximum rank is equal to the dimension of the underlying vector space. However, for higher orders this need not hold: the rank can be higher than the number of dimensions in the underlying vector space. Moreover, the rank and symmetric rank of a symmetric tensor may differ.[4]
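For a small worked example (a standard polarization identity, stated here in the notation of this article), the symmetric product of two vectors admits the order-2 Waring decomposition

$x \odot y = \tfrac{1}{4}(x + y)^{\otimes 2} - \tfrac{1}{4}(x - y)^{\otimes 2},$

which expresses it as a weighted sum of powers $v^{\otimes 2} = v \otimes v$ of single vectors, showing that its symmetric rank is at most 2.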
Greub, Werner Hildbert (1967), Multilinear algebra, Die Grundlehren der Mathematischen Wissenschaften, Band 136, Springer-Verlag New York, Inc., New York, MR0224623.