On this page, I systematically explore the representation of infinitesimal
rotations in 3-dimensional Euclidean space using the language of differential
forms. We establish the relation to matrix multiplication rules, as well as the
direct relation to the cross product. In preparation for the analysis of
rotations in 4-dimensional Minkowski spacetime, we determine the symmetries of
the exterior product in mixed form, which, in this case, are found to be
trivial.
A free-form matrix representation is intuitive when using a bivector basis,
since the elements can be organized and re-ordered at will. In three
dimensions, rotations are possible in three planes and can be expressed as
a linear combination of the three basis bivectors:
\[R^{♯♯} = a \; ∂_y ∧ ∂_z + b \; ∂_z ∧ ∂_x + c \; ∂_x ∧ ∂_y\]
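For instance, a rotation generator confined to the \(x\)–\(y\) plane corresponds
to \(a = b = 0\), leaving a single term:
\[R^{♯♯} = c \; ∂_x ∧ ∂_y\]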
We can rewrite as a single column:
\[\begin{split}R^{♯♯} = \begin{bmatrix}
a \; ∂_y ∧ ∂_z \\
b \; ∂_z ∧ ∂_x \\
c \; ∂_x ∧ ∂_y \\
\end{bmatrix}\end{split}\]
We could also represent the rotation with a row/column notation:
\[\begin{split}R^{♯♯} = \left[ \begin{alignedat}{2}
& c \; ∂_x ∧ ∂_y & \\
& & a \; ∂_y ∧ ∂_z \\
b \; ∂_z ∧ ∂_x & & \\
\end{alignedat} \right]\end{split}\]
However, there is a more natural representation. The exterior product is
anti-symmetric, \(∂_i ∧ ∂_j = - ∂_j ∧ ∂_i\), and strictly equivalent to
\(∂_i ∧ ∂_j = \frac{1}{2} (∂_i ∧ ∂_j - ∂_j ∧ ∂_i)\), which allows us to
rewrite:
\[\begin{split}R^{♯♯} = \begin{bmatrix}
a \; ∂_y ∧ ∂_z \\
b \; ∂_z ∧ ∂_x \\
c \; ∂_x ∧ ∂_y \\
\end{bmatrix} =
\frac{1}{2} \begin{bmatrix}
a \; ∂_y ∧ ∂_z - a \; ∂_z ∧ ∂_y \\
b \; ∂_z ∧ ∂_x - b \; ∂_x ∧ ∂_z \\
c \; ∂_x ∧ ∂_y - c \; ∂_y ∧ ∂_x \\
\end{bmatrix}\end{split}\]
With the row/column notation, we obtain a fully anti-symmetric, doubly
contravariant representation:
\[\begin{split}R^{♯♯} = \frac{1}{2} \begin{bmatrix}
 & c \; ∂_x ∧ ∂_y & - b \; ∂_x ∧ ∂_z \\
- c \; ∂_y ∧ ∂_x & & a \; ∂_y ∧ ∂_z \\
b \; ∂_z ∧ ∂_x & - a \; ∂_z ∧ ∂_y & \\
\end{bmatrix}\end{split}\]
The doubly contravariant rotation obtained this way operates exclusively on
covectors. Falling back to matrix multiplication rules requires a mixed tensor
that takes a vector as input and outputs a vector. Specifically, we need to
flatten one of the two components in order to obtain the \(♯♭\) tensor
representation, which corresponds exactly to the matrix representation commonly
encountered in linear algebra.
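As a minimal sketch in index notation, and assuming Cartesian coordinates where
the Euclidean metric components reduce to \(δ_{ij}\), flattening the second
component, for instance, leaves the numerical components unchanged:
\[{R^{i}}_{j} = R^{ik} \, δ_{kj} = R^{ij}\]
This is why, in Euclidean space, the \(♯♯\) and \(♯♭\) representations share
the same array of numbers.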
Whether taken as a transpose or not, we identify the \(\mathfrak{so}(3)\)
matrices and get a first hint that we are about to identify the
electromagnetic tensor. Choosing the implicit basis \(\mathbf{e}_i \wedge
\mathbf{e}_j\) in a row-major representation, we obtain:
\[\begin{split}\begin{bmatrix}
0 & c & -b \\
-c & 0 & a \\
b & -a & 0 \\
\end{bmatrix}\end{split}\]
Rotations in three dimensions have a dual description. We can either express a
rotation in the three planes, or along the three directions of space. Indeed,
through the use of the Hodge star \(⋆\), we fall back to the description of
rotations expressed as a cross product \(⨯\).
Apply the Hodge star:
\[⋆R = ⋆(a \; ∂_y ∧ ∂_z + b \; ∂_z ∧ ∂_x + c \; ∂_x ∧ ∂_y)\]
Distribute the Hodge star:
\[⋆R = a ⋆(∂_y ∧ ∂_z) + b ⋆(∂_z ∧ ∂_x) + c ⋆(∂_x ∧ ∂_y)\]
Evaluate the Hodge duals of the basis bivectors:
\[⋆R = a \; ∂_x + b \; ∂_y + c \; ∂_z\]
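To make the connection with the cross product explicit, recall that in the
standard right-handed orthonormal basis each basis vector is the cross product
of the other two:
\[∂_x = ∂_y ⨯ ∂_z, \qquad ∂_y = ∂_z ⨯ ∂_x, \qquad ∂_z = ∂_x ⨯ ∂_y\]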
That is, the Hodge star of the rotation expressed as a linear combination of
bivectors is exactly a rotation written in terms of cross products in the Hodge
dual space:
\[⋆R = a \; ∂_y ⨯ ∂_z + b \; ∂_z ⨯ ∂_x + c \; ∂_x ⨯ ∂_y\]
We could have written a covector in the same explicit manner. This notation is
very convenient when performing calculations in Cartan’s framework, as it
allows us to identify and organize terms for practical calculations by falling
back to regular matrix multiplication.
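For instance, writing \(dx\), \(dy\), \(dz\) for the dual basis covectors, the
doubly covariant counterpart would read:
\[R^{♭♭} = a \; dy ∧ dz + b \; dz ∧ dx + c \; dx ∧ dy\]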