Section 4.2 Rotations in two and three dimensions
Objectives
You should be able to:
- Calculate the generators of two- and three-dimensional rotations.
- Recover finite rotations by exponentiating the generators.
The very cool thing about Lie groups is that almost all of their structure is encoded in their linearization. This is the key insight of Lie. In calculus, you can approximate an analytic function \(f(x)\) by looking at its linearization (or tangent line) at a point, say \(x=0\text{.}\) In fact, if the function is analytic, Taylor tells you that you can completely reconstruct the function locally if you know all its derivatives at \(x=0\text{.}\) But you need to know all derivatives. A similar statement holds in more than one dimension; you can approximate a manifold at a point by looking at its tangent space. But this is only a linearization of the space.
For Lie groups the situation is different. It turns out that you only need to know the linearization of a group near the origin to recover the whole group locally! This is because of the group structure. If you know the first derivative of the group elements (thought of as matrices) at the origin, you can recover the group elements through exponentiation. So we can study the structure of Lie groups by looking at their linearizations at the origin, which are called Lie algebras. In other words, the Lie algebra of a Lie group can be thought of as the linearization, or tangent space, of the group manifold at the origin.
To introduce the concept of Lie algebras, we will first study rotations in two and three dimensions, and then define the formal idea that applies to all Lie groups.
Subsection 4.2.1 Rotations in two dimensions
Let us start by looking at the Lie group \(SO(2)\) of rotations in two dimensions. Those are given by \(2 \times 2\) special orthogonal matrices. We know that we can write any such rotation as a matrix
\begin{equation*}
R(\theta) = \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}
\end{equation*}
in terms of an angle of rotation \(\theta \in [0,2 \pi )\text{.}\) The deep idea of Lie is that we can recover all rotations by doing a rotation by a very small angle many many times. Doesn't sound very deep, but it is. :-) More precisely, we think of infinitesimal rotations. If we take the angle \(\theta\) to be small, we can approximate \(\cos \theta \cong 1 + \mathcal{O}(\theta^2)\) and \(\sin \theta \cong \theta + \mathcal{O}(\theta^3)\text{.}\) Keeping only terms of first order in \(\theta\text{,}\) we can thus approximate the rotation matrix as
\begin{equation*}
R(\theta) \cong I + \theta \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & \theta \\ -\theta & 1 \end{pmatrix}.
\end{equation*}
Conversely, we can do a similar linear approximation for special orthogonal matrices. If \(A\) is a \(2 \times 2\) special orthogonal matrix which is very close to the identity matrix, we can write \(A \cong I + M\) for some “infinitesimal” \(M\text{.}\) The condition that \(A^T A = I\) implies that
\begin{equation*}
(I + M)^T (I + M) = I + M + M^T + M^T M \cong I + M + M^T = I,
\end{equation*}
where we kept only terms of first order in \(M\text{.}\) Thus \(M = - M^T\text{,}\) that is, \(M\) is an antisymmetric matrix. But in two dimensions any real antisymmetric matrix is a multiple of \(X=\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}\text{,}\) thus we can write
\begin{equation*}
M = \theta X = \begin{pmatrix} 0 & \theta \\ -\theta & 0 \end{pmatrix}
\end{equation*}
for some real parameter \(\theta\) as before. So we have recovered the same first order expansion as for rotations. We call \(X\) the generator of the Lie group (more precisely, it is a matrix representation of the generator of the Lie group).
Checkpoint 4.2.1.
Starting from orthogonal matrices, we recovered the first order expansion of rotations. But this should only work for special orthogonal matrices. It looks like we never imposed the condition that \(\det A = 1\text{.}\) Why?
The key insight of Lie is that we can now recover finite rotations from the knowledge of the generator \(X\) alone. Let \(R(\theta)\) be a rotation by a finite angle \(\theta\text{.}\) Pick a large integer \(N\text{.}\) We can recover \(R(\theta)\) by doing \(R(\theta/N)\) \(N\) times. In the limit as \(N \to \infty\text{,}\) \(R(\theta/N)\) becomes an infinitesimal rotation. Thus we get:
\begin{equation*}
R(\theta) = \lim_{N \to \infty} \left( R(\theta/N) \right)^N = \lim_{N \to \infty} \left( I + \frac{\theta}{N} X \right)^N.
\end{equation*}
If we naively use the relation \(e^x = \lim_{N \to \infty} (1+ \frac{x}{N})^N\) for matrices, we would conclude that
\begin{equation*}
R(\theta) = e^{\theta X}.
\end{equation*}
In other words, we can recover finite rotations by exponentiating the infinitesimal generator! That is cool. This is due to the group structure, which says that we can recover rotations by successively repeating rotations of smaller angles.
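As a quick numerical sanity check, here is a minimal Python sketch (the snippet and its variable names are ours, and it assumes NumPy is available) comparing \(\left( I + \frac{\theta}{N} X \right)^N\) for a large \(N\) with the closed-form rotation matrix in the convention used above.

import numpy as np

# Generator of two-dimensional rotations
X = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

theta = 0.7      # an arbitrary finite angle
N = 10**6        # number of small steps

# Compose N first-order rotations by the small angle theta/N
approx = np.linalg.matrix_power(np.eye(2) + (theta / N) * X, N)

# Closed-form rotation matrix R(theta) = exp(theta X)
exact = np.array([[np.cos(theta), np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])

print(np.allclose(approx, exact, atol=1e-5))   # True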
Let us be a little more precise and prove this statement explicitly, which will also make the role of the group structure clearer.
Lemma 4.2.2. The generator of two-dimensional rotations.
Let
\begin{equation*}
X = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}.
\end{equation*}
We define the exponential of a matrix \(A\) through the series:
\begin{equation*}
e^A = \sum_{n=0}^\infty \frac{A^n}{n!}.
\end{equation*}
Then any two-dimensional rotation \(R(\theta)\) can be written in terms of the generator \(X\) via exponentiation:
\begin{equation*}
R(\theta) = e^{\theta X}.
\end{equation*}
Proof.
Let us start with a two-dimensional rotation \(R(\theta)\text{.}\) Since rotations form a Lie group, the matrix entries of \(R(\theta)\) depend analytically on \(\theta\text{,}\) so we can Taylor expand near the identity:
\begin{equation}
R(\theta) = \sum_{n=0}^\infty \frac{\theta^n}{n!} \left. \frac{d^n R(\theta)}{d \theta^n} \right|_{\theta=0}. \tag{4.2.1}
\end{equation}
(By derivative of a matrix here we mean the derivative of its entries.) Let us now identify the derivatives of the rotation matrix. Rotations form a Lie group. The group structure is given by composition of rotations, which can be written as the requirement that
\begin{equation*}
R(\theta_1 + \theta_2) = R(\theta_1) R(\theta_2).
\end{equation*}
Taking the derivative on both sides with respect to \(\theta_1\text{,}\) and then setting \(\theta_1 = 0\text{,}\) we get
\begin{equation}
\left. \frac{d}{d \theta_1} R(\theta_1 + \theta_2) \right|_{\theta_1 = 0} = \left. \frac{d}{d \theta_1} \Big( R(\theta_1) R(\theta_2) \Big) \right|_{\theta_1 = 0}. \tag{4.2.2}
\end{equation}
We can calculate the left-hand-side of (4.2.2) via the chain rule:
\begin{equation*}
\left. \frac{d}{d \theta_1} R(\theta_1 + \theta_2) \right|_{\theta_1 = 0} = \left. \frac{d R(\theta)}{d \theta} \right|_{\theta = \theta_2}.
\end{equation*}
For the right-hand-side of (4.2.2), we define the matrix
\begin{equation*}
X := \left. \frac{d R(\theta)}{d \theta} \right|_{\theta = 0}, \qquad \text{so that} \qquad \left. \frac{d}{d \theta_1} \Big( R(\theta_1) R(\theta_2) \Big) \right|_{\theta_1 = 0} = X \, R(\theta_2).
\end{equation*}
Thus (4.2.2) becomes (renaming \(\theta_2\) as \(\theta\))
\begin{equation}
\frac{d R(\theta)}{d \theta} = X \, R(\theta). \tag{4.2.3}
\end{equation}
In particular,
\begin{equation*}
\left. \frac{d R(\theta)}{d \theta} \right|_{\theta = 0} = X \, R(0) = X.
\end{equation*}
In fact, taking repeated derivatives of (4.2.3), we get:
\begin{equation*}
\frac{d^n R(\theta)}{d \theta^n} = X \, \frac{d^{n-1} R(\theta)}{d \theta^{n-1}}, \qquad n \geq 1.
\end{equation*}
Evaluating at \(\theta=0\text{,}\) we get, by induction on \(n\text{,}\)
\begin{equation*}
\left. \frac{d^n R(\theta)}{d \theta^n} \right|_{\theta=0} = X^n.
\end{equation*}
Thus all derivatives of the rotation matrix at the origin are determined by the generator \(X\text{!}\) As is clear from the proof, this follows because of the group structure of rotations.
Plugging this back into (4.2.1), with \(X^0 := I\text{,}\) we get
\begin{equation*}
R(\theta) = \sum_{n=0}^\infty \frac{\theta^n}{n!} X^n = e^{\theta X},
\end{equation*}
which concludes the proof.
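For concreteness, the series can be resummed explicitly for this generator. Since \(X^2 = -I\text{,}\) the even powers of \(X\) reduce to \(\pm I\) and the odd powers to \(\pm X\text{,}\) and we recover the rotation matrix we started with:
\begin{equation*}
e^{\theta X} = I \sum_{k=0}^\infty \frac{(-1)^k \theta^{2k}}{(2k)!} + X \sum_{k=0}^\infty \frac{(-1)^k \theta^{2k+1}}{(2k+1)!} = I \cos \theta + X \sin \theta = \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}.
\end{equation*}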
The cool thing here is that all two-dimensional rotations can be recovered by exponentiating the infinitesimal generator \(X\text{.}\) So instead of studying the Lie group \(SO(2)\) directly, we could instead study the properties of the generator \(X\text{.}\) This is our first example of a Lie algebra. We say that \(X\) is an element of the Lie algebra \(\mathfrak{so}(2)\text{:}\) we use the weird font to distinguish the Lie algebra from the Lie group. The algebra is rather trivial here, however, since it has only one generator. Our next example will be less trivial.
Subsection 4.2.2 Rotations in three dimensions
Let us now consider the Lie group \(SO(3)\) consisting of three-dimensional rotations. We think of those as \(3 \times 3\) special orthogonal matrices \(A\text{.}\) We do an infinitesimal expansion \(A \cong I + M\) for an infinitesimal \(M\text{.}\) Then the orthogonality condition becomes
\begin{equation*}
(I + M)^T (I + M) \cong I + M + M^T = I.
\end{equation*}
Therefore, as for two-dimensional rotations, we conclude that \(M\) is a real antisymmetric \(3 \times 3\) matrix. In fact, it is customary in physics to introduce a factor of \(i\) in our linearization, and define instead \(A \cong I + i L\text{.}\) Then what we have shown is that \((i L)^T = - i L\text{.}\) Since \((i L)^T\) is a real matrix, it is equal to its complex conjugate, which is \(-i L^\dagger\text{.}\) Thus the condition that \((i L)^T = - i L\) can be rewritten as \(L^\dagger = L\text{,}\) that is, \(L\) is a purely imaginary Hermitian \(3 \times 3\) matrix.
Any \(3 \times 3\) purely imaginary Hermitian matrix can be written as a linear combination of three matrices:
\begin{equation*}
L_1 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & -i \\ 0 & i & 0 \end{pmatrix}, \qquad L_2 = \begin{pmatrix} 0 & 0 & i \\ 0 & 0 & 0 \\ -i & 0 & 0 \end{pmatrix}, \qquad L_3 = \begin{pmatrix} 0 & -i & 0 \\ i & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\end{equation*}
Those are the infinitesimal generators of the Lie group \(SO(3)\text{.}\) We can write an arbitrary infinitesimal rotation as \(i \theta_1 L_1 + i \theta_2 L_2 + i \theta_3 L_3\) for three real parameters \(\theta_1, \theta_2, \theta_3\) (we thus see that \(SO(3)\) is a three-dimensional Lie group). Following the same argument as for rotations in two dimensions, we conclude that an arbitrary finite rotation in three dimensions (i.e. an element of the Lie group \(SO(3)\)) can be written by exponentiation:
\begin{equation*}
A = \exp\left( i \theta_1 L_1 + i \theta_2 L_2 + i \theta_3 L_3 \right).
\end{equation*}
The algebra generated by the generators \(L_1, L_2, L_3\) is the Lie algebra \(\mathfrak{so}(3)\text{.}\) Now that we have more than one generator, this is more interesting. How do we define the abstract properties of this algebra? (Instead of writing down an explicit representation in terms of three-dimensional matrices.) For this, we need one more element: the notion of a binary operation on the algebra, which, in matrix form, will be given by the commutator. Let us see how this comes about.
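As a concrete check, here is a short Python sketch (ours, assuming NumPy and SciPy are available) that builds the generators in the convention written above, verifies that they are Hermitian with purely imaginary entries, and confirms that exponentiating \(i \theta L_3\) produces a rotation in the \((1,2)\)-plane.

import numpy as np
from scipy.linalg import expm

# Generators of SO(3) in the convention used above: (L_k)_{ij} = -i * epsilon_{kij}
L1 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]])
L2 = np.array([[0, 0, 1j], [0, 0, 0], [-1j, 0, 0]])
L3 = np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]])

# Each generator is Hermitian with purely imaginary entries
for L in (L1, L2, L3):
    assert np.allclose(L, L.conj().T)   # Hermitian
    assert np.allclose(L.real, 0)       # purely imaginary entries

# Exponentiating i*theta*L3 gives a rotation in the (1,2)-plane
theta = 0.3
R = expm(1j * theta * L3)
expected = np.array([[np.cos(theta), np.sin(theta), 0],
                     [-np.sin(theta), np.cos(theta), 0],
                     [0, 0, 1]])
print(np.allclose(R, expected))   # True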
Subsection 4.2.3 Commutation and the commutator
What we have seen so far is that for rotations in two and three dimensions, we can reconstruct the group elements by exponentiating the infinitesimal generators. This is in fact a general statement for all Lie groups, as we will see in the next section. In the context of three-dimensional rotations, we found an explicit representation for the generators in terms of \(3 \times 3\) matrices. But just as when we defined abstract groups, we would like to obtain an abstract definition of the algebra of generators of a Lie group. For this, there is one element missing. In general, rotations do not commute. How can we see that from the point of view of the infinitesimal generators?
Let \(R \cong I + M\) and \(R' \cong I + M'\) be infinitesimal rotations. Then
\begin{equation*}
R R' - R' R \cong (I + M)(I + M') - (I + M')(I + M) = M M' - M' M,
\end{equation*}
where we neglected terms of higher order. If we define the commutator \([M,M']\) as
\begin{equation*}
[M, M'] := M M' - M' M,
\end{equation*}
then the non-commutativity of \(R\) and \(R'\) is encapsulated in whether or not the commutator of their infinitesimal generators vanishes.
Thus, to encode the group structure of a Lie group in terms of the abstract notion of the algebra of its infinitesimal generators, we need to specify the commutation relations between the generators. This will give an abstract definition of a Lie algebra, from which a Lie group can be obtained by exponentiation.
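To see this concretely, the following small Python sketch (ours; it reuses two of the \(3 \times 3\) generators in the convention above) checks that the difference between the two orderings of two infinitesimal rotations is exactly the commutator of their infinitesimal parts.

import numpy as np

# Two generators of three-dimensional rotations (convention as above)
L1 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]])
L2 = np.array([[0, 0, 1j], [0, 0, 0], [-1j, 0, 0]])

eps = 1e-4
M = 1j * eps * L1       # infinitesimal rotation about the first axis
Mp = 1j * eps * L2      # infinitesimal rotation about the second axis
R = np.eye(3) + M
Rp = np.eye(3) + Mp

# R R' - R' R equals the commutator [M, M'] of the infinitesimal generators
lhs = R @ Rp - Rp @ R
rhs = M @ Mp - Mp @ M
print(np.allclose(lhs, rhs))   # True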
For three-dimensional rotations, looking at the representation of the generators \(L_1, L_2, L_3\) in terms of \(3 \times 3\) matrices that we found, it is easy to compute that
\begin{equation*}
[L_1, L_2] = i L_3, \qquad [L_2, L_3] = i L_1, \qquad [L_3, L_1] = i L_2.
\end{equation*}
Checkpoint 4.2.3.
Check that the generators of three-dimensional rotations \(L_1,L_2,L_3\) satisfy these commutation relations.
This can be encoded neatly using the Levi-Civita symbol \(\epsilon_{ijk}\text{,}\) which is defined as:
\begin{equation*}
\epsilon_{ijk} = \begin{cases} +1 & \text{if } (i,j,k) \text{ is an even permutation of } (1,2,3), \\ -1 & \text{if } (i,j,k) \text{ is an odd permutation of } (1,2,3), \\ 0 & \text{otherwise (i.e. if any index is repeated).} \end{cases}
\end{equation*}
Then:
\begin{equation}
[L_i, L_j] = i \sum_{k=1}^3 \epsilon_{ijk} L_k. \tag{4.2.5}
\end{equation}
As we can see, the commutator closes, since the right-hand-side is a linear combination of the generators. Thus it provides a bilinear operation \(L \times L \to L\) on the vector space \(L\) spanned by \(L_1, L_2, L_3\text{.}\) This is generally true, as we will see. For any Lie algebra, we are given a bilinear operation, which we write as a commutator, such that
\begin{equation*}
[L_i, L_j] = \sum_k c_{ijk} L_k.
\end{equation*}
We call the \(c_{ijk}\) the structure constants of the Lie algebra.
Abstractly, we can define the Lie algebra \(\mathfrak{so}(3)\) as being the three-dimensional vector space \(V\) of real linear combinations of the generators \(L_1,L_2,L_3\text{,}\) with a bilinear operation \([\cdot, \cdot]: V \times V \to V\) specified by (4.2.5).
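The commutation relations (4.2.5), and the corresponding structure constants \(c_{ijk} = i \epsilon_{ijk}\text{,}\) can also be verified with a short computation; the following Python sketch (ours) checks all pairs of the matrix generators in the convention above.

import numpy as np

# Matrix generators of so(3) in the convention used above
L = [np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]]),
     np.array([[0, 0, 1j], [0, 0, 0], [-1j, 0, 0]]),
     np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]])]

def levi_civita(i, j, k):
    """Levi-Civita symbol epsilon_{ijk} with indices 0, 1, 2."""
    if (i, j, k) in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
        return 1
    if (i, j, k) in ((0, 2, 1), (2, 1, 0), (1, 0, 2)):
        return -1
    return 0

# Check [L_i, L_j] = i * sum_k epsilon_{ijk} L_k for all pairs (i, j)
for i in range(3):
    for j in range(3):
        commutator = L[i] @ L[j] - L[j] @ L[i]
        rhs = sum(1j * levi_civita(i, j, k) * L[k] for k in range(3))
        assert np.allclose(commutator, rhs)

print("commutation relations verified")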
Subsection 4.2.4 Differential representation
So far we have worked exclusively with matrix representations of the rotation groups. In fact, we defined the rotation groups in terms of their fundamental, or defining, representations, as subgroups of \(GL(n,\mathbb{R})\text{.}\)
But there are other types of representations that are very useful. Let us focus on three-dimensional rotations as an example. We can represent the infinitesimal generators of rotations as differential operators acting on functions \(f(x,y,z)\) on \(\mathbb{R}^3\text{.}\) To do that, we need to find differential operators that satisfy the commutation relations (4.2.5). It is not too difficult to check that the following differential representation works:
\begin{equation*}
L_1 = -i \left( y \frac{\partial}{\partial z} - z \frac{\partial}{\partial y} \right), \qquad L_2 = -i \left( z \frac{\partial}{\partial x} - x \frac{\partial}{\partial z} \right), \qquad L_3 = -i \left( x \frac{\partial}{\partial y} - y \frac{\partial}{\partial x} \right).
\end{equation*}
Checkpoint 4.2.4.
Check that the differential operators \(L_1,L_2,L_3\) satisfy the commutation relations (4.2.5).
You may recognize those operators as the angular momentum operators in quantum mechanics (up to a factor of \(\hbar\)). This is not a coincidence! The angular momentum operators in quantum mechanics are a representation of the infinitesimal generators of the group of rotations in three dimensions. In fact, the possibility of going back and forth between differential representations and matrix representations of the Lie algebra is the essence of the duality between the Schrödinger and Heisenberg pictures of quantum mechanics.
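The differential representation can be checked in the same spirit; the following Python sketch (ours, assuming SymPy is available) verifies the relation \([L_1, L_2] = i L_3\) by applying the operators above to a generic function \(f(x,y,z)\text{.}\)

import sympy as sp

x, y, z = sp.symbols('x y z')
f = sp.Function('f')(x, y, z)

# Differential operators of the representation above, acting on an expression g
def L1(g): return -sp.I * (y * sp.diff(g, z) - z * sp.diff(g, y))
def L2(g): return -sp.I * (z * sp.diff(g, x) - x * sp.diff(g, z))
def L3(g): return -sp.I * (x * sp.diff(g, y) - y * sp.diff(g, x))

# The commutator [L1, L2] applied to f should equal i * L3 f
commutator = L1(L2(f)) - L2(L1(f))
print(sp.expand(commutator - sp.I * L3(f)))   # prints 0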