1. Introduction
Algebra is the mathematical discipline that arose from the problem of solving equations [1]. If one starts with the integers Z, one knows that every equation $a + x = b$, where a and b are integers, has a unique solution. However, the equation $ax = b$ does not necessarily have a solution in Z, or it might have infinitely many solutions (take $a = b = 0$). So let us enlarge Z to the rational numbers Q, consisting of all fractions c/d, where c and d are integers with $d \neq 0$. Then both equations have a unique solution in Q, provided that $a \neq 0$ in the equation $ax = b$. So Q is a field. If, for example, one takes a solution of an equation such as $x^2 = 2$ and forms the set of all numbers of the form $a + b\sqrt{2}$, where a and b are rational, we get a larger field, denoted by $\mathbb{Q}(\sqrt{2})$ and called an algebraic number field. In the study of fields obtained by adjoining the roots of polynomial equations, a new notion arose, namely, the symmetries of the field that permute the roots of the equation. Évariste Galois (1811-1832) coined the term group for these symmetries, and now this group is called the Galois group of the field. While still a teenager, Galois showed that the roots of an equation are expressible by radicals if and only if the group of the equation has a property now called solvability. This stunning result settled the 350-year-old question of whether the roots of every polynomial equation are expressible by radicals.
The monograph Linear Groups: The Accent on Infinite Dimensionality1 explores some of the main results and ideas in the study of infinite-dimensional linear groups. The theory of finite-dimensional linear groups is one of the best-developed algebraic theories. The array of articles devoted to this topic is enormous, and there are many monographs concerned with matrix groups, ranging from old, classical texts to ones published more recently. However, in the case when the dimension is infinite (and such cases arise quite often), the picture is quite different.
The situation in the study of infinite-dimensional linear groups resembles the one that arose in group theory during the transition, about one hundred years ago, from the study of finite groups to the study of infinite groups. It is well known that this transition was extremely fruitful and led to the development of a rich and central branch of algebra: infinite group theory.
Group theory arose from the study of polynomial equations [2]. The solvability of an equation is determined by a group of permutations of its roots; before Abel [1824] and Galois [1830] mastered this relationship, it led Lagrange [1770] and Cauchy [1812] to investigate permutations and prove forerunners of the theorems that bear their names. The term “group” was coined by Galois. Interest in groups of transformations, and in what we now call the classical groups, grew after 1850; thus, Klein’s Erlanger Programm [1872] emphasized their role in geometry. Modern group theory began when the axiomatic method was applied to these results; Burnside’s Theory of Groups of Finite Order [1897] marks the beginning of a new discipline, abstract algebra, in that structures are defined by axioms, and the nature of their elements is irrelevant.
Definition 1.1. A group is a set G, together with a map of $G \times G$ into G, written $(x, y) \mapsto xy$, with the following properties:
· Closure: For all $x, y \in G$, we have $xy \in G$.
· Associativity: For all $x, y, z \in G$, $(xy)z = x(yz)$.
· There exists an element e in G such that for all $x \in G$, $xe = ex = x$. The element e is unique and is called the identity element of the group, or simply the identity.
· For each $x \in G$, there exists $y \in G$ with $xy = yx = e$. Such an element y is called an inverse of x.
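As a concrete illustration of these axioms (a minimal Python sketch added here, not part of the original text), the set {0, 1, 2, 3, 4} under addition modulo 5 satisfies closure, associativity, identity, and inverses; the check below verifies each axiom by brute force.

```python
from itertools import product

# The set {0, 1, 2, 3, 4} with addition modulo 5 as the group operation.
G = range(5)
op = lambda x, y: (x + y) % 5

# Closure: the product of any two elements stays in G.
assert all(op(x, y) in G for x, y in product(G, G))

# Associativity: (x y) z == x (y z) for all triples.
assert all(op(op(x, y), z) == op(x, op(y, z)) for x, y, z in product(G, G, G))

# Identity: e = 0 satisfies e x == x e == x.
e = 0
assert all(op(e, x) == x and op(x, e) == x for x in G)

# Inverses: every x has some y with x y == y x == e.
assert all(any(op(x, y) == e and op(y, x) == e for y in G) for x in G)

print("({0,...,4}, + mod 5) satisfies the group axioms")
```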
Definition 1.2. A group G is said to be commutative, or abelian, if for all $x, y \in G$ we have $xy = yx$. A group that is not abelian is said to be nonabelian.
Proposition 1.3 (Uniqueness of the Identity). Let G be a group, and let $e, f \in G$ be such that $ex = xe = x$ and $fx = xf = x$ for all $x \in G$. Then $e = f$.
Proof. Since e is an identity, we have $ef = f$. On the other hand, since f is an identity, we have $ef = e$. Thus $e = f$.
Proposition 1.4 (Uniqueness of Inverses). Let G be a group, e the (unique) identity of G, and $x, g, h \in G$. Suppose that $xg = gx = e$ and $xh = hx = e$. Then $g = h$.
Proof: We know that $xg = e$. Multiplying on the left by h gives $h(xg) = he = h$. By associativity, this gives $(hx)g = h$, that is, $eg = h$, and so $g = h$.
Proposition 1.5. [1] For all $x, y \in G$, we have $(xy)^{-1} = y^{-1}x^{-1}$.
Proof: Let $z = y^{-1}x^{-1}$. Then it suffices to show that $(xy)z = z(xy) = e$. But $(xy)(y^{-1}x^{-1}) = x(yy^{-1})x^{-1} = xex^{-1} = xx^{-1} = e$, and similarly $(y^{-1}x^{-1})(xy) = e$.
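A quick numerical illustration of Proposition 1.5 in a matrix group (a NumPy sketch added here; the specific matrices are arbitrary invertible choices): $(AB)^{-1}$ coincides with $B^{-1}A^{-1}$.

```python
import numpy as np

# Two invertible 2x2 real matrices (nonzero determinants).
A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.array([[1.0, 3.0], [0.0, 1.0]])

lhs = np.linalg.inv(A @ B)                 # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)  # B^{-1} A^{-1}

# The two sides agree up to floating-point round-off.
print(np.allclose(lhs, rhs))   # True
```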
If $x_1, x_2, \ldots, x_n$ are arbitrary elements of a group G, then the expression $x_1 x_2 \cdots x_n$ will stand for $(\cdots((x_1x_2)x_3)\cdots)x_n$, where $x_1x_2x_3 = (x_1x_2)x_3$, $x_1x_2x_3x_4 = (x_1x_2x_3)x_4$, and so on. This gives an inductive definition of the product of an arbitrary finite number of elements of G. Moreover, by associativity, pairs of parentheses can be inserted or removed in the expression $x_1 x_2 \cdots x_n$ without making any change in the group element being represented, provided the new expression makes sense. (For example, you can’t have an empty pair of parentheses, and the number of left parentheses has to be the same as the number of right parentheses.)
Definition 1.6. A subgroup of a group G is a subset H of G with the following properties:
1) The identity is an element of H.
2) If $h \in H$, then $h^{-1} \in H$.
3) If $h_1, h_2 \in H$, then $h_1 h_2 \in H$.
Proposition 1.7. A subset H of a group G is a subgroup if and only if H is a group under the group operations of G. That is, H is closed under the group operation, contains the identity of G, and the inverse in H of an element of H coincides with its inverse in G.
Definition 1.8. Let G and H be groups. A homomorphism from G to H is a map $\varphi : G \to H$ such that, for all $x, y$ in G, $\varphi(xy) = \varphi(x)\varphi(y)$.
Proposition 1.9. Let G and H be groups and $\varphi : G \to H$ be a homomorphism. Then $\varphi(e_G) = e_H$, and for all $x$ in G, $\varphi(x^{-1}) = \varphi(x)^{-1}$.
Proof: We have $\varphi(e_G)\varphi(e_G) = \varphi(e_G e_G) = \varphi(e_G)$, since $\varphi$ is a homomorphism; multiplying both sides by $\varphi(e_G)^{-1}$ gives $\varphi(e_G) = e_H$. Then $\varphi(x)\varphi(x^{-1}) = \varphi(xx^{-1}) = \varphi(e_G) = e_H$, and similarly $\varphi(x^{-1})\varphi(x) = e_H$, so $\varphi(x^{-1}) = \varphi(x)^{-1}$.
Definition 1.10. A homomorphism from G to H which is a bijection is an isomorphism. In that case, we say that G and H are isomorphic, and write $G \cong H$.
Definition 1.11. A bijective homomorphism $\varphi : G \to G$ from a group to itself is an automorphism.
Example 1.12. Let $\mathbb{R}^{\times}$ denote the nonzero real numbers. Multiplication by $a \in \mathbb{R}^{\times}$ defines a bijection $m_a : \mathbb{R} \to \mathbb{R}$ given by $m_a(x) = ax$. The distributive law for R says that $a(x + y) = ax + ay$ for all $x, y \in \mathbb{R}$. Thus $m_a$ is a homomorphism for the additive group structure on R. Since $m_a$ is bijective, with inverse $m_{1/a}$, it is in fact an isomorphism, that is, an automorphism of the additive group R. Furthermore, the associative and commutative laws for R imply that $m_a(m_b(x)) = a(bx) = (ab)x = m_{ab}(x)$. Hence, $m_a \circ m_b = m_{ab} = m_{ba} = m_b \circ m_a$.
2. The Classical Groups
In this section we study the structure of a classical group G and its Lie algebra [3]. We choose a matrix realization of G such that the diagonal subgroup $H \subset G$ is a maximal torus; by elementary linear algebra every conjugacy class of semisimple elements intersects H. Using the unipotent elements in G, we show that the groups GL(n,C), SL(n,C), SO(n,C), and Sp(n,C) are connected (as Lie groups and as algebraic groups). This group and its Lie algebra play a basic role in the structure of the other classical groups and Lie algebras. We decompose the Lie algebra of a classical group under the adjoint action of a maximal torus and find the invariant subspaces (called root spaces) and the corresponding characters (called roots). The commutation relations of the root spaces are encoded by the set of roots; we use this information to prove that the classical (trace-zero) Lie algebras are simple (or semisimple). Finally, we develop some general Lie algebra methods (solvable Lie algebras, the Killing form) and show that every semisimple Lie algebra has a root-space decomposition with the same properties as those of the classical Lie algebras.
Definition 1.13. The classical groups are the groups of invertible linear transformations of finite-dimensional vector spaces over the real, complex, and quaternion fields, together with the subgroups that preserve a volume form or a bilinear form.
Proposition 1.14. The determinant function $\det$, defined on $n \times n$ matrices over F, has the following properties (a numerical check is sketched below):
i) For $n \times n$ matrices A and B, $\det(AB) = \det(A)\det(B)$.
ii) $\det(I) = 1$.
iii) A is invertible if and only if $\det(A) \neq 0$.
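The following NumPy sketch (added for illustration; the random matrices are arbitrary) checks properties i)-iii) numerically for 3 × 3 real matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# i) det(AB) = det(A) det(B), up to floating-point error.
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))

# ii) det(I) = 1.
print(np.isclose(np.linalg.det(np.eye(3)), 1.0))

# iii) det(A) != 0, so A is invertible and A @ inv(A) recovers the identity.
print(np.allclose(A @ np.linalg.inv(A), np.eye(3)))
```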
3. General Linear Groups GL(n,R)
· Closure property
Let $A, B \in GL(n, \mathbb{R})$; then $\det A \neq 0$ and $\det B \neq 0$, i.e. $\det(AB) = \det A \cdot \det B \neq 0$, i.e. $AB \in GL(n, \mathbb{R})$.
· Associative property
Matrix multiplication is associative: $(AB)C = A(BC)$ for all $A, B, C \in GL(n, \mathbb{R})$.
· Existence of identity
$I \in GL(n, \mathbb{R})$, since $\det I = 1 \neq 0$, such that $AI = IA = A$ for every $A \in GL(n, \mathbb{R})$.
· Existence of inverse
Let $A \in GL(n, \mathbb{R})$; then $\det A \neq 0$, so $A^{-1}$ exists, $\det(A^{-1}) = 1/\det A \neq 0$, and $AA^{-1} = A^{-1}A = I$.
· Commutativity
$GL(n, \mathbb{R})$ is a group but not an abelian group, because in general $AB \neq BA$ (a numerical sketch of these properties follows below).
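The NumPy sketch below (illustrative; the matrices are arbitrary invertible choices) checks closure, identity, inverses, and noncommutativity for n = 2.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 5.0]])   # det(A) = -1, so A is invertible
B = np.array([[0.0, 1.0], [1.0, 1.0]])   # det(B) = -1, so B is invertible
I = np.eye(2)

# Closure: AB is again invertible, since det(AB) = det(A) det(B) != 0.
print(not np.isclose(np.linalg.det(A @ B), 0.0))

# Identity and inverses: AI = IA = A and A A^{-1} = I.
print(np.allclose(A @ I, A) and np.allclose(I @ A, A))
print(np.allclose(A @ np.linalg.inv(A), I))

# Noncommutativity: AB and BA differ in general.
print(not np.allclose(A @ B, B @ A))
```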
Let F denote either the field of real numbers R or the field of complex numbers C, and let V be a finite-dimensional vector space over F [3]. The set of invertible linear transformations from V to V will be denoted by GL(V). This set has a group structure under composition of transformations, with identity element the identity transformation $\mathrm{Id}_V$, $\mathrm{Id}_V(v) = v$ for all $v \in V$. The group GL(V) is the first of the classical groups. To study it in more detail, we recall some standard terminology related to linear transformations and their matrices. Let V and W be finite-dimensional vector spaces over F. Let $\{v_1, \ldots, v_n\}$ and $\{w_1, \ldots, w_m\}$ be bases for V and W, respectively. If $T : V \to W$ is a linear map, then $Tv_j = \sum_{i=1}^{m} a_{ij} w_i$ for $j = 1, \ldots, n$, with $a_{ij} \in F$. The numbers $a_{ij}$ are called the matrix coefficients or entries of T with respect to the two bases, and the $m \times n$ array $A = [a_{ij}]$ is called the matrix of T with respect to the two bases.
Let $S : W \to U$ be another linear transformation, with U an l-dimensional vector space with basis $\{u_1, \ldots, u_l\}$, and let B be the matrix of S with respect to the bases $\{w_i\}$ and $\{u_k\}$. Then the matrix of $S \circ T$ with respect to the bases $\{v_j\}$ and $\{u_k\}$ is given by BA, with the product being the usual product of matrices.
We denote the space of all $n \times n$ matrices over F by $M_n(F)$, and we denote the $n \times n$ identity matrix by I (or $I_n$ if the size of the matrix needs to be indicated); it has entries $\delta_{ij} = 1$ if $i = j$ and 0 otherwise. Let V be an n-dimensional vector space over F with basis $\{v_1, \ldots, v_n\}$. If $T : V \to V$ is a linear map, we write $\mu(T)$ for the matrix of T with respect to this basis. If $T, S : V \to V$ are linear maps, then the preceding observations imply that $\mu(S \circ T) = \mu(S)\mu(T)$. Furthermore, if $T = \mathrm{Id}_V$ then $\mu(T) = I$. The matrix $A \in M_n(F)$ is said to be invertible if there is a matrix $B \in M_n(F)$ such that $AB = BA = I$. We note that a linear map $T : V \to V$ is in GL(V) if and only if its matrix $\mu(T)$ is invertible. We also recall that a matrix $A \in M_n(F)$ is invertible if and only if its determinant is nonzero.
We will use the notation $GL(n, F)$ for the set of $n \times n$ invertible matrices with coefficients in F. Under matrix multiplication, $GL(n, F)$ is a group with the identity matrix as identity element. We note that if V is an n-dimensional vector space over F with basis $\{v_1, \ldots, v_n\}$, then the map $\mu : GL(V) \to GL(n, F)$ corresponding to this basis is a group isomorphism. The group $GL(n, F)$ is called the general linear group of rank n.
If $\{\bar{v}_1, \ldots, \bar{v}_n\}$ is another basis of V, then there is a matrix $g = [g_{ij}] \in GL(n, F)$ such that $\bar{v}_j = \sum_i g_{ij} v_i$ and $v_j = \sum_i h_{ij} \bar{v}_i$, with $h = [h_{ij}]$ the inverse matrix to $g$. Suppose that T is a linear transformation from V to V, that $A = [a_{ij}]$ is the matrix of T with respect to a basis $\{v_j\}$, and that $B = [b_{ij}]$ is the matrix of T with respect to another basis $\{\bar{v}_j\}$. Then $b_{ij} = \sum_{k,l} h_{ik} a_{kl} g_{lj}$ for $i, j = 1, \ldots, n$. Thus $B = g^{-1}Ag$ is similar to the matrix A.
4. Special Linear Group SL(n,R)
· Closure property
Let $A, B \in SL(n, \mathbb{R})$; then $\det A = \det B = 1$, i.e. $\det(AB) = \det A \cdot \det B = 1$, i.e. $AB \in SL(n, \mathbb{R})$.
· Associative property
Matrix multiplication is associative: $(AB)C = A(BC)$ for all $A, B, C \in SL(n, \mathbb{R})$.
· Existence of identity
$\det I = 1$, so $I \in SL(n, \mathbb{R})$, such that $AI = IA = A$ for every $A \in SL(n, \mathbb{R})$.
· Existence of inverse
Let $A \in SL(n, \mathbb{R})$; then $\det(A^{-1}) = 1/\det A = 1$, so $A^{-1} \in SL(n, \mathbb{R})$ and $AA^{-1} = A^{-1}A = I$.
· Commutativity
$SL(n, \mathbb{R})$ is a group but not an abelian group, because in general $AB \neq BA$ (a numerical check follows below).
The special linear group $SL(n, F)$ is the set of all elements A of $M_n(F)$ such that $\det A = 1$ [3]. Since $\det(AB) = \det A \cdot \det B$ and $\det I = 1$, we see that the special linear group is a subgroup of $GL(n, F)$.
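A small NumPy check (illustrative only) that products and inverses of determinant-one matrices again have determinant one:

```python
import numpy as np

# Two elements of SL(2, R), i.e. 2x2 real matrices with determinant 1.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # upper unitriangular, det = 1
B = np.array([[2.0, 0.0], [0.0, 0.5]])   # diagonal, det = 1

# Closure: det(AB) = det(A) det(B) = 1.
print(np.isclose(np.linalg.det(A @ B), 1.0))

# Inverses stay in SL(2, R): det(A^{-1}) = 1 / det(A) = 1.
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0))
```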
We note that if V is an n-dimensional vector space over F with basis $\{v_1, \ldots, v_n\}$, and if $\mu : GL(V) \to GL(n, F)$ is the map previously defined, then the group $\mu^{-1}(SL(n, F))$ is independent of the choice of basis, by the change of basis formula. We denote this group by SL(V).
5. Orthogonal and Special Orthogonal Group
Definition 4.1. The matrix obtained from a matrix A by interchanging its rows and columns is called the transpose of A and is denoted by $A^T$. Accordingly, if $A = [a_{ij}]$ is an $m \times n$ matrix, then $A^T = [a_{ji}]$ is an $n \times m$ matrix.
Definition 4.2.
1) A square matrix with $A^T = A$ is called a symmetric matrix. If $A = [a_{ij}]$ is a symmetric matrix, then $a_{ij} = a_{ji}$ for each pair of indices $i, j$.
2) A square matrix with $A^T = -A$ is called a skew-symmetric matrix. If $A = [a_{ij}]$ is a skew-symmetric matrix, then $a_{ij} = -a_{ji}$ for each pair of indices $i, j$. Thus, in a skew-symmetric matrix, the main diagonal elements are always zero.
Theorem 4.3. Let A and B be two matrices of the same order and let r be a scalar. Then (a numerical check is sketched below):
1) $(A^T)^T = A$
2) $(A + B)^T = A^T + B^T$
3) $(rA)^T = rA^T$
4) $(AB)^T = B^T A^T$
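A quick NumPy check (illustrative; the matrices are arbitrary) of the four transpose identities:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, 2.0]])
r = 3.0

print(np.allclose((A.T).T, A))              # 1) (A^T)^T = A
print(np.allclose((A + B).T, A.T + B.T))    # 2) (A + B)^T = A^T + B^T
print(np.allclose((r * A).T, r * A.T))      # 3) (rA)^T = r A^T
print(np.allclose((A @ B).T, B.T @ A.T))    # 4) (AB)^T = B^T A^T
```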
Theorem 4.4. A square matrix A is orthogonal if and only if its column vectors form an orthonormal set.
Proof: Let A be an $n \times n$ matrix and let $a_1, \ldots, a_n$ be the column vectors of A. Then the (i, j) entry of $A^T A$ is $a_i^T a_j = \langle a_i, a_j \rangle$. Therefore, $A^T A = I$ if and only if $\langle a_i, a_j \rangle = \delta_{ij}$, that is, if and only if the columns of A form an orthonormal set.
A simple example (3.5): the rotation matrix $R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ is an orthogonal matrix. Note that $R_\theta^T R_\theta = I$, and its columns form an orthonormal set.
Consider the determinant function restricted to $O(n)$, $\det : O(n) \to \mathbb{R}$. For $A \in O(n)$, $(\det A)^2 = \det(A^T)\det(A) = \det(A^T A) = \det I = 1$, which implies that $\det A = \pm 1$. Thus, we have $O(n) = SO(n) \cup O^{-}(n)$, where $SO(n) = \{A \in O(n) : \det A = 1\}$ and $O^{-}(n) = \{A \in O(n) : \det A = -1\}$.
6. Special Orthogonal Group
The group SO(n) is a subgroup of the orthogonal group O(n); it is known as the special orthogonal group, or the rotation group.
We now consider the orthogonal and special orthogonal groups, O(n) and SO(n) [4]. An $n \times n$ real matrix A is said to be orthogonal if the column vectors that make up A are orthonormal, that is, if $\sum_{k=1}^{n} a_{ki} a_{kj} = \delta_{ij}$. Equivalently, A is orthogonal if it preserves the inner product, namely, if $\langle Ax, Ay \rangle = \langle x, y \rangle$ for all vectors $x, y$ in $\mathbb{R}^n$. (Angled brackets denote the usual inner product on $\mathbb{R}^n$, $\langle x, y \rangle = \sum_{k=1}^{n} x_k y_k$.) Still another equivalent definition is that A is orthogonal if $A^T A = I$, i.e., if $A^T = A^{-1}$. ($A^T$ is the transpose of A, $(A^T)_{ij} = A_{ji}$.)
Since $\det(A^T) = \det A$, we see that if A is orthogonal, then $(\det A)^2 = \det(A^T A) = \det I = 1$. Hence $\det A = \pm 1$ for all orthogonal matrices A.
This formula tells us, in particular, that every orthogonal matrix must be invertible. But if A is an orthogonal matrix, then $\langle A^{-1}x, A^{-1}y \rangle = \langle A(A^{-1}x), A(A^{-1}y) \rangle = \langle x, y \rangle$.
Thus, the inverse of an orthogonal matrix is orthogonal. Furthermore, the product of two orthogonal matrices is orthogonal, since if A and B both preserve inner products, then so does AB. Thus, the set of orthogonal matrices forms a group.
The set of all $n \times n$ real orthogonal matrices is the orthogonal group O(n), and it is a subgroup of $GL(n, \mathbb{R})$. The limit of a sequence of orthogonal matrices is orthogonal, because the relation $A^T A = I$ is preserved under limits. Thus O(n) is a matrix Lie group.
The set of $n \times n$ orthogonal matrices with determinant one is the special orthogonal group SO(n). Clearly this is a subgroup of O(n), and hence of $GL(n, \mathbb{R})$.
Moreover, both orthogonality and the property of having determinant one are preserved under limits, and so SO(n) is a matrix Lie group. Since elements of O(n) already have determinant ±1, SO(n) is “half” of O(n).
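A brief NumPy illustration (added as a sketch): a plane rotation satisfies $A^T A = I$ and has determinant +1, so it lies in SO(2), while a reflection is orthogonal with determinant -1.

```python
import numpy as np

theta = 0.7
# A rotation (an element of SO(2)) and a reflection (in O(2) but not SO(2)).
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[1.0, 0.0],
                       [0.0, -1.0]])

for name, A in [("rotation", rotation), ("reflection", reflection)]:
    is_orthogonal = np.allclose(A.T @ A, np.eye(2))   # A^T A = I
    print(name, is_orthogonal, round(float(np.linalg.det(A)), 6))
# rotation True 1.0
# reflection True -1.0
```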
7. Unitary Groups
For $A \in M_n(\mathbb{C})$, $A^{*}$ is the Hermitian conjugate (conjugate transpose) of A, i.e., $(A^{*})_{ij} = \overline{A_{ji}}$.
Definition 3.5. [5] For a matrix A whose elements are complex numbers, the matrix obtained by replacing each element with its complex conjugate is called the conjugate of A and is denoted by $\bar{A}$.
Simple Example 5.1: The conjugate of the matrix $\begin{pmatrix} 1 + i & 2 \\ 3i & 4 - i \end{pmatrix}$ is $\begin{pmatrix} 1 - i & 2 \\ -3i & 4 + i \end{pmatrix}$.
Theorem 5.2. Let A and B be two matrices (of sizes for which the operations below are defined) and let k be any scalar. Then:
1) $\overline{(\bar{A})} = A$
2) $\overline{A + B} = \bar{A} + \bar{B}$
3) $\overline{kA} = \bar{k}\,\bar{A}$
4) $\overline{AB} = \bar{A}\,\bar{B}$
Example 5.3: Let $A = \begin{pmatrix} i & 1 \\ 0 & 2 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & -i \\ i & 0 \end{pmatrix}$. Note that $\overline{AB} = \begin{pmatrix} -2i & 1 \\ -2i & 0 \end{pmatrix} = \bar{A}\,\bar{B}$.
Definition 5.4. A square matrix with $A^{*} = A$ is called a Hermitian matrix. If the square matrix $A = [a_{ij}]$ is Hermitian, then $a_{ij} = \overline{a_{ji}}$ for each pair of indices. The diagonal elements of a Hermitian matrix are real numbers.
Simple Example 5.5: The matrix $\begin{pmatrix} 2 & 1 - i \\ 1 + i & 3 \end{pmatrix}$ is a Hermitian matrix.
Definition 5.6: [5] A square matrix with $A^{*} = -A$ is called a skew-Hermitian matrix. If the square matrix $A = [a_{ij}]$ is skew-Hermitian, then $a_{ij} = -\overline{a_{ji}}$ for each pair of indices. The diagonal elements of a skew-Hermitian matrix are 0 or purely imaginary numbers.
Simple Example 5.7: The matrix $\begin{pmatrix} i & 1 + i \\ -1 + i & 2i \end{pmatrix}$ is a skew-Hermitian matrix.
Example 5.8: Let $A = \dfrac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}$; then $A^{*}A = I$.
Theorem 5.9: Let $A \in M_n(\mathbb{C})$. Then $A^{*}A = I$ if and only if the column vectors of A form an orthonormal set in $\mathbb{C}^n$.
An $n \times n$ complex matrix A is said to be unitary if the column vectors of A are orthonormal [4], that is, if $\sum_{k=1}^{n} \overline{a_{ki}}\, a_{kj} = \delta_{ij}$. Equivalently, A is unitary if it preserves the inner product, namely, if $\langle Ax, Ay \rangle = \langle x, y \rangle$ for all vectors x, y in $\mathbb{C}^n$. (Angled brackets here denote the inner product on $\mathbb{C}^n$, $\langle x, y \rangle = \sum_{k=1}^{n} \overline{x_k}\, y_k$. We will adopt the convention of putting the complex conjugate on the left.) Still another equivalent definition is that A is unitary if $A^{*}A = I$, i.e., if $A^{*} = A^{-1}$. ($A^{*}$ is the adjoint of A, $(A^{*})_{ij} = \overline{A_{ji}}$.)
Since $\det(A^{*}) = \overline{\det A}$, we see that if A is unitary, then $\det(A^{*}A) = |\det A|^2 = 1$. Hence $|\det A| = 1$ for all unitary matrices A. This in particular shows that every unitary matrix is invertible. The same argument as for the orthogonal group shows that the set of unitary matrices forms a group.
The set of all $n \times n$ unitary matrices is the unitary group U(n), and it is a subgroup of $GL(n, \mathbb{C})$. The limit of unitary matrices is unitary, so U(n) is a matrix Lie group. The set of unitary matrices with determinant one is the special unitary group SU(n). It is easy to check that SU(n) is a matrix Lie group. Note that a unitary matrix can have determinant $e^{i\theta}$ for any θ, and so SU(n) is a smaller subset of U(n) than SO(n) is of O(n). (Specifically, SO(n) has the same dimension as O(n), whereas SU(n) has dimension one less than that of U(n).)
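A short NumPy sketch (illustrative): the matrix $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}$ is unitary, and its determinant has modulus 1.

```python
import numpy as np

U = np.array([[1.0, 1.0j],
              [1.0j, 1.0]]) / np.sqrt(2)

# Unitarity: U* U = I, where U* is the conjugate transpose of U.
print(np.allclose(U.conj().T @ U, np.eye(2)))

# The determinant lies on the unit circle: |det U| = 1.
print(np.isclose(abs(np.linalg.det(U)), 1.0))
```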
8. Symplectic Groups
Consider an $n \times n$ real skew-symmetric matrix A [6], i.e., one for which $A^T = -A$. For such a matrix, $\det A = \det(A^T) = \det(-A) = (-1)^n \det A$, giving $\det A = 0$ whenever n is odd. The most interesting case occurs if $\det A \neq 0$, when n must be even, and we then write n = 2m. The standard example of this is built up using the 2 × 2 block $J = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$. If $n = 2m$, we have the non-degenerate skew-symmetric matrix $J_{2m} = \mathrm{diag}(J, \ldots, J)$, with m copies of J along the diagonal. The matrix group
$Sp_{2m}(\mathbb{R}) = \{A \in GL_{2m}(\mathbb{R}) : A^T J_{2m} A = J_{2m}\}$
is called the $2m \times 2m$ (real) symplectic group.
We will now look at the coordinate-free version of these groups. A bilinear form B on a vector space V is called skew-symmetric if $B(v, w) = -B(w, v)$ for all $v, w \in V$. If B is skew-symmetric and non-degenerate, then $\dim V$ must be even, since the matrix of B relative to any basis for V is skew-symmetric and has nonzero determinant.
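As a numerical sketch (added for illustration), one can check the defining relation $A^T J A = J$ for m = 1, where it reduces to the condition $\det A = 1$:

```python
import numpy as np

# The standard 2x2 skew-symmetric block J.
J = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# A 2x2 real matrix with determinant 1; for 2x2 matrices A^T J A = det(A) J.
A = np.array([[1.0, 3.0],
              [0.0, 1.0]])

# Defining relation of the symplectic group: A^T J A = J.
print(np.allclose(A.T @ J @ A, J))   # True
```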
9. Cyclic Groups
The next class of groups we will consider consists of the cyclic groups. Before defining these groups, we need to explain how exponents work. If G is a group and $x \in G$, then for a positive integer m we set $x^m = x \cdot x \cdots x$ (m factors). We define $x^0$ to be $e$. Also, $x^{-m} = (x^{-1})^m = (x^m)^{-1}$. Then the usual laws of exponentiation hold for all integers m, n:
1) $x^m x^n = x^{m+n}$
2) $(x^m)^n = x^{mn}$
Definition 7.1. A group G is said to be cyclic if there exists an element $x \in G$ such that for every $y \in G$ there is an integer m such that $y = x^m$. Such an element x is said to generate G.
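As a small Python sketch (illustrative), the nonzero residues modulo 7 form a cyclic group under multiplication, and 3 is a generator: its powers run through every element.

```python
# Powers of 3 modulo 7 generate the full multiplicative group {1, ..., 6}.
powers = {pow(3, m, 7) for m in range(6)}
print(sorted(powers))               # [1, 2, 3, 4, 5, 6]
print(powers == set(range(1, 7)))   # True: 3 generates the group
```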
10. Dihedral Groups: Generators and Relations
In the next example, we give an illustration of a group G that is described by giving a set of its generators and the relations the generators satisfy. The group we will study is called the dihedral group. We will see in due course that the dihedral groups are the symmetry groups of the regular polygons in the plane. As we will see in this example, defining a group by giving generators and relations does not necessarily reveal much information about the group.
Example 8.1. [2] Let us now verify that D(2) is a group. Since the multiplication of words is associative, it follows from the requirement that $a^2 = b^2 = 1$ and $ba = ab$ that every word can be collapsed to one of $1, a, b, ab$. But $ba = ab$, so $D(2) = \{1, a, b, ab\}$. To see that D(2) is closed under multiplication, we observe that $a(ab) = a^2b = b$, $b(ab) = (ba)b = ab^2 = a$, and $(ab)(ab) = a(ba)b = a(ab)b = a^2b^2 = 1$. Therefore, D(2) is closed under multiplication, so it follows from our other remarks that D(2) is a group. Note that the order of D(2) is 4.
Example 8.2. [2] (Dihedral Groups) The dihedral groups are groups that are defined by specifying two generators a and b and also specifying the relations that the generators satisfy. When we define a group by generators and relations, we consider all words in the generators, in this case a and b: these are all the strings or products $x_1 x_2 \cdots x_n$, where each $x_i$ is either a or b, and n is an arbitrary positive integer. For example, abbaabbaabbaabba is a word with n = 16. Two words are multiplied together by placing them side by side.
Thus, $(x_1 \cdots x_p)(y_1 \cdots y_q) = x_1 \cdots x_p y_1 \cdots y_q$. This produces an associative binary operation on the set of words. The next step is to impose some relations that a and b satisfy. Suppose m > 1. The dihedral group D(m) is defined to be the set of all words in a and b, with the above multiplication, which we assume is subject to the following relations:
$a^m = 1$, $b^2 = 1$, $ba = a^{m-1}b$.
It is understood that the cyclic groups $\langle a \rangle$ and $\langle b \rangle$ have orders m and two, respectively. By the relations, $ba = a^{m-1}b = a^{-1}b$ and, more generally, $ba^k = a^{m-k}b$. For example, if m = 3, then $ba = a^2b$, so $ba^2 = (ba)a = a^2(ba) = a^4b = ab$. The reader can show that every element of D(3) can be written as one of $1, a, a^2, b, ab, a^2b$. For example, $(ab)(ab) = a(ba)b = a(a^2b)b = a^3b^2 = 1$. Hence, D(3) has order 6. We will give a more convincing argument in due course.
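The Python sketch below (illustrative; it uses a concrete matrix realization rather than words in a and b) represents D(3) by the rotation through 2π/3 and a reflection, closes the set under multiplication, and confirms that exactly six elements arise.

```python
import numpy as np
from itertools import product

theta = 2 * np.pi / 3
a = np.array([[np.cos(theta), -np.sin(theta)],   # rotation by 2*pi/3
              [np.sin(theta),  np.cos(theta)]])
b = np.array([[1.0, 0.0],                        # reflection across the x-axis
              [0.0, -1.0]])

# Close {I, a, b} under multiplication, comparing elements up to round-off.
elements = [np.eye(2), a, b]
changed = True
while changed:
    changed = False
    for X, Y in product(elements, repeat=2):
        Z = X @ Y
        if not any(np.allclose(Z, E) for E in elements):
            elements.append(Z)
            changed = True

print(len(elements))   # 6, the order of the dihedral group D(3)
```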
11. Quaternionic Groups
We recall some basic properties of the quaternions [3]. Consider the four-dimensional real vector space H consisting of the 2 × 2 complex matrices $w = \begin{pmatrix} z_1 & z_2 \\ -\bar{z}_2 & \bar{z}_1 \end{pmatrix}$ with $z_1, z_2 \in \mathbb{C}$.
One checks directly that H is closed under multiplication in $M_2(\mathbb{C})$. If $w \in H$ then $w^{*}w = ww^{*} = (|z_1|^2 + |z_2|^2)I$ and $\det w = |z_1|^2 + |z_2|^2$ (where $w^{*}$ denotes the conjugate-transpose matrix). Hence every nonzero element of H is invertible. Thus H is a division algebra (or skew field) over R. This division algebra is a realization of the quaternions. The more usual way of introducing the quaternions is to consider the vector space H over R with basis $\{1, \mathbf{i}, \mathbf{j}, \mathbf{k}\}$. Define a multiplication so that 1 is the identity and
$\mathbf{i}^2 = \mathbf{j}^2 = \mathbf{k}^2 = -1,\quad \mathbf{i}\mathbf{j} = -\mathbf{j}\mathbf{i} = \mathbf{k},\quad \mathbf{j}\mathbf{k} = -\mathbf{k}\mathbf{j} = \mathbf{i},\quad \mathbf{k}\mathbf{i} = -\mathbf{i}\mathbf{k} = \mathbf{j},$
then extend the multiplication to H by linearity relative to real scalars. To obtain an isomorphism between this version of H and the 2 × 2 complex matrix version, take
$1 \mapsto \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\quad \mathbf{i} \mapsto \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix},\quad \mathbf{j} \mapsto \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},\quad \mathbf{k} \mapsto \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix},$
where i is a fixed choice of $\sqrt{-1}$. The conjugation $w \mapsto \bar{w}$ satisfies $\overline{uw} = \bar{w}\bar{u}$. In terms of real components, $\bar{w} = x_0 - x_1\mathbf{i} - x_2\mathbf{j} - x_3\mathbf{k}$ for $w = x_0 + x_1\mathbf{i} + x_2\mathbf{j} + x_3\mathbf{k}$. It is useful to write quaternions in complex form as $w = z_1 + z_2\mathbf{j}$ with $z_1, z_2 \in \mathbb{C}$; however, note that the conjugation is then given as $\bar{w} = \bar{z}_1 - z_2\mathbf{j}$.
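A brief NumPy sketch (added for illustration) of the 2 × 2 complex matrix realization just described: the matrices representing i, j, and k square to -I and satisfy ij = k.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
qi = np.array([[1j, 0], [0, -1j]])                 # the quaternion unit i
qj = np.array([[0, 1], [-1, 0]], dtype=complex)    # the quaternion unit j
qk = np.array([[0, 1j], [1j, 0]])                  # the quaternion unit k

# i^2 = j^2 = k^2 = -1 and ij = k in this realization.
print(np.allclose(qi @ qi, -I2),
      np.allclose(qj @ qj, -I2),
      np.allclose(qk @ qk, -I2))
print(np.allclose(qi @ qj, qk))
```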
On the 4n-dimensional real vector space $\mathbb{H}^n$ we define multiplication by $c \in \mathbb{H}$ on the right: $(u_1, \ldots, u_n)c = (u_1c, \ldots, u_nc)$. We note that $u(c_1 + c_2) = uc_1 + uc_2$ and $(uc_1)c_2 = u(c_1c_2)$. We can therefore think of $\mathbb{H}^n$ as a vector space over H. Viewing elements of $\mathbb{H}^n$ as $n \times 1$ column vectors, we define Au for $A \in M_n(\mathbb{H})$ and $u \in \mathbb{H}^n$ by matrix multiplication. Then $A(uc) = (Au)c$ for $c \in \mathbb{H}$; hence A defines a quaternionic linear map. Here matrix multiplication is defined as usual, but one must be careful about the order of multiplication of the entries.
We can make $\mathbb{H}^n$ into a 2n-dimensional vector space over C in many ways; for example, we can embed C into H as any of the subfields $\mathbb{R} + \mathbb{R}\mathbf{i}$, $\mathbb{R} + \mathbb{R}\mathbf{j}$, or $\mathbb{R} + \mathbb{R}\mathbf{k}$. Using the first of these embeddings, we write $c \in \mathbb{H}$ as $c = a + b\mathbf{j}$ with $a, b \in \mathbb{C}$, and likewise $u \in \mathbb{H}^n$ as $u = x + y\mathbf{j}$ with $x, y \in \mathbb{C}^n$. The maps $c \mapsto (a, b)$ and $u \mapsto (x, y)$ identify $\mathbb{H}$ with $\mathbb{C}^2$ and $\mathbb{H}^n$ with $\mathbb{C}^{2n}$, and they identify $M_n(\mathbb{H})$ with the real subalgebra of $M_{2n}(\mathbb{C})$ consisting of matrices T such that $TJ = J\bar{T}$, where $J = \begin{pmatrix} 0 & -I_n \\ I_n & 0 \end{pmatrix}$.
The Future Perspective of This Paper
The hope for this paper is that it will help some readers better understand the introductory theory of linear groups, and that for others, like the author, it will serve as an incentive for further work and engagement in science.
NOTES
1. https://www.routledge.com/Linear-Groups-The-Accent-on-Infinite-Dimensionality/Dixon-Kurdachenko-Subbotin/p/book/9781138542808