### Characters and Expansion

*In which we look at the representation theory of Abelian groups and at the discrete Fourier transform, we completely understand the eigenvalues and eigenvectors of the Cayley graph of an Abelian group, and we apply this powerful theory to prove that the cycle is a connected graph.*

When we talked about group representations, we saw how to map each element of a group into a matrix in such a way that group addition becomes matrix multiplication.

In general, if $G$ is a finite group, then an *n-dimensional representation* of $G$ is a map

$\rho: G \rightarrow {\mathbb C}^{n\times n}$

such that $\rho$ is a group homomorphism between $G$ and the group of matrices $\rho(G)$. That is, we have that $\rho(g)$ is an invertible matrix for every $g$, $\rho(0)=I$, $\rho(a+b) = \rho(a) \times \rho(b)$ and $\rho(-g) = (\rho(g))^{-1}$.

(Scott will be delighted to know that the theory need not stop at finite matrices, and one may also look at representations mapping group elements into invertible linear transformations over an infinite-dimensional Hilbert space. To study finite groups, however, finite-dimensional representations are good enough.)

Note that $\rho$ need not be injective, and so, for every group, we can always define the 1-dimensional representation $\rho(g) = 1$ for every $g$. Perhaps unsurprisingly, this is usually called the *trivial* representation. The precise statement of a theorem of Frobenius that we mentioned earlier is that every non-trivial representation of $PSL(2,p)$ has dimension at least $(p-1)/2$.

In an Abelian group, however, there are plenty of 1-dimensional representations, and, in fact, all the information about the group is "encoded" in the set of its 1-dimensional representations. We will call a 1-dimensional representation a *character* of the group. (This equivalence holds only for finite groups; otherwise we need an extra condition.) Hence a character of a group $G$ is a mapping

$\chi: G \rightarrow {\mathbb C}$

such that $\chi(g) \neq 0$ for every $g$, $\chi(0)=1$, $\chi(a+b) = \chi(a)\cdot \chi(b)$ and $\chi(-g) = (\chi(g))^{-1}$. In a finite group $G$, we always have the property that if we add an element to itself $|G|$ times we get $0$; from this we can deduce that, if $\chi()$ is a character, then $\chi(g)$ must be a $|G|$-th root of 1, that is, it must be a number of the form $e^{2\pi x i/|G|}$ for some $x$. In particular, $|\chi(g)|=1$ for every $g$.
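As a quick sanity check, here is a small Python sketch of these axioms on the cyclic group ${\mathbb Z}_{12}$; the specific formula $\chi(x) = e^{2\pi i x/12}$ used here is an assumption at this point (it is one of the characters constructed later in the post).

```python
import cmath

# One character of Z_12, taken on faith for now: chi(x) = e^{2*pi*i*x/12}.
N = 12
def chi(x):
    return cmath.exp(2j * cmath.pi * x / N)

for g in range(N):
    # chi(g) is a |G|-th root of 1 ...
    assert abs(chi(g) ** N - 1) < 1e-9
    # ... and in particular |chi(g)| = 1.
    assert abs(abs(chi(g)) - 1) < 1e-9

# chi is a homomorphism: chi(a+b) = chi(a) * chi(b), with addition mod N.
for a in range(N):
    for b in range(N):
        assert abs(chi((a + b) % N) - chi(a) * chi(b)) < 1e-9
```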

(For general groups, we say that $\chi$ is a character if it is a 1-dimensional representation *and* it has the property that $|\chi(g)|=1$ for every $g$. While the latter property is a consequence of the former for finite groups and, more generally, for torsion groups, this is not the case for other groups. For example, $\chi(x) := e^{x}$ is a 1-dimensional representation of $\mathbb Z$ but it is not a character.)

At first sight, it might look like a finite group could have an infinite number of characters (or at least that's what I thought when I first saw the definition). There is, however, a simple argument showing that a finite group $G$ cannot have more than $|G|$ characters.

Consider the set of all functions of the form $f: G \rightarrow {\mathbb C}$; we can think of it as a $|G|$-dimensional vector space over $\mathbb C$. We have that the characters of $G$ are linearly independent, and so there cannot be more than $|G|$ of them. In fact, even more is true: the characters of $G$ are *orthogonal* to each other. For two functions $f,g$, define their inner product as

$(f,g) := \frac1 {|G|} \sum_{a\in G} f(a) \overline{g(a)} $

Then, if $\chi$ and $\eta$ are two different characters, we have

$(\chi,\eta) = \frac1 {|G|} \sum_{a\in G} \chi(a) \overline{\eta(a)}=0 $

The normalization in the definition of inner product is convenient because then we also have $(\chi,\chi)=1$ if $\chi$ is a character. (We will not prove these claims, but the proofs are very simple.)

We will now show that the cyclic group ${\mathbb Z}_N$ has indeed $N$ distinct characters $\chi_0,\ldots,\chi_{N-1}$. This means that this set of characters forms an orthonormal basis for the set of functions $f: {\mathbb Z}_N \rightarrow {\mathbb C}$, and that each such function can be written as a linear combination

$f(x) = \sum_{c=0}^{N-1} \hat f(c) \chi_c (x)$ (*)

where the coefficients $\hat f(c)$ can be computed as the inner product

$\hat f(c) = (f,\chi_c)$

(The equation (*) is the Fourier expansion of $f$, and the function $\hat f()$ is called the Fourier transform of $f$.)

Here is how to define the characters $\chi_c$: for each $c$, define $\chi_c(x):= e^{2\pi i c x /N}$. It is immediate to verify that each such function is a character and that, for $c=0,\ldots,N-1$, we are indeed defining $N$ different functions. By the previously mentioned results, there are no others.
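A short Python sketch can confirm both the orthonormality of the $\chi_c$ and the Fourier expansion (*), for a small $N$ and an arbitrarily chosen test function (the quadratic below is just an example):

```python
import cmath

# The characters of Z_N and the inner product from the post.
N = 8
def chi(c, x):
    return cmath.exp(2j * cmath.pi * c * x / N)

def inner(f, g):
    # (f,g) = (1/|G|) * sum_a f(a) * conj(g(a))
    return sum(f(a) * g(a).conjugate() for a in range(N)) / N

# Orthonormality: (chi_c, chi_d) is 1 if c == d and 0 otherwise.
for c in range(N):
    for d in range(N):
        expected = 1 if c == d else 0
        ip = inner(lambda x: chi(c, x), lambda x: chi(d, x))
        assert abs(ip - expected) < 1e-9

# Fourier expansion of an arbitrary test function f: Z_N -> C.
f = lambda x: complex(x * x - 3 * x + 1)
fhat = [inner(f, lambda x, c=c: chi(c, x)) for c in range(N)]
for x in range(N):
    reconstruction = sum(fhat[c] * chi(c, x) for c in range(N))
    assert abs(reconstruction - f(x)) < 1e-6
```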

Let's see one more example: consider $({\mathbb Z}_2)^n$, that is, $\{0,1\}^n$ with the operation of bit-wise XOR. For each ${\mathbf a} = (a_1,\ldots,a_n)\in \{0,1\}^n$, define the function

$\chi_{\mathbf a} (x_1,\ldots,x_n) := (-1)^{\sum_i a_i x_i }$

We can again verify that all these $2^n$ functions are distinct and that each of them is a character, so they include all the characters of $({\mathbb Z}_2)^n$. The reader should be able to see the pattern and to reconstruct the $N_1\times \cdots \times N_k$ characters of the group

${\mathbb Z}_{N_1} \times \cdots \times {\mathbb Z}_{N_k}$

thus having a complete understanding of the set of characters of any given finite Abelian group.
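For $({\mathbb Z}_2)^n$ the verification is an exact, finite computation; here is a sketch for a small $n$, checking the homomorphism property with respect to bit-wise XOR and the orthogonality of distinct characters:

```python
from itertools import product

# Enumerate (Z_2)^n for a small n and check the 2^n functions chi_a.
n = 3
group = list(product((0, 1), repeat=n))

def chi(a, x):
    # chi_a(x) = (-1)^(sum_i a_i * x_i)
    return (-1) ** sum(ai * xi for ai, xi in zip(a, x))

def xor(x, y):
    return tuple(xi ^ yi for xi, yi in zip(x, y))

# Each chi_a is a homomorphism with respect to bit-wise XOR.
for a in group:
    for x in group:
        for y in group:
            assert chi(a, xor(x, y)) == chi(a, x) * chi(a, y)

# Distinct characters have inner product 0; each has norm 1.
for a in group:
    for b in group:
        ip = sum(chi(a, x) * chi(b, x) for x in group) / len(group)
        assert ip == (1 if a == b else 0)
```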

At long last, we can get to the subject of expansion.

Let $A$ be a finite Abelian group, and let $S$ be a set of generators such that if $a\in S$ then $-a\in S$. Define the Cayley graph $G=(A,E)$ where every element of $A$ is a vertex and where $(a,b)$ is an edge if $a-b\in S$. This is clearly an $|S|$-regular undirected graph. Let $\chi_1,\ldots,\chi_{|A|}$ be the set of characters of $A$. Think of each character $\chi_i$ as an $|A|$-dimensional vector (the vector that has the value $\chi_i(a)$ in the $a$-th coordinate). Then these vectors are orthogonal to each other. We claim that they are also *eigenvectors* for the adjacency matrix of $G$.

Note a surprising fact here: the graph $G$ depends on the group $A$ and on the choice of generators $S$. The eigenvectors of the adjacency matrix, however, depend on $A$ alone. (The eigenvalues, of course, will depend on $S$.)

The proof of the claim is immediate. Let $M$ be the adjacency matrix of $G$, then the $b$-th entry of the vector $\chi_i \times M$ is

$\sum_{a\in A} \chi_i(a) M(a,b) = \sum_{s\in S} \chi_i(b+s) =$

$ \sum_{s\in S} \chi_i(b)\chi_i (s) = \chi_i (b) \sum_{s\in S} \chi_i(s)$

We are done! The vector $\chi_i$ is indeed an eigenvector, of eigenvalue $\sum_{s\in S} \chi_i(s)$. Since we have $|A|$ eigenvectors, and they are linearly independent, we have found all eigenvalues, with multiplicities.
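The computation above is easy to check numerically. The following sketch builds the adjacency matrix of a Cayley graph of ${\mathbb Z}_N$ (the particular choices $N=10$ and $S=\{\pm 1, \pm 3\}$ are arbitrary) and verifies that each character is an eigenvector with eigenvalue $\sum_{s\in S}\chi_c(s)$:

```python
import cmath

# Cayley graph of Z_N with a generator set S closed under negation.
N = 10
S = [1, -1, 3, -3]

def chi(c, x):
    return cmath.exp(2j * cmath.pi * c * x / N)

# Adjacency matrix: M[a][b] = 1 iff a - b is in S (mod N).
S_mod = [s % N for s in S]
M = [[1 if (a - b) % N in S_mod else 0 for b in range(N)]
     for a in range(N)]

for c in range(N):
    v = [chi(c, a) for a in range(N)]
    # b-th entry of the row vector v * M, as in the post's derivation
    Mv = [sum(M[a][b] * v[a] for a in range(N)) for b in range(N)]
    lam = sum(chi(c, s) for s in S)
    for b in range(N):
        assert abs(Mv[b] - lam * v[b]) < 1e-9
```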

Let's look at the cycle with $N$ vertices. It is the Cayley graph of ${\mathbb Z}_N$ with generators $\{-1,1\}$. We have $N$ eigenvalues, one for each $c=0,\ldots,N-1$, and we have

$\lambda_c = \chi_c(-1) + \chi_c(1) = e^{-2\pi c i/N} + e^{2\pi c i/N}$

this may seem to contradict our earlier claim that all eigenvalues were going to be real. But two lines of trigonometry show that, in fact,

$\lambda_c = 2 \cos (2\pi c/N)$

As expected in a 2-regular graph, the largest eigenvalue is $\lambda_0 = 2\cos(0)=2$. The second largest eigenvalue, however, is $2\cos(2\pi /N)$, which is $2-\Theta(1/N^2)$. This means that the expansion of the cycle is at least $\Omega(1/N^2)$ and, in particular, the cycle is a connected graph!
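A few lines of Python confirm both halves of this computation: the sum of the two conjugate exponentials is real and equals $2\cos(2\pi c/N)$, and the spectral gap $2 - 2\cos(2\pi/N)$ is indeed of order $1/N^2$ (using $2 - 2\cos t \approx t^2$ for small $t$):

```python
import cmath
import math

# Eigenvalues of the N-cycle: chi_c(-1) + chi_c(1) = 2*cos(2*pi*c/N).
N = 100
for c in range(N):
    lam = cmath.exp(-2j * cmath.pi * c / N) + cmath.exp(2j * cmath.pi * c / N)
    assert abs(lam.imag) < 1e-9                                 # real, as claimed
    assert abs(lam.real - 2 * math.cos(2 * math.pi * c / N)) < 1e-9

# The spectral gap is Theta(1/N^2): 2 - 2*cos(t) is about t^2 for small t.
gap = 2 - 2 * math.cos(2 * math.pi / N)
assert 0.5 * (2 * math.pi / N) ** 2 < gap <= (2 * math.pi / N) ** 2
```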

(After we are done having a good laugh at the expense of algebraic graph theory, I should add that this calculation, plus some relatively easy facts, implies that a random walk on the cycle mixes in $O(N^2)$ steps. Not only is this the right bound, but it is also something that does not have a completely straightforward alternative proof.)

We do a little better with the hypercube. The hypercube is the Cayley graph of the group $({\mathbb Z}_2)^n$ with the generator set that contains all the vectors that have precisely one 1. Now we have one eigenvalue for each choice of $(a_1,\ldots,a_n)$, and the corresponding eigenvalue is

$\sum_{i=1}^n (-1)^{a_i}$.

The largest eigenvalue, when all the $a_i$ are zero, is $n$, as expected in an $n$-regular graph. The second largest eigenvalue, however, is at most $n-2$. This means that the expansion of the hypercube is at least 1. (As it happens, it is precisely 1, as shown by a dimension cut.)
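Since the eigenvalue for $(a_1,\ldots,a_n)$ is $n$ minus twice the number of ones among the $a_i$, the whole spectrum of the hypercube can be enumerated directly; a small sketch:

```python
from itertools import product

# Hypercube spectrum: one eigenvalue sum_i (-1)^{a_i} = n - 2*weight(a)
# for each a in {0,1}^n.
n = 4
eigenvalues = sorted(
    (sum((-1) ** ai for ai in a) for a in product((0, 1), repeat=n)),
    reverse=True)

assert eigenvalues[0] == n       # the all-zero a gives eigenvalue n
assert eigenvalues[1] == n - 2   # every nonzero a has weight >= 1
assert eigenvalues[0] - eigenvalues[1] == 2   # gap of 2, independent of n
```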

## 4 Comments:

12/23/2006 04:42:00 PM

Thanks for these posts!

Can you sketch the proof that the characters of an arbitrary finite group are orthogonal?

12/23/2006 04:57:00 PM

First we note that, if $c$ is a complex number such that $|c|=1$, then $\overline{c} = c^{-1}$. Now, suppose that $\chi()$ and $\eta()$ are two different characters, then proving that they are orthogonal is the same as proving

$\sum_x \chi(x) \eta^{-1}(x) = 0$

Next, we see that the mapping of $x$ into $\chi(x)\eta^{-1}(x)$ is also a character, and that, since $\chi$ and $\eta$ are distinct, it is not the always-1 character. So it remains to prove that if $\alpha$ is a non-trivial character then

$\sum_x \alpha(x) = 0$

Let $b$ be such that $\alpha(b)\neq 1$. We are going to prove that

$(1-\alpha(b))\sum_x \alpha(x) = 0$

which gives the desired result. But this is equivalent to

$\sum_x \alpha(x) = \sum_x \alpha(x+b)$

which is clearly true.
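[For the numerically inclined, a small Python sketch of this shifting trick on one concrete non-trivial character of ${\mathbb Z}_7$ (the choice of group and character is arbitrary):]

```python
import cmath

# A non-trivial character of Z_7 (the one with c = 1).
N = 7
def alpha(x):
    return cmath.exp(2j * cmath.pi * x / N)

total = sum(alpha(x) for x in range(N))
b = 1
shifted = sum(alpha((x + b) % N) for x in range(N))

assert abs(total - shifted) < 1e-9   # the sum is invariant under shifting by b
assert abs(alpha(b) - 1) > 1e-9      # alpha(b) != 1, so (1 - alpha(b)) != 0
assert abs(total) < 1e-9             # hence the sum itself must vanish
```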

12/24/2006 11:12:00 AM

In his introductory lecture on representation theory in his graduate algebra class, Elkies jokingly dismissed Fourier analysis as merely one-dimensional representation theory. He had us compute the character table for finite abelian groups on the first problem on the homework. That was also the semester when I first encountered Fourier analysis in a required course on signals and systems.

By then, I had also learnt to partition my wardrobe into two sections: one for Mondays, Wednesdays and Fridays, when I had classes at Harvard, and the other for the rest of the week. T-shirts bearing company logos that I had picked up at career fairs fall strictly into the latter.

12/27/2006 06:35:00 PM

Thanks for these very informative posts!
