# Basics of Graph Convolutional Networks

This article discusses the convolution operation on the graph domain and some spectral-based graph convolutional networks.

# Overview

Recently, Graph Neural Networks (GNNs) have attracted increasing attention because of the great expressive power of graphs, and many GNN methods have demonstrated ground-breaking performance on a variety of tasks.

There are two fundamental motivations behind GNNs. The first is rooted in Convolutional Neural Networks (CNNs), which have the ability to generate expressive representations and extract multi-scale localized spatial features. The other comes from graph embedding, which suffers from two severe drawbacks: no parameter sharing, which leads to computational inefficiency, and a lack of generalization ability.

In this article, we demonstrate the basics of how to introduce CNNs into the graph domain. The outline is …

# Discrete Fourier Transform (DFT)

We will start from the definition of the DFT and explain how to use it to perform convolution.
Assume we have a function $f(t)$ sampled at $N$ points. Then the DFT of this function $f(t)$ is defined as follows:

$$\hat{f}(\omega)=\sum_{t=0}^{N-1} f(t)\,e^{-j\frac{2\pi}{N}\omega t},\qquad \omega=0,1,\dots,N-1,$$
where $\hat{f}(\omega)$ is the DFT of $f(t)$.
And the inverse DFT of $\hat{f}(\omega)$ is given by

$$f(t)=\frac{1}{N}\sum_{\omega=0}^{N-1}\hat{f}(\omega)\,e^{j\frac{2\pi}{N}\omega t}.$$
The (circular) convolution of two functions $f$ and $g$ is defined by

$$(f*g)(t)=\sum_{\tau=0}^{N-1} f(\tau)\,g(t-\tau),$$

where the index $t-\tau$ is taken modulo $N$.
And the convolution theorem gives

$$\widehat{f*g}(\omega)=\hat{f}(\omega)\,\hat{g}(\omega),$$

i.e., convolution in the time domain is equivalent to point-wise multiplication in the spectral domain.
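The convolution theorem is easy to verify numerically. The following is a minimal NumPy sketch; the signals `f` and `g` and the length `N` are arbitrary illustrative choices:

```python
import numpy as np

# Two arbitrary length-N signals for illustration.
N = 8
rng = np.random.default_rng(0)
f = rng.standard_normal(N)
g = rng.standard_normal(N)

# Direct circular convolution: (f * g)(t) = sum_tau f(tau) g((t - tau) mod N)
direct = np.array([sum(f[tau] * g[(t - tau) % N] for tau in range(N))
                   for t in range(N)])

# Convolution theorem: transform, multiply point-wise, transform back.
via_dft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

print(np.allclose(direct, via_dft))  # True
```

This equivalence is also why FFT-based convolution is faster than the direct sum for long signals: the transform route costs $O(N\log N)$ instead of $O(N^2)$.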
# Laplacian Matrix of Graphs

For an unweighted graph $\mathcal{G}=(\mathcal{V},\mathcal{E})$ with adjacency matrix $\mathbf{A}$ and degree matrix $\mathbf{D}$, its Laplacian matrix is defined as $\mathbf{L}=\mathbf{D}-\mathbf{A}$. And the symmetric normalized Laplacian matrix is defined as $\mathbf{L}^{sym}=\mathbf{D}^{-1/2}\mathbf{L}\mathbf{D}^{-1/2}=\mathbf{I}-\mathbf{D}^{-1/2}\mathbf{A}\mathbf{D}^{-1/2}$, which is used in most of the literature; hence we will use $\mathbf{L}$ to denote $\mathbf{L}^{sym}$ for short.
The Laplacian matrix is a real symmetric positive semi-definite matrix, which means it admits an eigendecomposition and can be expressed as

$$\mathbf{L}=\mathbf{U}\mathbf{\Lambda}\mathbf{U}^T,\qquad \mathbf{\Lambda}=\mathrm{diag}(\lambda_0,\lambda_1,\dots,\lambda_{N-1}),$$

where $\mathbf{U}=[\mathbf{u}_0,\mathbf{u}_1,\dots,\mathbf{u}_{N-1}]$ is the eigenvector matrix and $\lambda_l$ is the $l$th smallest eigenvalue of $\mathbf{L}$, so that $0=\lambda_0\le\lambda_1\le\dots\le\lambda_{N-1}$. Since $\mathbf{L}$ is a real symmetric matrix, $\mathbf{U}$ must be orthogonal, which means its columns form an orthonormal basis. Besides, we can verify that all eigenvalues are non-negative. For an eigenvalue $\lambda$ with unit-norm eigenvector $\mathbf{u}$ we have

$$\lambda=\mathbf{u}^T\mathbf{L}\mathbf{u}=\frac{1}{2}\sum_{i,j}\mathbf{A}_{ij}\left(\frac{u_i}{\sqrt{d_i}}-\frac{u_j}{\sqrt{d_j}}\right)^2\ge 0,$$

where $d_i$ denotes the degree of vertex $i$.
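These properties can be checked numerically. The sketch below uses a hypothetical 4-vertex path graph (0–1–2–3) purely as an example:

```python
import numpy as np

# Adjacency matrix of a hypothetical 4-vertex path graph: 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)                       # vertex degrees
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

# Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

# Eigendecomposition L = U diag(lam) U^T; eigh returns ascending eigenvalues.
lam, U = np.linalg.eigh(L)

print(np.all(lam >= -1e-10))            # True: positive semi-definite
print(np.allclose(U @ U.T, np.eye(4)))  # True: U is orthogonal
print(np.isclose(lam[0], 0.0))          # True: connected graph has lambda_0 = 0
```

Note that `np.linalg.eigh` is the right routine here: it exploits the symmetry of $\mathbf{L}$ and returns real eigenvalues in ascending order with orthonormal eigenvectors.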
In DFT, we are transforming the function from time domain into spectral domain, where $\omega$ can be seen as the spectrum and $e^{-j\omega t}$ can be seen as a basis of the spectral domain. Therefore, in graph domain, we can define $\lambda$ as the spectrum and the eigenvectors as a basis, through which we can apply convolutional operation in graph spectral domain.
Analogously to the DFT, we can define the graph Fourier transform $\hat{\mathbf{f}}$ of any function $\mathbf{f}\in\mathbb{R}^N$ on the vertices of $\mathcal{G}$ as follows:

$$\hat{f}(\lambda_l)=\sum_{i=1}^{N} f(i)\,\mathbf{u}_l(i),$$
where $\mathbf{u}_l(i)$ denotes the $i$th element of the $l$th eigenvector. And the inverse graph Fourier transform is then given by

$$f(i)=\sum_{l=0}^{N-1}\hat{f}(\lambda_l)\,\mathbf{u}_l(i).$$
In matrix form, we have

$$\hat{\mathbf{f}}=\mathbf{U}^T\mathbf{f}$$

and

$$\mathbf{f}=\mathbf{U}\hat{\mathbf{f}}.$$
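A minimal round-trip sketch of the matrix-form transform and its inverse, again on a hypothetical 4-vertex path graph and an arbitrary illustrative signal:

```python
import numpy as np

# Hypothetical 4-vertex path graph 0-1-2-3, as in the earlier sketch.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt
lam, U = np.linalg.eigh(L)

f = np.array([1.0, -2.0, 0.5, 3.0])  # an arbitrary signal on the vertices

f_hat = U.T @ f     # graph Fourier transform
f_back = U @ f_hat  # inverse graph Fourier transform

print(np.allclose(f_back, f))  # True: U is orthogonal, so U U^T = I
```

The round trip is exact (up to floating point) precisely because $\mathbf{U}$ is orthogonal; this is the graph analogue of the DFT/inverse-DFT pair.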
In classical Fourier analysis, $\omega$ carries a specific notion of frequency: for $\omega$ close to zero (low frequencies), the associated eigenfunction $e^{-j\omega t}$ is a smooth, slowly oscillating function, whereas for $\omega$ far from zero (high frequencies), the associated eigenfunction oscillates much more rapidly. In the graph setting, the graph Laplacian eigenvalues and eigenvectors provide a similar notion of frequency. For connected graphs, the eigenvector $\mathbf{u}_0$ associated with the eigenvalue 0 is constant: for the unnormalized Laplacian it is an all-one vector (up to scaling), and for $\mathbf{L}^{sym}$ it is proportional to $\mathbf{D}^{1/2}\mathbf{1}$. The graph Laplacian eigenvectors associated with low frequencies $\lambda_l$ vary slowly across the graph, whereas the eigenvectors associated with large eigenvalues oscillate more rapidly.
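This notion of frequency can be made concrete: for a unit-norm eigenvector $\mathbf{u}_l$, its variation across the graph (the Dirichlet energy) equals $\mathbf{u}_l^T\mathbf{L}\mathbf{u}_l=\lambda_l$, so larger eigenvalues really do mean faster oscillation. A sketch on the same hypothetical 4-vertex path graph:

```python
import numpy as np

# Hypothetical 4-vertex path graph 0-1-2-3, as in the earlier sketches.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - np.diag(d ** -0.5) @ A @ np.diag(d ** -0.5)
lam, U = np.linalg.eigh(L)

# Dirichlet energy of each eigenvector u_l:
# (1/2) * sum_{i,j} A_ij (u_i/sqrt(d_i) - u_j/sqrt(d_j))^2 = u_l^T L u_l = lambda_l
energies = []
for l in range(4):
    u = U[:, l]
    energy = 0.5 * sum(A[i, j] * (u[i] / np.sqrt(d[i]) - u[j] / np.sqrt(d[j])) ** 2
                       for i in range(4) for j in range(4))
    energies.append(energy)

print(np.allclose(energies, lam))  # True: eigenvalue == variation of eigenvector
```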
Adopting the convolution theorem, for two functions $\mathbf{f}$ and $\mathbf{h}$ defined on the vertices of $\mathcal{G}$, we can obtain the convolution in the graph spectral domain as follows:

$$\widehat{\mathbf{f}*\mathbf{h}}=\hat{\mathbf{f}}\odot\hat{\mathbf{h}}=(\mathbf{U}^T\mathbf{f})\odot(\mathbf{U}^T\mathbf{h}),$$

where $\odot$ denotes the element-wise (Hadamard) product. Then, by transforming it back into the spatial domain, we get

$$\mathbf{f}*\mathbf{h}=\mathbf{U}\big((\mathbf{U}^T\mathbf{f})\odot(\mathbf{U}^T\mathbf{h})\big).$$
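A minimal sketch of this spectral graph convolution, with illustrative signals on a hypothetical 4-vertex path graph (the helper name `graph_convolve` is our own choice, not a standard API):

```python
import numpy as np

def graph_convolve(f, h, U):
    """Spectral graph convolution: transform both signals with U^T,
    multiply element-wise, and transform back with U."""
    return U @ ((U.T @ f) * (U.T @ h))

# Hypothetical 4-vertex path graph 0-1-2-3, as in the earlier sketches.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt
_, U = np.linalg.eigh(L)

f = np.array([1.0, 0.0, 0.0, 0.0])  # a delta signal on vertex 0
h = np.array([0.5, 0.5, 0.5, 0.5])  # an arbitrary filter signal

out = graph_convolve(f, h, U)
print(out.shape)  # (4,)

# The Hadamard product commutes, so graph convolution is commutative:
print(np.allclose(graph_convolve(f, h, U), graph_convolve(h, f, U)))  # True
```

In spectral GNNs, the filter $\hat{\mathbf{h}}$ is not computed from a spatial signal but learned directly (or parameterized as a function of the eigenvalues), which is where the next section picks up.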
# Spectral Based GNNs

To be continued …