# Characteristic Functions

Part of a series: Stochastic fundamentals.


## Definition

Let \(X: \Omega \to \mathbb{R}\) be a real random variable. The characteristic function of \(X\) is defined as

\[\begin{split} &\varphi_X : \mathbb{R} \rightarrow \mathbb{C}\\ &\varphi_X (t) = E\left( e^{i t X}\right) \end{split}\]

If \(X\) has PDF \(f_X\), then \(\varphi_X\) is given by the Fourier transform

\[ \varphi_X (t) = \int_{-\infty}^\infty f_X(x) e^{i t x} dx \]
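As a quick sanity check, this integral can be evaluated numerically. Below is a minimal sketch (the helper name `cf_from_pdf` and the integration cutoff are our own choices, not part of any standard API) that integrates the real and imaginary parts separately for the standard normal PDF and compares with the known result \(e^{-t^2/2}\):

```python
import numpy as np
from scipy import integrate

def cf_from_pdf(f, t, lim=40.0):
    """Evaluate phi(t) = int f(x) e^{itx} dx via its real and imaginary parts."""
    re, _ = integrate.quad(lambda x: f(x) * np.cos(t * x), -lim, lim)
    im, _ = integrate.quad(lambda x: f(x) * np.sin(t * x), -lim, lim)
    return re + 1j * im

# Standard normal PDF; its characteristic function is exp(-t^2 / 2).
f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
for t in (0.0, 0.5, 2.0):
    print(t, cf_from_pdf(f, t), np.exp(-t**2 / 2))
```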

## Properties

  1. \(\varphi_X\) always exists and is uniformly continuous on \(\mathbb{R}\).
    (Existence follows from the fact that \(e^{i t x}\) is a bounded continuous function and the probability measure underlying \(E(\cdot)\) is finite. Roughly speaking, \(E(1) = 1\) implies \(E(g(X)) < \infty\) if \(g\) is bounded and continuous.)

  2. \(\varphi_X (0) = E(1) = 1\)

  3. \(|\varphi_X (t)| \leq 1 \quad \forall t \in \mathbb{R}\)

  4. \(\varphi_X (-t) = \varphi_X^\star (t)\)

    Hence the characteristic function of a symmetric real random variable is real-valued and even: if \(X\) and \(-X\) have the same distribution, then \(\varphi_X(t) = \varphi_{-X}(t) = \varphi_X(-t) = \varphi_X^\star(t)\). (These properties are checked numerically in the sketch below.)
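The following Monte Carlo sketch illustrates properties 2 to 4 for a symmetric sample (the seed, sample size, and value of \(t\) are arbitrary choices). The empirical characteristic function is simply the sample mean of \(e^{i t x_j}\):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # a symmetric random variable

def cf_empirical(sample, t):
    """Monte Carlo estimate of E[exp(i t X)]."""
    return np.exp(1j * t * sample).mean()

t = 1.3
print(cf_empirical(x, 0.0))                              # property 2: exactly 1
print(abs(cf_empirical(x, t)))                           # property 3: <= 1
print(cf_empirical(x, -t), np.conj(cf_empirical(x, t)))  # property 4: equal
print(cf_empirical(x, t).imag)                           # symmetric X: ~ 0
```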

Examples#

\[\begin{split} X &\sim \text{Binomial}\, (n,p) & \varphi_X(t) &= \left(1-p+p\,e^{it}\right)^n \\ X &\sim \text{Poi}\, (\lambda) & \varphi_X(t) &= e^{\lambda \left( e^{i t} - 1 \right)} \\ X &\sim \mathcal{N}\, (\mu,\sigma^2) & \varphi_X(t) &= e^{it \mu - \frac 1 2 \sigma^2 t^2} \\ X &\sim \text{Cauchy: } f_X(x) = \frac{1}{\pi \gamma} \frac{\gamma^2}{(x-x_0)^2 + \gamma^2} & \varphi_X(t) &= e^{i t x_0 - \gamma |t|} \\ \end{split}\]
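These closed forms are easy to verify by simulation. A minimal sketch for the Poisson case (the parameter values, seed, and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t = 2.5, 0.8

# Empirical E[exp(itX)] for X ~ Poi(lambda) vs. the closed form e^{lambda(e^{it}-1)}.
x = rng.poisson(lam, size=200_000)
print(np.exp(1j * t * x).mean())
print(np.exp(lam * (np.exp(1j * t) - 1)))
```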

## Why characteristic functions?

Because they are so practical!

Lemma (1) If \(X\) has moments up to order \(k\), then the characteristic function is \(k\) times continuously differentiable on \(\mathbb{R}\) and

\[ \varphi_X^{(n)} (0) = i^n\, E\left[ X^n \right] \quad \text{for all } n \leq k \,. \]

Lemma (2) If \(X_1 \dots X_n\) are independent random variables, then the sum \(\begin{aligned} S = X_1 + X_2 + \dots + X_n\end{aligned}\) has the characteristic function \(\begin{aligned} \varphi_S (t) = \varphi_{X_1} (t) \varphi_{X_2} (t) \dots \varphi_{X_n} (t)\end{aligned}\)

Proof

\[\begin{split} \varphi_S (t) &= E \left[ e^{i t S} \right]\\ &= E \left[ \exp \left( i t \sum_{j=1}^n X_j \right) \right]\\ &= E \left[ \prod_{j=1}^n e^{i t X_j} \right]\\ &= \prod_{j=1}^n E \left[ e^{i t X_j} \right]\\ &= \prod_{j=1}^n \varphi_{X_j} (t) \end{split}\]

where the expectation factorizes in the fourth line precisely because the \(X_j\) are independent.
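The product formula is also easy to see numerically. A small sketch (the two distributions are arbitrary choices) compares the empirical characteristic function of \(S = X_1 + X_2\) with the product \(\varphi_{X_1}(t)\,\varphi_{X_2}(t)\):

```python
import numpy as np

rng = np.random.default_rng(2)
n, t = 200_000, 0.7

x1 = rng.standard_normal(n)   # X_1 ~ N(0, 1)
x2 = rng.exponential(1.0, n)  # X_2 ~ Exp(1), independent of X_1

cf = lambda sample: np.exp(1j * t * sample).mean()
print(cf(x1 + x2))       # phi_S(t)
print(cf(x1) * cf(x2))   # phi_X1(t) * phi_X2(t)
```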

## Cumulants

According to Lemma (1), we know that (assuming everything exists)

\[ E \left[ X^n \right] = (-i)^{n} \varphi_X^{(n)} (0) \,. \]

Thus, writing the characteristic function as a Taylor series

\[\begin{split} \varphi_X (t) &= E \left[ e^{i t X} \right]\\ &= E \left[ \sum_{n = 0}^\infty \frac{1}{n!} (i t)^n X^n\right]\\ &= \sum_{n = 0}^\infty \frac{(i t)^n}{n!} E \left[ X^n\right] \,.\\ \end{split}\]

Hence, the characteristic function “generates” the moments (or rather \(i^n E[X^n]\)).
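To make this concrete, one can recover moments by differentiating a known characteristic function at \(t = 0\) with finite differences. A sketch for \(X \sim \mathcal{N}(0,1)\), where \(\varphi_X(t) = e^{-t^2/2}\) (the step size \(h\) is an arbitrary choice):

```python
import numpy as np

# Exact characteristic function of N(0, 1).
phi = lambda t: np.exp(-t**2 / 2)

h = 1e-3
d1 = (phi(h) - phi(-h)) / (2 * h)            # central difference for phi'(0)
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2  # central difference for phi''(0)

print((-1j) * d1)     # E[X]   = (-i)^1 phi'(0)  ~ 0
print((-1j)**2 * d2)  # E[X^2] = (-i)^2 phi''(0) ~ 1
```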

Similarly, we can expand \(\log \varphi_X(t)\) in a Taylor series: \(\begin{aligned} \log \left(\varphi_X(t)\right) = \sum_{n=1}^\infty \kappa_n \frac{(i t)^n}{n!} \end{aligned}\) (the \(n = 0\) term vanishes because \(\varphi_X(0) = 1\)).
The expansion coefficients \(\kappa_n\) are called the cumulants.

## Why cumulants?

Because they are often easier to handle than moments. Consider the independent random variables \(X\), \(Y\) and set \(S = X + Y\). Then \(\begin{aligned} \varphi_S (t) &= \varphi_X (t) \varphi_Y (t)\\ \Rightarrow~ \log\left( \varphi_S (t) \right) &= \log\left( \varphi_X (t) \right) + \log\left( \varphi_Y (t) \right)\\ \Rightarrow~ \kappa_n (S) &= \kappa_n (X) + \kappa_n (Y) \end{aligned}\) That is, cumulants are additive for sums of independent random variables.
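This additivity can be checked with sample cumulants: `scipy.stats.kstat` provides unbiased cumulant estimators up to fourth order. A sketch with arbitrarily chosen distributions and sample sizes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.exponential(2.0, 500_000)  # Exp(scale=2): kappa_n = (n-1)! * 2^n
y = rng.standard_normal(500_000)   # N(0,1): kappa_1 = 0, kappa_2 = 1, kappa_n = 0 for n >= 3
s = x + y

# kstat(data, n) estimates the n-th cumulant; check kappa_n(S) = kappa_n(X) + kappa_n(Y).
for n in (1, 2, 3):
    print(n, stats.kstat(s, n), stats.kstat(x, n) + stats.kstat(y, n))
```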

Cumulants and moments are related by the Bell polynomials. The lowest orders are familiar: \(\kappa_1 = E[X]\) is the mean, \(\kappa_2 = \operatorname{Var}(X)\) is the variance, and \(\kappa_3\) equals the third central moment.

## Authors

Philipp Böttcher, Dirk Witthaut