Characteristic Functions
Part of a series: Stochastic fundamentals.
Definition
Let \(X: \Omega \to \mathbb{R}\) be a real random variable. The characteristic function of \(X\) is defined as
\(\begin{aligned} \varphi_X (t) = E\left[ e^{i t X} \right], \quad t \in \mathbb{R}. \end{aligned}\)
If \(X\) has PDF \(f_X\), then \(\varphi_X\) is given by the Fourier transform
\(\begin{aligned} \varphi_X (t) = \int_{-\infty}^{\infty} e^{i t x} f_X (x) \, \mathrm{d}x. \end{aligned}\)
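As a quick numerical sketch (the distribution and grid here are my own choice, not from the text), we can approximate the Fourier integral for \(X \sim \mathrm{Uniform}(-1, 1)\) with a Riemann sum and compare against the closed form \(\varphi_X(t) = \sin(t)/t\):

```python
import numpy as np

# Approximate phi_X(t) = int e^{itx} f_X(x) dx for X ~ Uniform(-1, 1)
# by a Riemann sum and compare with the closed form sin(t)/t.
x = np.linspace(-1.0, 1.0, 200_001)
dx = x[1] - x[0]
pdf = np.full_like(x, 0.5)                 # f_X(x) = 1/2 on [-1, 1]

def cf_numeric(t):
    """Riemann-sum approximation of the Fourier integral."""
    return np.sum(np.exp(1j * t * x) * pdf) * dx

t = 2.0
print(abs(cf_numeric(t) - np.sin(t) / t))  # small discretisation error
```

The same recipe works for any density on a bounded grid; only the `pdf` array and the comparison formula change.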
Properties
\(\varphi_X\) always exists and is uniformly continuous on \(\mathbb{R}\).
(Existence follows from the fact that \(e^{i t x}\) is a bounded continuous function and the measure \(E(\cdot)\) is finite. Roughly speaking, \(E(1) = 1\) implies \(E(g(X)) < \infty\) if \(g\) is bounded and continuous.)
\(\varphi_X (0) = E(1) = 1\)
\(|\varphi_X (t)| \leq 1 \quad \forall t \in \mathbb{R}\)
\(\varphi_X (-t) = \varphi_X^\star (t)\), where \(\star\) denotes the complex conjugate.
Hence the characteristic function of a symmetric real random variable is real-valued and even.
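These properties are easy to see on the empirical characteristic function \(\hat\varphi(t) = \frac{1}{n}\sum_k e^{i t X_k}\) of simulated samples (the exponential distribution below is an arbitrary choice for illustration):

```python
import numpy as np

# Check the listed properties on the empirical characteristic function
# of 100k simulated exponential samples.
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=100_000)

def ecf(t):
    """Empirical characteristic function: mean of exp(i t X_k)."""
    return np.mean(np.exp(1j * t * samples))

print(ecf(0.0))                                  # exactly 1
print(abs(ecf(3.7)) <= 1.0)                      # |phi(t)| <= 1
print(np.isclose(ecf(-3.7), np.conj(ecf(3.7))))  # phi(-t) = conj(phi(t))
```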
Examples
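A standard example (my addition, not from the text): for \(X \sim \mathcal{N}(0, 1)\) the characteristic function is \(\varphi_X(t) = e^{-t^2/2}\). A Monte Carlo estimate of \(E[e^{itX}]\) matches it up to sampling error:

```python
import numpy as np

# Monte Carlo check of phi_X(t) = exp(-t^2/2) for X ~ N(0, 1).
rng = np.random.default_rng(42)
x = rng.standard_normal(500_000)

for t in (0.5, 1.0, 2.0):
    phi_mc = np.mean(np.exp(1j * t * x))    # empirical E[exp(itX)]
    phi_exact = np.exp(-t**2 / 2)
    print(t, abs(phi_mc - phi_exact))       # ~ 1/sqrt(n) sampling error
```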
Why characteristic functions?
Because they are so practical!
Lemma (1) If \(X\) has moments up to order \(k\), then the characteristic function is \(k\) times continuously differentiable on \(\mathbb{R}\) and
\(\begin{aligned} \varphi_X^{(k)} (t) = i^k\, E\left[ X^k e^{i t X} \right]. \end{aligned}\)
Lemma (2) If \(X_1, \dots, X_n\) are independent random variables, then the sum \(\begin{aligned} S = X_1 + X_2 + \dots + X_n\end{aligned}\) has the characteristic function \(\begin{aligned} \varphi_S (t) = \varphi_{X_1} (t) \varphi_{X_2} (t) \dots \varphi_{X_n} (t)\end{aligned}\)
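Lemma (2) in action (my example): with \(\varphi_\lambda(t) = \exp\!\left(\lambda (e^{it} - 1)\right)\) for a Poisson variable, the product of the characteristic functions of \(\mathrm{Poisson}(\lambda_1)\) and \(\mathrm{Poisson}(\lambda_2)\) equals the characteristic function of \(\mathrm{Poisson}(\lambda_1 + \lambda_2)\), i.e. the sum of independent Poissons is again Poisson:

```python
import numpy as np

# Characteristic function of Poisson(lam): exp(lam * (e^{it} - 1)).
def phi_poisson(t, lam):
    return np.exp(lam * (np.exp(1j * t) - 1.0))

l1, l2 = 0.7, 2.3
t = np.linspace(-5, 5, 101)
lhs = phi_poisson(t, l1) * phi_poisson(t, l2)   # product of the two CFs
rhs = phi_poisson(t, l1 + l2)                   # CF of Poisson(l1 + l2)
print(np.max(np.abs(lhs - rhs)))                # zero up to rounding
```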
Proof
By independence, the expectation of the product factorizes:
\(\begin{aligned} \varphi_S (t) = E\left[ e^{i t (X_1 + \dots + X_n)} \right] = E\left[ \prod_{j=1}^n e^{i t X_j} \right] = \prod_{j=1}^n E\left[ e^{i t X_j} \right] = \prod_{j=1}^n \varphi_{X_j} (t). \end{aligned}\)
Cumulants
According to Lemma (1), we know that (assuming everything exists)
\(\begin{aligned} \varphi_X^{(k)} (0) = i^k\, E\left[ X^k \right]. \end{aligned}\)
Thus, writing the characteristic function as a Taylor series around \(t = 0\),
\(\begin{aligned} \varphi_X (t) = \sum_{k=0}^\infty \frac{\varphi_X^{(k)} (0)}{k!}\, t^k = \sum_{k=0}^\infty E\left[ X^k \right] \frac{(i t)^k}{k!}. \end{aligned}\)
Hence, the characteristic function “generates” the moments (or rather \(i^k E[X^k]\)).
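Numerically, the moments can be recovered from finite-difference derivatives of \(\varphi_X\) at \(t = 0\) (my sketch, using the standard normal CF \(e^{-t^2/2}\); from \(\varphi_X^{(k)}(0) = i^k E[X^k]\) we expect \(E[X] = 0\) and \(E[X^2] = -\varphi_X''(0) = 1\)):

```python
import numpy as np

# Recover moments of X ~ N(0,1) from derivatives of phi(t) = exp(-t^2/2)
# at t = 0 via central finite differences: phi^(k)(0) = i^k E[X^k].
phi = lambda t: np.exp(-t**2 / 2)
h = 1e-3

d1 = (phi(h) - phi(-h)) / (2 * h)            # phi'(0)  = i * E[X]
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2  # phi''(0) = i^2 E[X^2]

print(d1 / 1j)   # E[X]   ~ 0
print(-d2)       # E[X^2] ~ 1
```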
Similarly, we can expand \(\log \varphi_X(t)\) in a Taylor series:
\(\begin{aligned}
\log \left(\varphi_X(t)\right) = \sum_{n=1}^\infty \kappa_n \frac{(i t)^n}{n!}
\end{aligned}\)
(the \(n = 0\) term vanishes, since \(\log \varphi_X (0) = \log 1 = 0\)).
The expansion coefficients \(\kappa_n\) are called the cumulants.
Why cumulants?
Because they are often easier to handle than moments. Consider the independent random variables \(X\), \(Y\) and set \(S = X + Y\). Then \(\begin{aligned} \varphi_S (t) &= \varphi_X (t) \varphi_Y (t)\\ \Rightarrow~ \log\left( \varphi_S (t) \right) &= \log\left( \varphi_X (t) \right) + \log\left( \varphi_Y (t) \right)\\ \Rightarrow~ \kappa_n (S) &= \kappa_n (X) + \kappa_n (Y) \end{aligned}\)
Cumulants and moments are related by the Bell polynomials.
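Since the first two cumulants are the mean and the variance, \(\kappa_n(S) = \kappa_n(X) + \kappa_n(Y)\) says in particular that means and variances of independent variables add. A quick Monte Carlo illustration (the normal parameters are arbitrary choices of mine):

```python
import numpy as np

# Additivity of the first two cumulants for independent X, Y.
rng = np.random.default_rng(7)
x = rng.normal(1.0, 2.0, 1_000_000)    # kappa_1 = 1,  kappa_2 = 4
y = rng.normal(-3.0, 1.5, 1_000_000)   # kappa_1 = -3, kappa_2 = 2.25
s = x + y

print(s.mean())  # ~ 1 + (-3)  = -2
print(s.var())   # ~ 4 + 2.25  = 6.25
```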