# Stochastic Processes

Part of a series: Stochastic fundamentals.


## Definition

Let \((\Omega, \Sigma, P)\) be a probability space and \(T\) an index set. A real-valued stochastic process is a family of random variables, i.e., a mapping

\(\begin{aligned} X: \Omega \times T &\longrightarrow \mathbb{R}\\ (\omega, t) &\longmapsto X_t(\omega) \end{aligned}\)

## Characterisation and Remarks

The index \(t\) is commonly interpreted as time, so that \(X_t\) represents a stochastic time evolution. We typically restrict ourselves to two cases:

- \(T = \mathbb{N}_0\): time-discrete stochastic process
- \(T = \mathbb{R}_+\): time-continuous stochastic process

As in the case of random variables we distinguish between:

- __value-discrete__ stochastic processes: \(X_t\) assumes values in a discrete set (such as \(\mathbb{N}\) or \(\mathbb{Z}\)) only
- __value-continuous__ stochastic processes: \(X_t\) assumes values in a continuous set, typically \(\mathbb{R}\)

We also characterise a process by whether its moments are finite. A stochastic process is called

\(\begin{aligned} \text{integrable if} \; E(|X_t|) &< \infty & \forall t \in T \\ \text{square-integrable if} \; E(|X_t|^2) &< \infty & \forall t \in T \\ \text{centered if} \; E(X_t) &= 0 & \forall t \in T \end{aligned}\)

For the physical interpretation as a stochastic time evolution we furthermore define a *sample path* (also called *sample function*, *trajectory*, …) as the mapping \(X(\omega, \cdot): T \longrightarrow \mathbb{R}, \; t \longmapsto X_t(\omega)\) for every fixed \(\omega \in \Omega\).

We can then further characterise a stochastic process by the properties of its sample paths, e.g., whether they are continuous, differentiable, … It is often helpful to look at the increments

\(\begin{aligned}
Z_{t_1, t_0} &= X_{t_1} - X_{t_0} & \text{for} \; t_1 > t_0 \in T
\end{aligned}\)

We can then distinguish whether the \(Z\) are independent, identically distributed, …

A technically very important property is whether the stochastic process has a long-range memory or not. We will later consider this in detail and focus on processes without long-range memory (“Markov processes”).

**Def.: Martingale**
Let \((X_t)_{t \in \mathbb{N}_0}\) be an integrable time-discrete
stochastic process. This process is called a *martingale* if the
conditional expectation satisfies
\(E(X_{t+1} \mid X_0, X_1, \dots, X_t) = X_t \; \forall t\). Due to the
linearity of the expectation value this implies that the expected value
of gains/losses is zero; we say that the game is fair. Today,
martingales are heavily used in mathematical finance. In addition to
being fair, they have convenient convergence properties.
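The fairness claim follows by taking expectations on both sides of the defining relation (law of total expectation):

\(\begin{aligned} E(X_{t+1} - X_t) &= E\big( E(X_{t+1} \mid X_0, \dots, X_t) - X_t \big) = E(X_t - X_t) = 0, \end{aligned}\)

so the expected one-step gain vanishes at every time \(t\).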

## Example: Discrete Random Walk in one Dimension

Consider a random walker starting at \(x_0=0\). In each discrete time step the walker goes one step to the right or left with probability 1/2. Mathematical formulation:

Value-discrete, time-discrete stochastic process \((X_t)_{t \in \mathbb{N}_0}\)

A sample path might look like ??todo insert figure
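In lieu of the figure, a sample path can be generated directly. A minimal Python sketch (the function name is my own choice, not from the text):

```python
import random

def random_walk(t_max, x0=0, seed=None):
    """One sample path of the simple symmetric random walk.

    Each step goes +1 or -1 with probability 1/2, so the path is the
    running sum X_t = x0 + Z_1 + ... + Z_t.
    """
    rng = random.Random(seed)
    path = [x0]
    for _ in range(t_max):
        path.append(path[-1] + rng.choice((+1, -1)))
    return path

print(random_walk(10, seed=42))
```

Printing several paths with different seeds gives a feel for the typical jagged trajectories.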

We can conveniently describe the stochastic process via the increments

\(\begin{aligned} Z_t &:= X_t - X_{t-1}, & t = 1, 2, 3, \dots \end{aligned}\)

The \(Z_t\) are i.i.d. random variables with

\(\begin{aligned} Z_t &= \left\{ \begin{array}{ll} +1 & \text{with prob. } 1/2 \\ -1 & \text{with prob. } 1/2 \end{array}\right. \end{aligned}\)

Then

\(\begin{aligned} X_t &= X_0 + \sum\limits_{s=1}^{t} Z_s \end{aligned}\)

such that

\(\begin{aligned} E(X_t) &= E(X_0) + \sum\limits_{s=1}^{t} E(Z_s) = 0\\ E(X_t^2) &= \sum\limits_{s,r=1}^{t} E(Z_s Z_r) = \sum\limits_{s=1}^{t} E(Z_s^2) = t \end{aligned}\)

Hence the random walk is square-integrable for all finite \(t\).
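These two moments are easy to check by Monte Carlo. A small sketch, assuming \(X_0 = 0\) as above (function and variable names are illustrative):

```python
import random

def estimate_moments(t, n_samples, seed=0):
    """Monte Carlo estimates of E(X_t) and E(X_t^2) for the symmetric walk with X_0 = 0."""
    rng = random.Random(seed)
    total = total_sq = 0.0
    for _ in range(n_samples):
        x = sum(rng.choice((+1, -1)) for _ in range(t))  # one realisation of X_t
        total += x
        total_sq += x * x
    return total / n_samples, total_sq / n_samples

mean, second_moment = estimate_moments(t=50, n_samples=20000)
print(mean, second_moment)  # should be close to 0 and 50
```

The estimates fluctuate around the exact values \(E(X_t) = 0\) and \(E(X_t^2) = t\), with the usual \(1/\sqrt{n}\) Monte Carlo error.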

The random walk is also a martingale:

\(\begin{aligned} E(X_{t+1} \mid X_0, \dots, X_t) &= E(X_t + Z_{t+1} \mid X_0, \dots, X_t) \\ &= \underbrace{E(X_t \mid X_0, \dots, X_t)}_{=X_t} + \underbrace{E(Z_{t+1} \mid X_0, \dots, X_t)}_{=0}\\ &= X_t \end{aligned}\)
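The martingale property can also be seen empirically: simulate many walks, group them by the value of \(X_t\), and average \(X_{t+1}\) within each group. A hedged sketch (names and thresholds are my own):

```python
import random
from collections import defaultdict

def conditional_next_mean(t, n_samples, seed=1, min_count=500):
    """Estimate E(X_{t+1} | X_t = x) by grouping simulated walks on X_t."""
    rng = random.Random(seed)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for _ in range(n_samples):
        x_t = sum(rng.choice((+1, -1)) for _ in range(t))  # X_t with X_0 = 0
        sums[x_t] += x_t + rng.choice((+1, -1))            # one more step: X_{t+1}
        counts[x_t] += 1
    # report only values of x seen often enough for a stable average
    return {x: sums[x] / counts[x] for x in sums if counts[x] >= min_count}

print(conditional_next_mean(t=6, n_samples=50000))
```

Each estimated conditional mean lands near \(x\) itself, as the martingale property predicts.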

For this simple stochastic process we can even write down the entire *probability mass function* for all \(t\):

\(\begin{aligned} P(X_t = 2k - t) = \left\{\begin{array}{ll} \frac{1}{2^t} \binom{t}{k} & \text{for } k = 0, 1, \dots, t\\ 0 & \text{else} \end{array}\right. \end{aligned}\)
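This PMF is straightforward to tabulate and sanity-check in code, e.g. that it sums to one and reproduces the moments computed above (a small sketch; the function name is illustrative):

```python
from math import comb

def pmf(t):
    """PMF of X_t for the walk started at 0: P(X_t = 2k - t) = C(t, k) / 2**t."""
    return {2 * k - t: comb(t, k) / 2 ** t for k in range(t + 1)}

p = pmf(4)
print(p)  # support {-4, -2, 0, 2, 4}
```

For \(t = 4\) the probabilities are \(1/16, 4/16, 6/16, 4/16, 1/16\), consistent with \(E(X_4) = 0\) and \(E(X_4^2) = 4\).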

The random walk is a Markov process since \(X_t\) is determined by \(X_{t-1}\) and \(Z_t\) and there is no further dependence on \(X_{t-2}, X_{t-3}, \dots\) (no long-range memory).