Stochastic Process
Summary of SEEM 5580, Fall 2024.
This is my summary of the course SEEM 5580, taught by Professor Xuefeng Gao. The reference book is Stochastic Processes by Sheldon M. Ross (1995), and problem solutions can be found at http://www.charmpeach.com/?s=stochastic.
Basic Probability Theory
Distribution | PMF / PDF | MGF | Mean | Variance
---|---|---|---|---
Poisson | \( e^{-\lambda} \frac{\lambda^x}{x!} \) | \( \exp(\lambda (e^t - 1)) \) | \( \lambda \) | \( \lambda \)
Geometric | \( pq^{n-1} \) | \( \frac{pe^t}{1 - qe^t} \) | \( 1/p \) | \( q/p^2 \)
Exponential | \( \lambda e^{-\lambda x} \) | \( \frac{\lambda}{\lambda - t} \) | \( 1/\lambda \) | \( 1/\lambda^2 \)
Normal | \( \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) \) | \( \exp(\mu t + \sigma^2 t^2 / 2) \) | \( \mu \) | \( \sigma^2 \)
Where the moment generating function is defined as:
\[\psi(t) = E[e^{tX}] = \int e^{tx} \, dF(x)\]
Probability Inequalities - Chernoff Bounds
\[P(X \geq a) \leq e^{-ta} \psi(t), \ \forall t > 0; \quad P(X \leq a) \leq e^{-ta} \psi(t), \ \forall t < 0\]
Optimizing over \( t \) yields the tightest bound.
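For a Poisson tail the optimal \( t \) has a closed form, which makes a quick numerical check easy. A minimal Python sketch (the rate \( \lambda = 4 \) and threshold \( a = 10 \) are arbitrary choices):

```python
import math

lam, a = 4.0, 10  # assumed Poisson rate and threshold, with a > lam

# Chernoff bound: P(X >= a) <= min_{t>0} exp(-t*a + lam*(e^t - 1)).
# Setting the derivative to zero gives the minimizer t* = log(a / lam).
t_star = math.log(a / lam)
bound = math.exp(-t_star * a + lam * (math.exp(t_star) - 1))

# Exact tail: 1 minus the PMF summed up to a - 1.
exact = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(a))

print(f"Chernoff bound: {bound:.4e}, exact tail: {exact:.4e}")
```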
Poisson Process
A counting process \( \{N(t), t \geq 0\} \) is a Poisson process with rate \( \lambda \) if:
- \( N(0) = 0 \)
- \( P\{N(t+s) - N(s) = n\} = e^{-\lambda t} \frac{(\lambda t)^n}{n!}, \quad n = 0, 1, \ldots \)
- Independent increments
Order statistics of the uniform distribution: given \( N(t) = n \), the conditional joint density of the arrival times \( S_1, \ldots, S_n \) is \( f(t_1, \ldots, t_n) = \frac{n!}{t^n} \) for \( 0 < t_1 < \cdots < t_n < t \); that is, the arrival times are distributed as the order statistics of \( n \) i.i.d. uniforms on \( (0, t) \).
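This property gives a convenient way to simulate a Poisson process on a fixed horizon: draw \( N(t) \), then sort uniforms. A minimal sketch (rate and horizon are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 10.0  # assumed rate and time horizon

# Draw N(t), then place the arrival times as sorted uniforms on (0, t).
n = rng.poisson(lam * t)
arrivals = np.sort(rng.uniform(0.0, t, size=n))
print(n, arrivals[:5])
```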
Suppose each event occurring at time \( s \) is classified as type 1 with probability \( P(s) \) and type 2 otherwise. Let \( N_1(t), N_2(t) \) be the numbers of type-1 and type-2 events by time \( t \); then they are independent Poisson random variables with means \( \lambda t p \) and \( \lambda t (1-p) \), where \( p = \frac{1}{t} \int_0^t P(s) \, ds \).
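A Monte Carlo sanity check of this result, with the hypothetical classification rule \( P(s) = s/t \) (so \( p = 1/2 \)): the two counts should have mean \( \lambda t / 2 \) each and near-zero covariance.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, reps = 3.0, 4.0, 20000

n1, n2 = np.empty(reps), np.empty(reps)
for r in range(reps):
    n = rng.poisson(lam * t)
    s = rng.uniform(0, t, size=n)        # event times, given N(t) = n
    type1 = rng.uniform(size=n) < s / t  # P(s) = s/t, hence p = 1/2
    n1[r], n2[r] = type1.sum(), n - type1.sum()

# Means should both be lam * t * p = 6; covariance should be near 0.
print(n1.mean(), n2.mean(), np.cov(n1, n2)[0, 1])
```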
Nonhomogeneous Poisson Process
Independent, nonstationary increments. Mean value function:
\[m(t) = \int_0^t \lambda(s) \, ds\]
\[P\{N(t+s) - N(t) = n\} = \exp\left\{-\int_t^{t+s} \lambda(t') \, dt'\right\} \frac{\left(\int_t^{t+s} \lambda(t') \, dt'\right)^n}{n!}, \quad n \geq 0\]
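A standard way to simulate such a process is thinning: generate a homogeneous process at a rate \( \lambda_{\max} \geq \lambda(s) \) and keep an event at time \( s \) with probability \( \lambda(s)/\lambda_{\max} \). A sketch with the hypothetical intensity \( \lambda(s) = 1 + \sin(s) \):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 10.0
lam_max = 2.0                        # upper bound on lambda(s)
intensity = lambda s: 1 + np.sin(s)  # hypothetical lambda(s) <= lam_max

# Homogeneous candidates at rate lam_max, then thin each independently.
n = rng.poisson(lam_max * T)
cand = np.sort(rng.uniform(0, T, size=n))
keep = rng.uniform(size=n) < intensity(cand) / lam_max
events = cand[keep]
print(len(events), events[:5])
```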
Compound Poisson Process
Independent, stationary increments.
Compound Poisson r.v.: \( W = \sum_{i=1}^N X_i \), where \( N \sim \text{Poisson}(\lambda) \) is independent of the i.i.d. \( X_i \). Then \( E[W] = \lambda E[X] \), \( \operatorname{Var}(W) = \lambda E[X^2] \), and \( \psi_W(\theta) = \exp(\lambda(\psi_X(\theta) - 1)) \).
Definition: \( X(t) := \sum_{i=1}^{N(t)} X_i \), where the \( X_i \) are i.i.d. and independent of the Poisson process \( N \).
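A quick simulation of \( X(t) \), with hypothetical exponential jump sizes, checking \( E[X(t)] = \lambda t E[X] \) and \( \operatorname{Var}(X(t)) = \lambda t E[X^2] \):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, mean_jump, reps = 2.0, 5.0, 1.5, 50000  # assumed X_i ~ Exp(mean 1.5)

samples = np.array([rng.exponential(mean_jump, rng.poisson(lam * t)).sum()
                    for _ in range(reps)])

# Theory: E = lam*t*E[X] = 15; Var = lam*t*E[X^2] = lam*t*2*mean_jump^2 = 45.
print(samples.mean(), samples.var())
```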
Conditional Poisson Process
Stationary but not independent increments.
Definition: \( \tilde{N}(t) := N(t \Lambda) \) where \( N \) is a Poisson process with rate 1 and \( \Lambda \) is a positive r.v. with distribution \( G \).
\[P\{\Lambda \leq x \mid \tilde{N}(t) = n\} = \frac{\int_0^x e^{-\lambda t} (\lambda t)^n \, dG(\lambda)}{\int_0^\infty e^{-\lambda t} (\lambda t)^n \, dG(\lambda)}\]
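For a discrete prior \( G \) the integrals reduce to sums over the support. A sketch with a hypothetical two-point prior on \( \Lambda \):

```python
import math

# Hypothetical prior: Lambda = 1 w.p. 0.5, Lambda = 3 w.p. 0.5.
lams, probs = [1.0, 3.0], [0.5, 0.5]
t, n = 2.0, 5  # observed N~(t) = n

# Posterior weight of each lambda is proportional to g(lam) e^{-lam t} (lam t)^n.
w = [g * math.exp(-l * t) * (l * t) ** n for l, g in zip(lams, probs)]
post = [x / sum(w) for x in w]
print(post)  # P(Lambda = 1 | data), P(Lambda = 3 | data)
```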
Discrete Time Markov Chain
\( P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) := P_{ij} \).
Chapman-Kolmogorov equations:
\[P_{ij}^{n+m} = \sum_{k=0}^\infty P_{ik}^n P_{kj}^m\]
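For a finite state space this is just matrix multiplication, \( P^{(n+m)} = P^{(n)} P^{(m)} \). A sketch with an arbitrary two-state transition matrix:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # arbitrary transition matrix

n, m = 3, 5
lhs = np.linalg.matrix_power(P, n + m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)
print(np.allclose(lhs, rhs))  # True: Chapman-Kolmogorov
```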
- \( i \leftrightarrow j \): communicate, same class; irreducible: only one class.
- Period: greatest common divisor of \( \{n : P_{ii}^n > 0\} \); aperiodic: \( d(i) = 1 \).
- Recurrent: \( f_{jj} = 1 \) \( \left(\Leftrightarrow \sum_{n=1}^\infty P_{jj}^n = \infty\right) \), where \( f_{jj} \) denotes the probability of ever returning to state \( j \).
- Transient: otherwise.
Define the expected number of transitions needed to return to state \( j \):
\[\mu_{jj} = \begin{cases} \infty & \text{if } j \text{ is transient} \\ \sum_{n=1}^\infty n f_{jj}^n & \text{if } j \text{ is recurrent} \end{cases}\]
(Limiting distribution) If \( i \leftrightarrow j \):
- If \( j \) is aperiodic, then \( \lim_{n \to \infty} P_{ij}^n = \frac{1}{\mu_{jj}} \).
- If \( j \) has period \( d \), then \( \lim_{n \to \infty} P_{jj}^{nd} = \frac{d}{\mu_{jj}} \).
- Positive recurrent: \( \mu_{jj} < \infty \)
- Null recurrent: \( \mu_{jj} = \infty \)
Class properties (shared by all states in a class): positive recurrence, null recurrence, recurrence, and period.
Stationary distribution: \( \pi_j \geq 0 \), \( \sum_j \pi_j = 1 \), and
\[\pi_j = \sum_{i=0}^\infty \pi_i P_{ij}, \quad j \geq 0\]
(Stationary distribution) For an irreducible aperiodic Markov chain:
- Either all states are transient or all are null recurrent; in this case, \( P_{ij}^n \to 0 \) as \( n \to \infty \) and no stationary distribution exists.
- Or else all states are positive recurrent (ergodic); in this case, \( \pi_j = \lim_{n \to \infty} P_{ij}^n \) (independent of \( i \)) is the unique stationary distribution.
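Numerically, the stationary distribution of a finite irreducible chain can be obtained by solving \( \pi P = \pi \) together with \( \sum_j \pi_j = 1 \). A sketch (the two-state matrix is arbitrary); the rows of \( P^n \) converging to \( \pi \) illustrates the limiting-distribution theorem above:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # arbitrary irreducible aperiodic chain
k = P.shape[0]

# Solve pi (P - I) = 0 with sum(pi) = 1: replace one redundant balance
# equation by the normalization constraint, then solve the linear system.
A = np.vstack([(P.T - np.eye(k))[:-1], np.ones(k)])
b = np.zeros(k); b[-1] = 1.0
pi = np.linalg.solve(A, b)

print(pi)                                 # stationary distribution (0.8, 0.2)
print(np.linalg.matrix_power(P, 100)[0])  # rows of P^n converge to pi
```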
Continuous Time Markov Chain (CTMC)
We say \( \{X(t), t \geq 0\} \) is a CTMC if for all \( s, t \geq 0 \):
\[P(X(t+s) = j \mid X(s) = i, X(u) = x(u), 0 \leq u < s) = P(X(t+s) = j \mid X(s) = i)\]
The time \( \tau_i \) that the process spends in state \( i \) before making a transition is exponentially distributed.
Construction of CTMC:
- \( v_i \) as the rate of exponential distribution of \( \tau_i \).
- When the process leaves state \( i \), it enters \( j \) with probability \( P_{ij} \), where \( \sum_{j \neq i} P_{ij} = 1 \).
Define:
\[q_{ij} = \begin{cases} v_i P_{ij} & i \neq j \\ -v_i & j = i \end{cases}\]
We call \( q_{ij} \) the transition rate from \( i \) to \( j \). The matrix \( Q = (q_{ij}) \) is called the generator matrix.
Let \( P_{ij}(t) = P(X(t+s) = j \mid X(s) = i) \).
A CTMC with states \( 0, 1, \ldots \) for which \( q_{ij} = 0 \) whenever \( |i - j| > 1 \) is called a birth and death process. Let \( \lambda_i = q_{i,i+1} \) and \( \mu_i = q_{i,i-1} \) be the birth and death rates, respectively. Then \( v_i = \lambda_i + \mu_i \) and \( P_{i,i+1} = \frac{\lambda_i}{\lambda_i + \mu_i} = 1 - P_{i,i-1} \).
Kolmogorov’s Backward Equations
\[P'(t) = Q P(t)\]
Kolmogorov’s Forward Equations
Under suitable regularity conditions:
\[P'(t) = P(t) Q\]
\( \pi = (\pi_i)_{i \in S} \) is a stationary distribution for the CTMC if:
\[0 = \pi Q, \quad \pi_i \geq 0, \quad \sum_{i \in S} \pi_i = 1\]
M/M/1 Queue
Define \( \rho = \lambda / \mu \), called the utilization or traffic intensity. Then \( \pi_n = \rho^n \pi_0 \), and for \( \rho < 1 \) normalization gives \( \pi_n = (1 - \rho) \rho^n \).
Consider an irreducible CTMC. If a stationary distribution \( \pi \) exists, then it is unique and \( \lim_{t \to \infty} P_{ij}(t) = \pi_j \) for all \( i \in S \).
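A sketch that builds the generator of an M/M/1 queue truncated at a finite buffer \( K \) (a hypothetical simplification so that \( Q \) is a finite matrix), solves \( \pi Q = 0 \), and compares with \( (1-\rho)\rho^n \):

```python
import numpy as np

lam, mu, K = 1.0, 2.0, 60  # arrival rate, service rate, truncation level
rho = lam / mu

# Birth-death generator: q_{i,i+1} = lam, q_{i,i-1} = mu, rows sum to 0.
Q = np.zeros((K + 1, K + 1))
for i in range(K):
    Q[i, i + 1] = lam
    Q[i + 1, i] = mu
np.fill_diagonal(Q, -Q.sum(axis=1))

# Solve pi Q = 0 with sum(pi) = 1 (drop one redundant equation).
A = np.vstack([Q.T[:-1], np.ones(K + 1)])
b = np.zeros(K + 1); b[-1] = 1.0
pi = np.linalg.solve(A, b)

print(pi[:4])                                  # numerical pi_n
print([(1 - rho) * rho**n for n in range(4)])  # (1 - rho) rho^n
```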
Martingale
A stochastic process \( \{Z_n, n = 0, 1, \ldots\} \) is a martingale if \( \mathbb{E}[|Z_n|] < \infty \) and \( \mathbb{E}[Z_{n+1} \mid Z_1, \ldots, Z_n] = Z_n \) for all \( n \).
Stopping Time
The positive integer-valued random variable \( N \) is a random time for the process \( \{Z_n, n \geq 1\} \) if the event \( \{N = n\} \) is determined by \( Z_1, \ldots, Z_n \). If \( P(N < \infty) = 1 \), then \( N \) is a stopping time.
Doob’s Stopping Theorem / Sampling Theorem
If either:
- the stopped process \( \bar{Z}_n := Z_{\min(n, N)} \) is uniformly bounded, or
- \( N \) is bounded, or
- \( \mathbb{E}[N] < \infty \), and there is an \( M < \infty \) such that \( \mathbb{E}[|Z_{n+1} - Z_n| \mid Z_1, \ldots, Z_n] < M \),
then:
\[E[Z_N] = E[Z_1]\]
Wald’s Equation
If the \( X_i \) are i.i.d. with \( E[|X|] < \infty \) and \( N \) is a stopping time for the \( X_i \) with \( E[N] < \infty \), then:
\[E\left[\sum_{i=1}^N X_i\right] = E[N] \, E[X]\]
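A Monte Carlo check of Wald’s equation, with hypothetical \( X_i \sim \text{Exp}(1) \) and stopping time \( N = \min\{n : X_1 + \cdots + X_n > 5\} \); since \( E[X] = 1 \), the estimates of \( E[S_N] \) and \( E[N] \) should agree:

```python
import numpy as np

rng = np.random.default_rng(4)
level, reps = 5.0, 20000

sums, ns = np.empty(reps), np.empty(reps)
for r in range(reps):
    total, n = 0.0, 0
    while total <= level:       # N is a stopping time: depends only on X_1..X_n
        total += rng.exponential(1.0)
        n += 1
    sums[r], ns[r] = total, n

# Wald: E[S_N] = E[N] E[X] = E[N] * 1.
print(sums.mean(), ns.mean())   # the two should nearly agree (both ~ 6)
```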
Martingale Convergence Theorem
If \( \{Z_n, n \geq 1\} \) is a martingale such that for some \( M < \infty \):
\[E[|Z_n|] \leq M, \quad \forall n,\]
then, with probability 1, \( \lim_{n \to \infty} Z_n \) exists and is finite.
How to Compute a Value Related to Some Event
- Identify the underlying random variables and express the event of interest as a stopping time.
- Define \( \{Y_n, n = 1, 2, \ldots\} \) with a known value at the stopping time.
- Check whether \( \{Y_n, n = 1, 2, \ldots\} \) is a martingale.
- Apply Doob’s stopping theorem and solve for the quantity of interest (see the sketch below).
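Applying this recipe to the gambler’s ruin problem: the symmetric random walk \( S_n \) is a martingale with known values at the exit time of \( (-a, b) \), and Doob’s theorem gives \( P(\text{hit } b) = a/(a+b) \). A simulation sketch (the boundaries \( a = 3, b = 7 \) are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, reps = 3, 7, 20000   # stop when the walk hits -a or +b

hits_b = 0
for _ in range(reps):
    s = 0
    while -a < s < b:              # N = first exit time, a stopping time
        s += rng.choice([-1, 1])   # symmetric +-1 step, a martingale
    hits_b += (s == b)

# Doob: E[S_N] = E[S_1] = 0 = b*P(hit b) - a*(1 - P(hit b))
# => P(hit b) = a / (a + b).
print(hits_b / reps, a / (a + b))
```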
Brownian Motion
\( \{X(t), t \geq 0\} \) is a Brownian motion with variance parameter \( c^2 \) if:
- \( X(0) = 0 \)
- Stationary and independent increments
- \( X(t) \sim N(0, c^2 t) \)
Hitting time: \( T_a := \inf\{t : X(t) \geq a\} \). By continuity of Brownian paths, \( \{T_a \leq t\} = \{\max_{0 \leq s \leq t} X(s) \geq a\} \), and the reflection principle gives:
\[P(T_a \leq t) = 2 P(X(t) \geq a)\]
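A discretized check of this identity for standard Brownian motion (\( c = 1 \)); the step size and parameters are arbitrary choices, and discretization slightly underestimates the hitting probability:

```python
import math
import numpy as np

rng = np.random.default_rng(6)
a, t, steps, reps = 1.0, 1.0, 1000, 5000
dt = t / steps

# Simulate BM paths on [0, t]; a path "hits" a if its running max >= a.
increments = rng.normal(0.0, math.sqrt(dt), size=(reps, steps))
hit = (np.cumsum(increments, axis=1).max(axis=1) >= a).mean()

# Reflection principle: P(T_a <= t) = 2 P(X(t) >= a) = erfc(a / sqrt(2t)).
theory = math.erfc(a / math.sqrt(2 * t))
print(hit, theory)
```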