Notes on Stochastic Calculus

Here I put some notes on stochastic calculus for my future use. These cover some important (but basic) properties and proofs about topics like moment generating functions, martingales, and Brownian motion that we should know.

MGF of Normal Distribution

For $X \sim N(\mu, \sigma^2)$, the moment generating function is $MGF(\theta) = E[e^{\theta X}] = \exp(\theta\mu + \theta^2\sigma^2 / 2)$, and the moments follow from $E[X^k] = MGF^{(k)}(0)$.
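
A quick numerical check of the moment identity, differentiating the MGF by central finite differences (a stdlib-only sketch; the step size $h$ and the values of $\mu$, $\sigma$ are arbitrary choices):

```python
import math

mu, sigma = 1.0, 2.0

def mgf(theta):
    # MGF of N(mu, sigma^2)
    return math.exp(theta * mu + theta**2 * sigma**2 / 2)

h = 1e-3
# Central finite differences for the first and second derivative at theta = 0.
mgf_d1 = (mgf(h) - mgf(-h)) / (2 * h)            # approximates E[X] = mu = 1
mgf_d2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # approximates E[X^2] = mu^2 + sigma^2 = 5

print(mgf_d1, mgf_d2)
```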

Truncated Normal Distribution

For a two-sided truncation $(a, b)$ of $N(\mu, \sigma^2)$: $$E[X \mid a < X < b] = \mu + \sigma \frac{\phi(\alpha) - \phi(\beta)}{\Phi(\beta) - \Phi(\alpha)}$$ where $\alpha = (a - \mu) / \sigma$, $\beta = (b - \mu)/\sigma$, and $\phi$, $\Phi$ denote the standard normal PDF and CDF.
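
The truncated-mean formula can be sanity-checked by rejection sampling (a stdlib-only Monte Carlo sketch; $\mu$, $\sigma$, $a$, $b$ are arbitrary example values):

```python
import math
import random

mu, sigma, a, b = 0.0, 1.0, -1.0, 2.0

def phi(x):  # standard normal PDF
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):  # standard normal CDF
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

alpha, beta = (a - mu) / sigma, (b - mu) / sigma
formula_mean = mu + sigma * (phi(alpha) - phi(beta)) / (Phi(beta) - Phi(alpha))

random.seed(0)
# Rejection sampling: draw from N(mu, sigma^2), keep only draws inside (a, b).
kept = [x for x in (random.gauss(mu, sigma) for _ in range(200_000)) if a < x < b]
mc_mean = sum(kept) / len(kept)

print(formula_mean, mc_mean)  # the two means should agree up to Monte Carlo noise
```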

Doob’s Identity

Let $X$ be a martingale and $T$ a stopping time; then $E[X_{T \wedge n}] = E[X_0]$ for any $n$.

Martingale Transform

Define $(Z \cdot X)_n := \sum_{i=1}^{n} Z_i(X_i - X_{i-1})$, where $X$ is a martingale with $X_0 = 0$ and $(Z_n)$ is predictable and bounded; then $(Z \cdot X)$ is a martingale.

If $X$ is a sub-martingale (the conditional expectation of the next value given all prior values is greater than or equal to the current value), then so is $(Z \cdot X)$.

If $Z_i \in [0, 1]$, then $E[(Z \cdot X)_n] \leq E[X_n]$.
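
The transform can be simulated directly. Below is a stdlib-only sketch with a symmetric $\pm 1$ walk and the (arbitrarily chosen) predictable strategy $Z_i = \mathbf{1}\{X_{i-1} \geq 0\}$; since $(Z \cdot X)$ is a martingale started at $0$, its mean should stay near $0$:

```python
import random

random.seed(1)
n, paths = 20, 100_000

total = 0.0
for _ in range(paths):
    x, zx = 0, 0.0
    for _ in range(n):
        z = 1.0 if x >= 0 else 0.0   # depends only on the walk so far: predictable
        step = random.choice((-1, 1))
        x += step
        zx += z * step               # accumulate Z_i * (X_i - X_{i-1})
    total += zx
mean_zx = total / paths

print(mean_zx)  # should be close to 0
```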

Common Martingales and Their Proofs

  • $S_n := \sum_{i=1}^{n} X_i$ for i.i.d. steps with $P(X_i = \sigma) = P(X_i = -\sigma) = 1/2$, so $E[X_i] = 0$ and $Var(X_i) = \sigma^2$. (Symmetric Random Walk)

  • $S_n^2 - n\sigma^2$ for symmetric random walk $S_n$.

Proof: $$ \begin{align} E[S_{n+1}^2 - (n+1)\sigma^2 \mid \mathcal{F}_n] &= E[S_{n+1}^2 \mid \mathcal{F}_n] - (n+1)\sigma^2\\
&= \frac{1}{2}(S_n + \sigma)^2 + \frac{1}{2}(S_n - \sigma)^2 - (n+1)\sigma^2\\
&= S_n^2 + \sigma^2 - (n+1)\sigma^2\\
&= S_n^2 - n\sigma^2 \end{align} $$
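
The identity $E[S_n^2] = n\sigma^2$ can be sanity-checked by simulation (a stdlib-only sketch; $n$, $\sigma$, and the path count are arbitrary choices):

```python
import random

random.seed(2)
n, sigma, paths = 25, 2.0, 100_000

total_sq = 0.0
for _ in range(paths):
    s = 0.0
    for _ in range(n):
        s += random.choice((-sigma, sigma))  # symmetric +-sigma step
    total_sq += s * s
mean_sq = total_sq / paths

print(mean_sq, n * sigma**2)  # both should be close to 100
```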

  • $(q/p)^{S_n}$ for the asymmetric random walk $S_n$ with $P(X_i = 1) = p$, $P(X_i = -1) = q = 1 - p$.

Proof: $$ \begin{align} E[(q/p)^{S_{n+1}} \mid \mathcal{F}_n] &= p \cdot (q/p)^{S_{n}+1} + q \cdot (q/p)^{S_{n}-1}\\
&= (q/p)^{S_n}\left(p \cdot \frac{q}{p} + q \cdot \frac{p}{q}\right)\\
&= (q + p) \cdot (q/p)^{S_{n}}\\
&= (q/p)^{S_{n}} \end{align} $$

  • $S_n - n(p - q)$ for the asymmetric random walk $S_n$.

Proof: $$ \begin{align} E[S_{n+1} - (n+1)(p-q) \mid \mathcal{F}_n] &= E[S_{n+1} \mid \mathcal{F}_n] - (n+1)(p-q)\\
&= p \cdot (S_n + 1) + q \cdot (S_n - 1) - (n+1)(p-q)\\
&= S_n + (p - q) - (n+1)(p-q)\\
&= S_n - n(p-q) \end{align} $$
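
Both asymmetric-walk martingales can be checked at once by simulation: $E[(q/p)^{S_n}]$ should stay at $1$ and $E[S_n - n(p-q)]$ at $0$ (a stdlib-only sketch; $p$, $n$, and the path count are arbitrary choices):

```python
import random

random.seed(3)
p, n, paths = 0.6, 20, 200_000
q, r = 1 - p, (1 - p) / p   # r = q/p

sum_ratio, sum_centered = 0.0, 0.0
for _ in range(paths):
    s = 0
    for _ in range(n):
        s += 1 if random.random() < p else -1  # +1 w.p. p, -1 w.p. q
    sum_ratio += r**s
    sum_centered += s - n * (p - q)

print(sum_ratio / paths, sum_centered / paths)  # should be close to 1 and 0
```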

  • $B_t^2 - t$ for standard Brownian motion $B_t$.

Proof:

For $s < t$, write $B_t = B_s + (B_t - B_s)$, where the increment $B_t - B_s \sim N(0, t - s)$ is independent of $\mathcal{F}_s$. Then $$E[B_t^2 - t \mid \mathcal{F}_s] = B_s^2 + 2B_s E[B_t - B_s] + E[(B_t - B_s)^2] - t = B_s^2 + (t - s) - t = B_s^2 - s.$$

In particular, since $B_t \sim N(0, t)$, $E[B_t^2] = Var(B_t) + E[B_t]^2 = t$, so $E[B_t^2 - t] = t - t = 0$.
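
The mean-zero property is quick to verify by simulation, drawing $B_t \sim N(0, t)$ directly (a stdlib-only sketch; $t$ and the path count are arbitrary choices):

```python
import random

random.seed(4)
t, paths = 1.0, 100_000
# E[B_t^2] estimated from N(0, t) draws; should be close to t.
second_moment = sum(random.gauss(0.0, t**0.5) ** 2 for _ in range(paths)) / paths

print(second_moment - t)  # should be close to 0
```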

Gambler’s Ruin

  • Symmetric: for a (discrete- or continuous-time) martingale $S$ with $S_0 = 0$, define the stopping time $T = \min\{\tau(-A), \tau(B)\}$. Then $P(S_T = B) = \frac{A}{A + B}$, $P(\tau(B) < \infty) = \lim_{A\rightarrow \infty}P(S_T = B) = 1$, and $E[T] = AB$.
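
Both identities can be checked for the symmetric $\pm 1$ walk by running paths until they hit a barrier (a stdlib-only sketch; $A$, $B$, and the path count are arbitrary choices):

```python
import random

random.seed(5)
A, B, paths = 3, 5, 20_000

hits_B, total_T = 0, 0
for _ in range(paths):
    s, steps = 0, 0
    while -A < s < B:           # run until a barrier is hit
        s += random.choice((-1, 1))
        steps += 1
    hits_B += s == B
    total_T += steps

p_hit_B = hits_B / paths   # theory: A / (A + B) = 3/8 = 0.375
mean_T = total_T / paths   # theory: A * B = 15

print(p_hit_B, mean_T)
```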

Brownian Motion

A continuous stochastic process $W(t)$, $t \geq 0$, is a Brownian motion if:

  • $W(0) = 0$
  • The increments of the process $W(t_1) - W(t_0), W(t_2) - W(t_1), \ldots, W(t_n) - W(t_{n-1})$ are independent, $\forall\, 0 \leq t_0 \leq t_1 \leq \ldots \leq t_n$
  • Each of these increments is normally distributed: $W(t_{i+1}) - W(t_i) \sim N(0, t_{i+1} - t_i)$

Some properties of Brownian motion:

  • $E[W(t)] = 0$ (since $W(t) - W(0) \sim N(0, t)$)
  • $E[W(t)^2] = t$ (since $E[W(t)^2] = Var(W(t)) + E[W(t)]^2 = t + 0 = t$)
  • Martingale property: $E[W(t + s) \mid \mathcal{F}_t] = W(t)$
  • $Cov(W(s), W(t)) = s, \forall\, 0 < s < t$ (write $W(t) = (W(t) - W(s)) + W(s)$, so $Cov(W(s), W(t)) = Cov(W(s), W(t) - W(s)) + Cov(W(s), W(s))$. Because the increments of Brownian motion are independent, $Cov(W(s), W(t) - W(s)) = 0$. So $Cov(W(s), W(t)) = Cov(W(s), W(s)) = Var(W(s)) = s$.)
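
The covariance identity can be checked by building $W(t)$ from independent increments (a stdlib-only sketch; $s$, $t$, and the path count are arbitrary choices):

```python
import random

random.seed(6)
s, t, paths = 0.5, 1.0, 200_000

acc = 0.0
for _ in range(paths):
    w_s = random.gauss(0.0, s**0.5)
    w_t = w_s + random.gauss(0.0, (t - s)**0.5)  # independent increment
    acc += w_s * w_t
cov = acc / paths   # E[W(s)W(t)]; both means are 0, so this estimates the covariance

print(cov)  # should be close to s = 0.5
```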

Some Brownian motion related martingales:

  • $Y(t) = W(t)^2 - t$
  • $Z(t)$ = exp($\lambda W(t) - \frac{1}{2} \lambda^2 t$), where $\lambda$ is any constant and $W(t)$ is a Brownian motion. (exponential martingale)
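
The exponential martingale's constant mean, $E[Z(t)] = Z(0) = 1$, can be checked by drawing $W(t) \sim N(0, t)$ (a stdlib-only sketch; $\lambda$, $t$, and the path count are arbitrary choices):

```python
import math
import random

random.seed(7)
lam, t, paths = 1.0, 1.0, 200_000

# Average exp(lam * W(t) - lam^2 * t / 2) over N(0, t) draws.
acc = sum(math.exp(lam * random.gauss(0.0, t**0.5) - lam**2 * t / 2)
          for _ in range(paths))
mean_exp = acc / paths

print(mean_exp)  # should be close to 1
```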

Reflection Principle

  • Version 1: For a Brownian motion $B$ and the stopping time $T = \tau(a)$, define $B^{\star}$ by $B_{t}^{\star} = B_t$ for all $t \leq T$ and $B_{t}^{\star} = 2a - B_t$ for all $t > T$; then $B^{\star}$ is also a Brownian motion.

  • Version 2: Consider a random walk starting at $a$, $S_0 = a$, and reaching $b$ in $n$ steps: $S_n = b$. Denote by $N_n(a, b)$ the number of possible paths from $(0, a)$ to $(n, b)$, and by $N_{n}^0(a, b)$ the number of possible paths from $(0, a)$ to $(n, b)$ that touch $0$, i.e. $S_k = 0$ for some $0 < k < n$. The reflection principle says that if $a, b > 0$, then $N_{n}^{0}(a, b) = N_n(-a, b)$.

First Passage Time $T := \tau(a)$

  • CDF: $P(T\leq t) = 2P(B_t > a) = 2\Phi(-a/\sqrt t).$
  • PDF: differentiate the CDF: $f_T(t) = \frac{a}{\sqrt{2\pi t^3}}\, e^{-a^2/(2t)}$.
  • $E[T]$: since $X_t := \exp(\theta B_t - \theta^2 t/2)$ is a martingale, optional stopping gives $E[X_T] = X_0 = 1$, i.e. $E[e^{-\theta^2 T/2}] = e^{-\theta a}$. In terms of $\lambda = \theta^2/2$, this is the Laplace transform $E[e^{-\lambda T}] = e^{-a\sqrt{2\lambda}}$; letting $\lambda \downarrow 0$ gives $P(T < \infty) = 1$, but the derivative at $0$ diverges, so $E[T] = \infty$.
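
The CDF can be checked against a discretized Brownian path. Discrete monitoring misses crossings between grid points, so the empirical probability slightly undershoots the theoretical one; the tolerance below is loose for that reason (a stdlib-only sketch; $a$, $t$, the step count, and the path count are arbitrary choices):

```python
import math
import random

random.seed(8)
a, t, steps, paths = 1.0, 1.0, 1000, 5000
dt = t / steps

hits = 0
for _ in range(paths):
    w = 0.0
    for _ in range(steps):
        w += random.gauss(0.0, dt**0.5)
        if w >= a:        # barrier a reached before time t
            hits += 1
            break
p_mc = hits / paths
# Theory: P(T <= t) = 2 * Phi(-a / sqrt(t))
p_theory = 2 * 0.5 * (1 + math.erf((-a / t**0.5) / math.sqrt(2)))

print(p_mc, p_theory)
```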

Ito’s Formula - $u(t, X_t)$

Let $X_t$ be an Ito process and let $u(t, x)$ be a twice-continuously differentiable function with $u$ and its partial derivatives bounded. Then $$ du(t, X_t) = \frac{\partial u}{\partial t}(t, X_t)\,dt + \frac{\partial u}{\partial x}(t, X_t)\,dX_t + \frac{1}{2}\frac{\partial^2 u}{\partial x^2}(t, X_t)\,d[X, X]_t. $$
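
As a concrete instance, take $u(t, x) = x^2$ and $X_t = W_t$: then $du = dt + 2W_t\,dW_t$, so $W_T^2 \approx T + \int_0^T 2W_t\,dW_t$. This can be checked on a single discretized path, evaluating the integrand at the left endpoint as the Ito convention requires (a stdlib-only sketch; $T$ and the step count are arbitrary choices):

```python
import random

random.seed(9)
T, steps = 1.0, 1000
dt = T / steps

w, ito_integral = 0.0, 0.0
for _ in range(steps):
    dw = random.gauss(0.0, dt**0.5)
    ito_integral += 2 * w * dw   # integrand 2*W evaluated at the left endpoint
    w += dw

lhs = w * w               # u(T, W_T) - u(0, W_0)
rhs = T + ito_integral    # integral of u_t dt plus integral of u_x dW

print(lhs, rhs)  # the difference shrinks as dt -> 0
```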