Stochastic process

Introduction

When you have a function f, interpreted as a table

x f(x)
0 1
1 2
2 5
you can think of the "discrete derivative" Δf as the table
x f(x) Δf(x)
0 1 -
1 2 1
2 5 3
The usual derivative is not a "real" thing in this table sense: it emerges when you take increasingly smaller steps Δx in the x column and consider the ratio Δf(x)/Δx.
The problem of recovering f from Δf, or even from df/dx, is called integration. So far, so good. But what if Δf(x)/Δx does not correspond to a fixed list of values but, instead, to a random list?
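As a minimal sketch (not part of the original text), the table manipulations above can be written in a few lines of Python, with "integration" as a running sum:

```python
# The table from the text: f(0)=1, f(1)=2, f(2)=5.
f = [1, 2, 5]

# Discrete derivative: Δf(x) = f(x) - f(x-1), defined from the second entry on.
delta_f = [f[i] - f[i - 1] for i in range(1, len(f))]
print(delta_f)  # [1, 3]

# Recovering f from Δf is "integration": start from f(0) and accumulate.
recovered = [f[0]]
for d in delta_f:
    recovered.append(recovered[-1] + d)
print(recovered)  # [1, 2, 5]
```

Replacing the fixed list `delta_f` with random draws is exactly the jump this article is about to make.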

Formal definition

A stochastic process is a collection of random variables {Xt}t∈T defined on a common probability space (Ω,F,P), where T is an index set (often interpreted as time) and each Xt : Ω → ℝ is a random variable on that space.

A simple example

Let’s explore a basic stochastic process using a random walk.
You start at position 0. At each step t = 1, 2, …, you flip a fair coin: if it lands Heads, you move +1; if Tails, you move −1.

Sample Space Ω

Ω contains all infinite sequences of H/T (e.g., (H,T,H,…)). For finite t, we restrict attention to the first t flips. I.e.,

Ω = {H,T}^ℕ.

The Random Variable P2

At each step, you move +1 (if Heads) or −1 (if Tails). Starting at P0 = 0, after 2 steps the possible positions are limited:

P2 can take values in {−2, 0, +2}.
Let ω = (ω1,ω2) be the outcomes of the first two flips (where ωi ∈ {H,T}). Then:

P2(ω) = +2  if ω = (H,H)
         0  if ω = (H,T) or (T,H)
        −2  if ω = (T,T)
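The case analysis above can be checked by brute force: enumerate the four equally likely two-flip outcomes and compute P2 for each (a small illustrative sketch, not from the original text):

```python
from itertools import product

# Each flip moves the walker +1 (Heads) or -1 (Tails).
step = {"H": +1, "T": -1}

# All four equally likely outcomes of two fair flips, and P2 for each.
p2 = {omega: step[omega[0]] + step[omega[1]] for omega in product("HT", repeat=2)}

# Count how often each value of P2 occurs.
counts = {}
for v in p2.values():
    counts[v] = counts.get(v, 0) + 1
print(counts)  # 0 occurs twice: via (H,T) and (T,H)
```

Since each outcome has probability 1/4, this also recovers the distribution of P2: P(P2 = ±2) = 1/4 and P(P2 = 0) = 1/2.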

Another example

Now imagine shrinking the time step Δt → 0 and the position step ΔP → 0 in just the right way, so that the resulting process still has randomness but now varies continuously in time.

The result — in the limit — is the Brownian motion Bt.
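One common discretization (an assumption here, not stated above) takes position steps of size ±√Δt every Δt seconds, which keeps the variance at time 1 equal to 1 as Δt shrinks. A minimal simulation sketch:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def approximate_brownian_path(t_max, dt):
    """Random walk with time step dt and spatial step sqrt(dt);
    as dt -> 0 this approximates a Brownian path on [0, t_max]."""
    n = round(t_max / dt)
    b = [0.0]
    for _ in range(n):
        b.append(b[-1] + random.choice([+1, -1]) * dt ** 0.5)
    return b

path = approximate_brownian_path(1.0, 0.001)
# path[0] is B_0 = 0; path[-1] approximates B_1, which is roughly N(0, 1).
```

Shrinking `dt` further gives finer and finer approximations of the same limiting object, Brownian motion Bt.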

Important property

Any function of a stochastic process is itself a stochastic process.

More formally, if {X(t)}t≥0 is a stochastic process and g is a measurable function, then {Y(t)}t≥0 defined by:

Y(t)=g(t,X(t))

is also a stochastic process.

This is because, for each fixed t, Y(t) = g(t, X(t)) is a measurable function of the random variable X(t), and a measurable function of a random variable is itself a random variable on the same probability space (Ω,F,P).
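As a concrete sketch (the choice g(t, x) = x² is my own illustrative example): take X(t) to be the random walk from earlier and apply g pathwise.

```python
import random

random.seed(1)  # fixed seed for reproducibility

def g(t, x):
    # Any measurable function works; x**2 is just a simple example.
    return x * x

# One sample path of the random walk X(t), t = 0, ..., 10.
x = [0]
for t in range(1, 11):
    x.append(x[-1] + random.choice([+1, -1]))

# The corresponding sample path of Y(t) = g(t, X(t)) = X(t)^2.
y = [g(t, x[t]) for t in range(len(x))]
# y is again one path of a stochastic process, nonnegative by construction.
```

For each fixed outcome ω (here, the sequence of coin flips), the path t ↦ Y(t)(ω) is fully determined, which is exactly what makes {Y(t)} a stochastic process.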