Stochastic process
Introduction
When you have a function $f$ defined on the integers, say

| $n$ | $f(n)$ |
| --- | --- |
| 0 | 1 |
| 1 | 2 |
| 2 | 5 |

you can think of the "discrete derivative" $\Delta f(n) = f(n) - f(n-1)$:

| $n$ | $f(n)$ | $\Delta f(n)$ |
| --- | --- | --- |
| 0 | 1 | - |
| 1 | 2 | 1 |
| 2 | 5 | 3 |
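A quick Python sketch of the same computation (the array holds just the values of $f$ from the table above):

```python
import numpy as np

# Values of f at n = 0, 1, 2, taken from the table above.
f = np.array([1, 2, 5])

# Discrete derivative: difference between consecutive values of f.
delta_f = np.diff(f)

print(delta_f)  # [1 3], matching the last column of the table
```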
The usual derivative is not a "real" thing in the same sense: it emerges when you take increasingly smaller steps

$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$$
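To see the limit numerically, here is a minimal sketch; $f(x) = x^2$ evaluated at $x = 1$ is an assumed example (its derivative there is $2$), not a function from the text:

```python
# Difference quotient (f(x+h) - f(x)) / h for shrinking step sizes h.
# Example function: f(x) = x**2, whose derivative at x = 1 is 2.
def f(x):
    return x ** 2

x = 1.0
for h in [1.0, 0.1, 0.01, 0.001, 0.0001]:
    quotient = (f(x + h) - f(x)) / h
    print(f"h = {h:<8} difference quotient = {quotient:.6f}")
```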
The problem of recovering the original function from its discrete derivative is solved by summing the increments back up: $f(n) = f(0) + \sum_{k=1}^{n} \Delta f(k)$. In the continuous case the sum becomes an integral.
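A sketch of this recovery, reusing the numbers from the table above:

```python
import numpy as np

f0 = 1                      # f(0), from the table above
delta_f = np.array([1, 3])  # discrete derivative at n = 1, 2

# Recover f by adding the accumulated increments back onto f(0).
f_recovered = f0 + np.concatenate(([0], np.cumsum(delta_f)))

print(f_recovered)  # [1 2 5], the original values of f
```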
Formal definition
A stochastic process is a collection of random variables $\{X_t\}_{t \in T}$ defined on a common probability space $(\Omega, \mathcal{F}, P)$, where:

- $\Omega$ is the sample space (set of all possible outcomes),
- $\mathcal{F}$ is a $\sigma$-algebra (set of possible events),
- $P$ is a probability measure,
- $T$ is an index set (often representing time, e.g., $T = \mathbb{N}$ or $T = [0, \infty)$),
- For each $t \in T$, $X_t$ is a random variable mapping outcomes to states in a measurable space $(S, \mathcal{S})$ (e.g., $S = \mathbb{R}$).
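One way to read the definition: $X_t(\omega)$ depends on two arguments, so fixing $t$ gives a random variable while fixing $\omega$ gives a single realization (a sample path). A minimal sketch of this double role, using coin-flip outcomes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# One outcome omega: a finite sequence of +1 / -1 coin flips.
def draw_omega(n_flips=10):
    return rng.choice([-1, 1], size=n_flips)

# X_t(omega): the sum of the first t flips of omega.
def X(t, omega):
    return omega[:t].sum()

omega = draw_omega()

# Fix omega, vary t: one sample path of the process.
path = [X(t, omega) for t in range(len(omega) + 1)]
print("sample path:", path)

# Fix t, vary omega: samples of the single random variable X_5.
samples_of_X5 = [X(5, draw_omega()) for _ in range(4)]
print("samples of X_5:", samples_of_X5)
```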
A simple example
Let’s explore a basic stochastic process using a random walk.
You start at position $0$ and flip a fair coin at each step:
- Heads (H): Move +1 (right).
- Tails (T): Move −1 (left).
Your position at time $n$ is a random variable $X_n$, forming a sequence $X_0, X_1, X_2, \dots$.

The set $\{X_n\}_{n \geq 0}$, a simple symmetric random walk, is a fundamental stochastic process where each $X_n$ depends on the past coin flips.
Sample Space

The sample space is $\Omega = \{H, T\}^{\mathbb{N}}$, the set of all infinite sequences of coin flips; an outcome $\omega = (\omega_1, \omega_2, \dots)$ records the result of every flip.
The Random Variable
At each step, you move +1 (if Heads) or −1 (if Tails). Starting at $X_0 = 0$ and writing $Z_i$ for the $i$-th step:

- $X_1 = X_0 + Z_1$
- $X_2 = X_1 + Z_2$
- $X_3 = X_2 + Z_3$
- $\dots$

Thus, $X_n = \sum_{i=1}^{n} Z_i$.

Let $Z_i = +1$ if the $i$-th flip $\omega_i$ is Heads and $Z_i = -1$ if it is Tails; the $Z_i$ are independent and each equals $\pm 1$ with probability $1/2$.
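Putting this together, a minimal simulation sketch of the walk (the number of steps and the random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_steps = 20

# The steps Z_i: +1 for Heads, -1 for Tails, each with probability 1/2.
Z = rng.choice([-1, 1], size=n_steps)

# The walk X_n = Z_1 + ... + Z_n, with X_0 = 0 prepended.
X = np.concatenate(([0], np.cumsum(Z)))

print(X)  # one sample path of the simple symmetric random walk
```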
Another example
Now imagine shrinking the step size and the time between steps together, taking steps of size $\pm\sqrt{\Delta t}$ every $\Delta t$ units of time.

In the limit $\Delta t \to 0$, the rescaled random walk converges to Brownian motion.
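A sketch of this limiting procedure (the time horizon, the step counts, and the seed are assumptions made for illustration): simulating the walk with steps of size $\sqrt{\Delta t}$ for increasingly fine $\Delta t$ produces increasingly good approximations of a Brownian path on $[0, 1]$.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

T_horizon = 1.0  # simulate on the time interval [0, T_horizon]

for n_steps in [10, 100, 1000]:
    dt = T_horizon / n_steps

    # Random-walk steps of size +/- sqrt(dt), one every dt units of time.
    Z = rng.choice([-1.0, 1.0], size=n_steps) * np.sqrt(dt)
    W = np.concatenate(([0.0], np.cumsum(Z)))

    # As dt -> 0, these paths approximate Brownian motion; in particular
    # W(1) is approximately standard normal.
    print(f"dt = {dt:.4f}  W(1) = {W[-1]:+.3f}")
```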
Important property
Any function of a stochastic process is itself a stochastic process.
More formally, if $\{X_t\}_{t \in T}$ is a stochastic process and $f$ is a measurable function, then $\{f(X_t)\}_{t \in T}$
is also a stochastic process.
This is because:
- For each fixed time $t$, $f(X_t)$ is a random variable (since it is a measurable function of the random variable $X_t$).
- The collection $\{f(X_t)\}_{t \in T}$, indexed by time, forms a new stochastic process.
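As a concrete sketch, reusing the random walk from above and choosing $f(x) = x^2$ purely as an illustrative function:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# One path of the simple symmetric random walk X_0, ..., X_20.
Z = rng.choice([-1, 1], size=20)
X = np.concatenate(([0], np.cumsum(Z)))

# Applying f(x) = x**2 pointwise gives a new process Y_n = f(X_n):
# each Y_n is again a random variable, so {Y_n} is again a stochastic process.
Y = X ** 2

print("X:", X)
print("Y = f(X):", Y)
```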