Stochastic process
See stochastic calculus.
Formal definition
A stochastic process is a collection of random variables $\{X_t\}_{t \in T}$ defined on a common probability space $(\Omega, \mathcal{F}, P)$, where:

- $\Omega$ is the sample space (set of all possible outcomes),
- $\mathcal{F}$ is a $\sigma$-algebra (set of possible events),
- $P$ is a probability measure,
- $T$ is an index set (often representing time, e.g., $T = \mathbb{N}$ or $T = [0, \infty)$).
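Equivalently (a standard reformulation, not spelled out in the text above, with the state space written as $\mathbb{R}$ for concreteness), the whole collection can be viewed as a single function of two arguments:

```latex
% A stochastic process as a map of two arguments:
%   for each fixed t,      X(t, .) = X_t  is a random variable;
%   for each fixed omega,  X(., omega)    is a sample path (a function of time).
X : T \times \Omega \to \mathbb{R},
\qquad
X_t(\omega) := X(t, \omega) \quad \text{for } t \in T,\ \omega \in \Omega.
```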
A simple example: random walk
Let’s explore a basic stochastic process using a random walk (it is an example of both a Markov process and a Markov chain).
You start at position $X_0 = 0$. At each time step, you flip a fair coin:
- Heads (H): Move +1 (right).
- Tails (T): Move −1 (left).
Your position at time $n$ is a random variable $X_n$, forming a sequence $X_0, X_1, X_2, \ldots$.
The sequence $\{X_n\}_{n \geq 0}$, a simple symmetric random walk, is a fundamental stochastic process in which each random variable depends on the past coin flips. Who is who in this example?
Sample Space
The sample space is $\Omega = \{H, T\}^{\mathbb{N}}$, the set of all infinite sequences of coin flips, e.g., $\omega = (H, T, T, H, \ldots)$.
The Random Variable
At each step, you move +1 (if Heads) or −1 (if Tails). Starting at $X_0 = 0$:

- $X_1$: your position after the first flip,
- $X_2$: your position after two flips,
- $X_3$: your position after three flips,
- $X_n$: your position after $n$ flips.

Thus, $X_n(\omega) = \sum_{i=1}^{n} Z_i(\omega)$, where $Z_i(\omega) = +1$ if the $i$-th flip of the outcome $\omega$ is Heads and $Z_i(\omega) = -1$ if it is Tails.

Let $\omega \in \Omega$ be fixed; then every position $X_n(\omega)$ is determined, so for each $n$ the map $X_n : \Omega \to \mathbb{Z}$ is a random variable.
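To make this concrete, here is a short simulation sketch (not from the original text) of the simple symmetric random walk using NumPy; the names `simulate_walk` and `n_steps` are illustrative.

```python
import numpy as np

def simulate_walk(n_steps: int, rng: np.random.Generator) -> np.ndarray:
    """Simulate one path X_0, X_1, ..., X_n of the simple symmetric random walk."""
    # Each Z_i is +1 (Heads) or -1 (Tails), each with probability 1/2.
    steps = rng.choice([+1, -1], size=n_steps)
    # X_n = Z_1 + ... + Z_n, with X_0 = 0 prepended.
    return np.concatenate(([0], np.cumsum(steps)))

rng = np.random.default_rng(seed=0)
path = simulate_walk(n_steps=10, rng=rng)
print(path)  # one realization of X_0, ..., X_10
```

Each call with a fresh random generator produces a different path, i.e., a different outcome $\omega$.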
Another example
Now imagine shrinking the step size: every $\Delta t$ units of time you take a step of size $\pm\sqrt{\Delta t}$, and you let $\Delta t \to 0$.

The result, in the limit, is Brownian motion (the Wiener process $\{W_t\}_{t \geq 0}$).
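As a rough illustration (a sketch, not part of the original text), the rescaled walk can be simulated directly; for small `dt` its paths look like Brownian motion, and the value at time 1 is approximately standard normal.

```python
import numpy as np

def scaled_walk(T: float, dt: float, rng: np.random.Generator) -> np.ndarray:
    """Random walk taking +/- sqrt(dt) steps every dt time units on [0, T]."""
    n = int(T / dt)
    steps = rng.choice([1.0, -1.0], size=n) * np.sqrt(dt)
    return np.concatenate(([0.0], np.cumsum(steps)))

rng = np.random.default_rng(seed=1)
W_approx = scaled_walk(T=1.0, dt=1e-4, rng=rng)  # rough Brownian path on [0, 1]
print(W_approx[-1])  # approximately N(0, 1) distributed
```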
Important property
Any function of a stochastic process is itself a stochastic process.
More formally, if $\{X_t\}_{t \in T}$ is a stochastic process and $f$ is a (measurable) function, then the collection $\{Y_t\}_{t \in T}$ defined by $Y_t = f(X_t)$ is also a stochastic process.
This is because:
- For each fixed time $t$, $Y_t = f(X_t)$ is a random variable (since it is a function of the random variable $X_t$).
- The collection $\{Y_t\}_{t \in T}$, indexed by time, forms a new stochastic process.
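For instance (an illustrative sketch, with $f(x) = x^2$ chosen arbitrarily), applying a function pointwise in time to a simulated random-walk path yields a new process $Y_n = f(X_n)$:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
steps = rng.choice([+1, -1], size=20)
X = np.concatenate(([0], np.cumsum(steps)))  # the random walk X_0, ..., X_20

# Apply f pointwise in time: Y_n = f(X_n). For each fixed n, Y_n is a random
# variable, and the collection {Y_n} is itself a stochastic process.
Y = X ** 2
print(X)
print(Y)
```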
Curve-valued random variables
Given a stochastic process $\{X_t\}_{t \in T}$, each fixed outcome $\omega \in \Omega$ determines an entire function of time, $t \mapsto X_t(\omega)$, called a sample path (or trajectory) of the process.

Therefore, we can consider the process as a single map $\omega \mapsto (t \mapsto X_t(\omega))$ from $\Omega$ into the set of functions from $T$ to $\mathbb{R}$, i.e., into a set of curves.

We can endow this set with a $\sigma$-algebra (for example, the one generated by the cylinder sets), so that this map is measurable and the whole process becomes a single curve-valued random variable.
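In code, this point of view corresponds to treating an entire simulated path as one object. The sketch below (illustrative, not from the original text) draws several outcomes $\omega$ and stores the whole curve $n \mapsto X_n(\omega)$ for each one as a row of an array.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n_paths, n_steps = 5, 50

# Each row is one outcome omega: an entire sample path (a "curve") of the walk.
steps = rng.choice([+1, -1], size=(n_paths, n_steps))
paths = np.concatenate((np.zeros((n_paths, 1), dtype=int),
                        np.cumsum(steps, axis=1)), axis=1)

print(paths.shape)  # (5, 51): five curve-valued samples, each of length 51
```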