Markov Chains

The Markov property: the probability of the next state depends only on the current state, not on the sequence of states that preceded it.
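This property can be written as a conditional-probability statement (a standard formulation, not from the original note), where X_n denotes the state at time step n:

```latex
P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
  = P(X_{n+1} = j \mid X_n = i)
```

In words: conditioning on the full history gives the same transition probability as conditioning on the current state alone.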

Terminology

  • State space: The set of all possible states that the system can be in.
  • Transition probability: The probability of moving from one state to another in a single time step.
  • Transition matrix: A matrix that contains the transition probabilities between all pairs of states.
  • Stationary distribution: A probability distribution over states that remains unchanged as the system evolves; it satisfies π = πP, where P is the transition matrix.
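The terms above can be made concrete with a small sketch. The two-state weather chain, the specific probabilities, and the function names here are illustrative assumptions, not from the original note; the stationary distribution is approximated by repeatedly applying the transition matrix (power iteration).

```python
# Illustrative two-state chain (hypothetical weather example: Sunny, Rainy).
# State space: {0: Sunny, 1: Rainy}.
# Transition matrix: P[i][j] = probability of moving from state i to state j
# in a single time step; each row sums to 1.
P = [
    [0.9, 0.1],  # Sunny -> Sunny 0.9, Sunny -> Rainy 0.1
    [0.5, 0.5],  # Rainy -> Sunny 0.5, Rainy -> Rainy 0.5
]

def step(dist, P):
    """Advance a distribution over states by one time step: dist' = dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary(P, iters=1000):
    """Approximate the stationary distribution by power iteration:
    apply P repeatedly until the distribution stops changing."""
    n = len(P)
    dist = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        dist = step(dist, P)
    return dist

pi = stationary(P)
print([round(p, 4) for p in pi])  # -> [0.8333, 0.1667]
```

For this chain the exact stationary distribution is (5/6, 1/6): solving π = πP with π₁ + π₂ = 1 gives 0.1·π₁ = 0.5·π₂. Applying `step` to `pi` returns `pi` unchanged (up to rounding), which is exactly what "remains unchanged as the system evolves" means.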