Hidden Markov Model
Figure: Structure of a Hidden Markov Model
A Hidden Markov Model (HMM) is a statistical model for sequence data in which an underlying, unobserved process (the hidden states) generates the data we actually observe (the observations). The model connects hidden states and observations through two sets of probabilities: state transition probabilities and emission probabilities.
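To make the two probability sets concrete, here is a minimal toy sketch in Python. The weather/activity setup and every number below are illustrative assumptions, not values from the text: a hidden weather state emits an observable daily activity.

# A minimal toy HMM (hypothetical example): hidden weather states emit daily activities.
states = ["Rainy", "Sunny"]                      # hidden states (never observed directly)
observations = ["walk", "shop", "clean"]         # observable symbols

initial_prob = {"Rainy": 0.6, "Sunny": 0.4}      # pi: probability of starting in each state

transition_prob = {                              # A[i][j] = P(next state j | current state i)
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}

emission_prob = {                                # B[i][o] = P(observation o | state i)
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}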

Components of Hidden Markov Model
Hidden states: Unobserved categories that evolve over time according to a Markov process. The next state depends only on the current state, not on past states.
Observations: The data we actually observe, which are generated probabilistically from the current hidden state.
Initial state distribution: The probability of starting in each hidden state.
Transition (or state) probabilities: A matrix A where A[i, j] is the probability of moving from hidden state i to hidden state j at the next time step.
Emission (or observation) probabilities: A matrix or set B describing the probability of observing a particular symbol (or value) given the current hidden state. Emissions can be discrete (categorical) or continuous (often modeled with distributions such as a Gaussian per state).
Time indices: The model progresses in discrete time steps t = 1, 2, …, T, with a sequence of hidden states X1, X2, …, XT and observations O1, O2, …, OT. A short generative sketch follows this list.
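The components above can be tied together in a short generative sketch. The NumPy code below uses assumed 2-state, 3-symbol parameters: at each step it emits an observation from the current state's row of B, then takes a Markov step using A.

import numpy as np

# Sketch of the generative process: sample hidden states via A, emit observations via B.
# pi, A and B are illustrative assumptions (2 hidden states, 3 observation symbols).
rng = np.random.default_rng(0)

pi = np.array([0.6, 0.4])                 # initial state distribution
A = np.array([[0.7, 0.3],                 # A[i, j] = P(X_{t+1} = j | X_t = i)
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],            # B[i, k] = P(O_t = k | X_t = i)
              [0.6, 0.3, 0.1]])

T = 10
states, obs = [], []
x = rng.choice(len(pi), p=pi)             # draw X_1 from pi
for t in range(T):
    obs.append(rng.choice(B.shape[1], p=B[x]))   # emit O_t given the current hidden state
    states.append(int(x))
    x = rng.choice(len(pi), p=A[x])       # Markov step: next state depends only on x

print("hidden:  ", states)
print("observed:", obs)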
Common notation for a Hidden Markov Model
Number of hidden states: N. Each Xt ∈ {1, …, N}.
Observation space: M possible observation symbols (for discrete emissions) or a continuous space (for Gaussian emissions, etc.).
Parameters to learn: A (transition matrix), B (emission parameters), and π (initial state distribution). A quick shape check of these parameters follows this list.
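As a sanity check on this notation, the short sketch below (with assumed N = 2 and M = 3) verifies that π, A, and B have the expected shapes and that each row is a valid probability distribution.

import numpy as np

# Parameter shapes in the notation above; all numbers are illustrative assumptions.
N, M = 2, 3
pi = np.array([0.6, 0.4])                        # shape (N,)
A = np.array([[0.7, 0.3], [0.4, 0.6]])           # shape (N, N)
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]]) # shape (N, M)

assert pi.shape == (N,) and np.isclose(pi.sum(), 1.0)
assert A.shape == (N, N) and np.allclose(A.sum(axis=1), 1.0)   # each row of A is a distribution
assert B.shape == (N, M) and np.allclose(B.sum(axis=1), 1.0)   # each row of B is a distribution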
Typical tasks of HMM
Evaluation: Given a model and a sequence of observations, compute the probability of the sequence (forward algorithm).
Decoding: Find the most likely sequence of hidden states that produced the observations (Viterbi algorithm). Both evaluation and decoding are sketched after this list.
Learning: Given a sequence of observations (and possibly known states), estimate the model parameters (Baum-Welch / Expectation-Maximization).
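The sketch below implements the evaluation and decoding tasks for a small discrete HMM; the parameters and the observation sequence are assumed toy values, and learning via Baum-Welch is longer and omitted here.

import numpy as np

# Toy parameters (assumed values, matching the earlier sketches).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])
obs = [0, 2, 1]   # observed symbol indices O_1, ..., O_T

def forward(pi, A, B, obs):
    """Evaluation: P(O_1, ..., O_T | model) via the forward algorithm."""
    alpha = pi * B[:, obs[0]]                 # alpha_1(i) = pi_i * B[i, O_1]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]         # alpha_t(j) = sum_i alpha_{t-1}(i) A[i, j] * B[j, O_t]
    return alpha.sum()

def viterbi(pi, A, B, obs):
    """Decoding: most likely hidden state path via the Viterbi algorithm."""
    T, N = len(obs), len(pi)
    delta = np.log(pi) + np.log(B[:, obs[0]])     # log-probabilities avoid underflow
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)       # scores[i, j]: best path ending in i, then i -> j
        psi[t] = scores.argmax(axis=0)            # remember the best predecessor of each state j
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):                 # backtrack through the stored argmaxes
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

print("P(obs)    =", forward(pi, A, B, obs))
print("best path =", viterbi(pi, A, B, obs))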
Common variants of HMM
Discrete HMM: Both hidden states and observations are discrete.
Gaussian HMM: Hidden states are discrete, observations are continuous with Gaussian emissions per state.
Left-to-right HMMs: Often used in speech recognition to enforce a forward progression through states; a sketch of such a transition matrix follows below.
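As an illustration of the last two variants, the sketch below builds an assumed 4-state left-to-right transition matrix (each state can only stay put or advance) and draws one observation from an assumed per-state Gaussian emission; all numbers are illustrative.

import numpy as np

# Left-to-right structure: probability mass only on the diagonal and superdiagonal.
# The self-transition probability (0.8) is an assumption chosen for illustration.
N = 4
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i], A[i, i + 1] = 0.8, 0.2       # stay in state i or move forward to i + 1
A[N - 1, N - 1] = 1.0                     # final state is absorbing

# Gaussian-HMM-style emissions: each state has its own mean and std (assumed values);
# an observation is drawn from the current state's Gaussian instead of a symbol table.
means = np.array([0.0, 2.0, 4.0, 6.0])
stds = np.array([0.5, 0.5, 0.5, 0.5])
rng = np.random.default_rng(0)
state = 2
observation = rng.normal(means[state], stds[state])

print(A)
print(observation)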