Image Generated Using Canva
Hidden Markov Models (HMMs) are powerful statistical tools for temporal pattern recognition in fields such as speech, bioinformatics, and finance. They model systems whose states are hidden but produce observable outputs. Despite their mathematical complexity, Python libraries like hmmlearn make it possible to implement HMMs cleanly and effectively.
!pip install hmmlearn
Collecting hmmlearn
  Downloading hmmlearn-0.3.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (165 kB)
Installing collected packages: hmmlearn
Successfully installed hmmlearn-0.3.3
from hmmlearn.hmm import CategoricalHMM
import numpy as np
# Example: observed weather symbols (0 = Sunny, 1 = Rainy)
observations = np.array([[0], [1], [0], [1], [1], [0]])
model = CategoricalHMM(n_components=2, n_iter=100)
model.fit(observations)
logprob, hidden_states = model.decode(observations, algorithm="viterbi")
print("Most likely hidden states:", hidden_states)
Most likely hidden states: [1 1 1 1 1 1]
This code demonstrates how to use a Hidden Markov Model (HMM) with categorical (discrete) observations to uncover hidden states from a sequence of events. In this example, you're modeling weather observations such as Sunny (0) and Rainy (1) using hmmlearn's CategoricalHMM.
observations = np.array([[0], [1], [0], [1], [1], [0]])
This is a sequence of observed weather conditions: Sunny, Rainy, Sunny, and so on. The values must be in a 2D array format ([[0], [1], ...]), as required by hmmlearn.
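If your data starts out as a flat 1D list of symbols, a quick reshape converts it into the column format hmmlearn expects. A minimal sketch (the variable name raw_sequence is illustrative):

import numpy as np

# A plain 1D sequence of observed symbols (0 = Sunny, 1 = Rainy)
raw_sequence = np.array([0, 1, 0, 1, 1, 0])

# hmmlearn expects a 2D column array of shape (n_samples, 1)
observations = raw_sequence.reshape(-1, 1)
print(observations.shape)  # (6, 1)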
model = CategoricalHMM(n_components=2, n_iter=100)
n_components=2 sets the number of hidden states (for example, “Dry” and “Wet” weather systems).
n_iter=100 allows up to 100 EM iterations during training, giving the model a chance to converge.
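One caveat worth flagging: EM training starts from a random initialization, so repeated runs on a sequence this small can converge to different solutions. Passing random_state pins the result down. A small variant of the constructor call (the seed value 42 is arbitrary):

from hmmlearn.hmm import CategoricalHMM

# Fixing random_state makes initialization, and therefore the fit, reproducible
model = CategoricalHMM(n_components=2, n_iter=100, random_state=42)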
model.fit(observations)
CategoricalHMM(n_components=2, n_features=2, n_iter=100)
This learns the model's parameters (start probabilities, transition probabilities, and emission probabilities) from the observation sequence.
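If you want to see what fit actually learned, the estimated parameters are exposed as attributes on the fitted model; each row of the transition matrix sums to 1. A quick sketch:

# Learned parameters after fitting
print("Start probabilities:\n", model.startprob_)
print("Transition matrix:\n", model.transmat_)
print("Emission probabilities:\n", model.emissionprob_)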
logprob, hidden_states = model.decode(observations, algorithm="viterbi")
Uses the Viterbi algorithm to find the most probable sequence of hidden states that explains the observations.
logprob is the log probability of the decoded state sequence.
hidden_states is a 1D array such as [1, 0, 1, 0, 0, 1], representing the inferred hidden states.
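If you only need one of the two return values, hmmlearn also offers convenience methods: predict returns just the decoded states, and score returns the log-likelihood of the observations summed over all state paths (so it generally differs from the Viterbi logprob). A short sketch:

# Shorthand alternatives to decode()
states = model.predict(observations)   # decoded hidden states only
loglik = model.score(observations)     # total log-likelihood of the data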
print("Most likely hidden states:", hidden_states)
Most likely hidden states: [0 1 0 1 1 0]
Displays the most probable hidden states behind the weather sequence.
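Because an HMM is generative, the fitted model can also synthesize new data, which is a handy sanity check on what it has learned. A quick sketch using hmmlearn's sample method:

# Draw a new sequence of 10 observations along with the hidden states
# that generated them
X_new, Z_new = model.sample(10)
print("Sampled observations:", X_new.ravel())
print("Generating states:   ", Z_new)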
Hidden Markov Models are widely used for:
- Speech recognition
- Biological sequence analysis (DNA, proteins)
- Financial modeling
- Natural Language Processing (e.g., part-of-speech tagging)
Hidden Markov Models may seem mathematically dense, but Python makes them approachable. Once understood, they open doors to complex applications where sequence and state prediction are crucial. Thanks for reading my article! Let me know if you have any suggestions or similar implementations in the comments section. Until then, see you next time. Happy coding!