Hidden Markov Models: The Viterbi Algorithm

Given an observation sequence and a Hidden Markov Model, the Viterbi algorithm determines the most likely sequence of hidden states the model would pass through to generate that observation sequence. It does so by building a matrix of probabilities, starting with the probability that each state is the initial state and emits the first observation. At each later step, the probability of the best path ending in a state is the emission probability of that (state, output symbol) pair times the maximum, over all previous states, of the best path to that previous state times the transition probability into the current state.
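In standard HMM notation (a sketch of the recurrence; the symbols $\pi$, $a$, and $b$ for the initial, transition, and emission probabilities are my own labels, not defined in the text above), this reads:

$$V_1(s) = \pi_s \, b_s(o_1)$$

$$V_t(s) = b_s(o_t) \max_{s'} \left[ V_{t-1}(s') \, a_{s's} \right]$$

The most likely state sequence is then recovered by taking the state that maximizes $V_T(s)$ at the final step and tracing back through the choices that produced each maximum.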

To implement this algorithm, I'll use the example of a dice roller who claims to have a fair die. Given a sequence of observed rolls, we will consider a few HMMs to see whether the sequence we observe was more likely generated by a fair die or by a loaded die. A sketch of the implementation follows below.
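Here is a minimal Python sketch of the Viterbi algorithm applied to the fair-vs-loaded dice setup. The specific starting, transition, and emission probabilities are illustrative assumptions of mine, not values from the original example.

import sys

# States and illustrative probabilities (assumed values, not from the article).
states = ("Fair", "Loaded")

# Assumed probability of starting with each die.
start_p = {"Fair": 0.5, "Loaded": 0.5}

# Assumed probability of switching dice between rolls.
trans_p = {
    "Fair":   {"Fair": 0.95, "Loaded": 0.05},
    "Loaded": {"Fair": 0.10, "Loaded": 0.90},
}

# Fair die: uniform over 1-6; loaded die: heavily favors a 6 (assumed).
emit_p = {
    "Fair":   {face: 1 / 6 for face in range(1, 7)},
    "Loaded": {face: (0.5 if face == 6 else 0.1) for face in range(1, 7)},
}


def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return (probability, path) of the most likely hidden-state sequence."""
    # V[t][s] = probability of the best path ending in state s at time t.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]

    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Best previous state to have transitioned from into s.
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p) for p in states
            )
            V[t][s] = prob
            back[t][s] = prev

    # Trace back from the most probable final state.
    prob, last = max((V[-1][s], s) for s in states)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.insert(0, back[t][path[0]])
    return prob, path


if __name__ == "__main__":
    rolls = [1, 6, 6, 3, 6, 6, 6, 2, 6]
    prob, path = viterbi(rolls, states, start_p, trans_p, emit_p)
    print("Best path probability:", prob)
    print("Most likely dice sequence:", path)

Running this on a roll sequence with many sixes tends to label those stretches "Loaded", which is exactly the comparison between the fair-die and loaded-die hypotheses described above.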
