Hidden Markov Model predicting next observation

I have a sequence of 500 observations of the movements of a bird. I want to predict what the 501st movement of the bird will be. I searched the web, and it seems this can be done with a Hidden Markov Model (HMM), but I have no experience with the subject. Can anyone explain the steps of an algorithm used to solve this problem?

asked Oct 02 '11 by user975733

People also ask

What is observation in hidden Markov model?

Observations are the data we can directly see; the Markov process runs over hidden states, which we cannot observe directly. In the classic weather example, "Rainy" and "Sunny" are the hidden states, while "Walk", "Shop", and "Clean" are the observations.

What are the three fundamental problems that characterize a hidden Markov model?

An HMM is characterized by three fundamental problems: evaluation (how likely is an observation sequence under the model), decoding (what is the most likely hidden state sequence), and learning (how to estimate the model parameters).

What are the two assumptions of the hidden Markov models?

Under the Markov assumption and the independence assumption, an HMM needs only two sets of parameters, modelling P(s_t | s_(t-1)) and P(o_t | s_t). These are called the transition probabilities and the emission probabilities, respectively.
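
As a concrete illustration (not part of the page above), those two parameter sets for the Rainy/Sunny weather example can be written as two row-stochastic matrices; the numbers here are made up:

import numpy as np

# Hidden states: 0 = Rainy, 1 = Sunny
# Observations: 0 = Walk, 1 = Shop, 2 = Clean

# Transition probabilities P(s_t | s_(t-1)); each row sums to 1 (illustrative values).
transition = np.array([[0.7, 0.3],
                       [0.4, 0.6]])

# Emission probabilities P(o_t | s_t); each row sums to 1 (illustrative values).
emission = np.array([[0.1, 0.4, 0.5],    # Rainy
                     [0.6, 0.3, 0.1]])   # Sunny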


1 Answer

x1-x2-x3-x4-x5......x500-x501
|  |  |  |  |       |
y1 y2 y3 y4 y5      y500

x - actual state
y - observations

P(y_i|x_i) - how you think the observation depends on the actual state
P(x_i|x_(i-1)) - how you think the actual state evolves

for i = 1, 2, 3, ..., 501:
    write down the best guess of x_i based on y_i* and x_(i-1)**
you have your solution, since you only care about the last state

* not available in step 501 (there is no observation y_501)
** not available in step 1 (there is no previous state x_0)

The above is the forward part of the forward-backward algorithm ( http://en.wikipedia.org/wiki/Forward-backward_algorithm ), which is a special case of the sum-product algorithm (on Bayesian-network and Markov-network trees) applied to this particular kind of tree: a simple chain with observation nodes hanging off it. You can ignore the "backward" step entirely, since you only care about the last state.
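
For concreteness, here is a minimal sketch of that forward (filtering) pass in Python/NumPy, assuming the bird's movements have been discretized into a finite set of symbols and that the transition matrix A, emission matrix B, and initial distribution pi are already known. All names and the toy numbers are illustrative, not from the original answer:

import numpy as np

def predict_next_observation(obs, A, B, pi):
    # obs : observed symbols y_1..y_T (integers 0..M-1), e.g. the 500 movements
    # A   : (N, N) transition matrix, A[i, j] = P(x_t = j | x_(t-1) = i)
    # B   : (N, M) emission matrix,   B[i, k] = P(y_t = k | x_t = i)
    # pi  : (N,)  initial state distribution P(x_1)
    # Forward pass: alpha tracks P(x_t | y_1..y_t), normalized at each
    # step for numerical stability.
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]
        alpha /= alpha.sum()
    # One prediction step: distribution of the next hidden state,
    # then of the next observation.
    next_state = alpha @ A        # P(x_(T+1) | y_1..y_T)
    return next_state @ B         # P(y_(T+1) | y_1..y_T)

# Toy usage with made-up parameters and a short observation sequence.
A = np.array([[0.8, 0.2], [0.3, 0.7]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
print(predict_next_observation([0, 0, 1, 0, 1, 1], A, B, pi))

Taking the argmax of the returned distribution gives a single best guess for the 501st movement.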

If the transition probabilities in your HMM are unknown, you must either:

  • run a learning algorithm such as EM (known as Baum-Welch when applied to HMMs), or
  • make a naive guess based on domain knowledge (e.g. if your hidden states are DNA bases, you can estimate the transition probabilities given the previous state by manually labeling transitions in DNA data and counting their frequencies; a counting sketch follows this list)
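
For the second option, here is a rough sketch (names and data are illustrative) of estimating a transition matrix by simple counting over a hand-labeled state sequence. For the first option, libraries such as hmmlearn provide Baum-Welch-style fitting, so you rarely need to implement EM yourself:

import numpy as np

def count_transitions(states, n_states):
    # Maximum-likelihood transition matrix from a hand-labeled state sequence.
    counts = np.zeros((n_states, n_states))
    for prev, curr in zip(states[:-1], states[1:]):
        counts[prev, curr] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Rows with no outgoing transitions fall back to a uniform distribution.
    return np.where(row_sums > 0, counts / np.maximum(row_sums, 1), 1.0 / n_states)

# Example: a short hand-labeled sequence over 3 states.
print(count_transitions([0, 0, 1, 2, 2, 1, 0], 3))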
answered Sep 19 '22 by ninjagecko