Using Hidden Markov Models

For Unsupervised Learning

Agoston Torok

@torokagoston

data scientist @SynetiqLab

satRday conference

Budapest, 09/03/2016

slides: https://github.com/Synetiq/satRdays_talk

We hypothesize latent states/regimes underlying the time series (for example, inferring past river flow from tree rings [1])

\[ \begin{array}{cccc} X_{1}\overset{P_T}{\longrightarrow} & X_{2}\overset{P_T}{\longrightarrow} & \dots \overset{P_T}{\longrightarrow} & X_N\\ \downarrow P_O & \downarrow P_O & & \downarrow P_O \\ O_1 & O_2 & \dots & O_N\end{array} \]

- We cannot directly observe the latent \( X \) time series, but we can infer it
- We have a series of observations \( O \) generated by the latent states \( X \)
- Inference relies on the transition probability matrix \( P_T \) and the emission probability matrix \( P_O \)
- Estimation alternates passes between \( O \), \( X \), and the model parameters
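The structure above can be sketched in a few lines: a minimal discrete HMM with two latent states, a transition matrix \( P_T \), an emission matrix \( P_O \), and the forward algorithm to score an observation sequence. All numbers and labels here are illustrative assumptions, not values from the talk.

```python
# Illustrative 2-state HMM; matrices are made-up example values.
P_T = [[0.7, 0.3],   # P_T[i][j] = P(X_{t+1}=j | X_t=i), latent transitions
       [0.4, 0.6]]
P_O = [[0.9, 0.1],   # P_O[i][k] = P(O_t=k | X_t=i), emissions
       [0.2, 0.8]]
init = [0.5, 0.5]    # initial latent-state distribution

def forward(obs):
    """Forward algorithm: likelihood P(O | model) of an observation sequence."""
    # Initialize with the first observation's emission probabilities.
    alpha = [init[s] * P_O[s][obs[0]] for s in range(2)]
    # Propagate through the chain: transition, then emit.
    for o in obs[1:]:
        alpha = [sum(alpha[p] * P_T[p][s] for p in range(2)) * P_O[s][o]
                 for s in range(2)]
    return sum(alpha)

print(forward([0, 0, 1]))  # likelihood of observing 0, 0, 1
```

The same pass structure underlies decoding (Viterbi) and learning (Baum-Welch), which alternate between the observations, the inferred states, and re-estimated parameters.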

- Results show that different emotions have distinct physiological correlates [7]
- It is therefore reasonable to assume that the transition probabilities are not uniform
- Some emotions last longer (fear); others fade quickly (surprise)
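Non-uniform transitions encode exactly this: a state's self-transition probability determines its expected dwell time, which is geometrically distributed with mean \( 1/(1 - p_{self}) \). The emotion labels and probabilities below are illustrative assumptions, not fitted values.

```python
# Hypothetical self-transition probabilities for two emotional states.
self_transition = {"fear": 0.9, "surprise": 0.5}

for emotion, p in self_transition.items():
    # Geometric dwell time: expected number of steps spent in the state.
    expected_steps = 1.0 / (1.0 - p)
    print(f"{emotion}: expected duration {expected_steps:.1f} time steps")
```

With these example values fear persists about ten time steps on average, while surprise lasts only about two, matching the intuition that fear lingers and surprise fades fast.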

“Emotions are just as easy to access as any Higgs boson.”
