If you are a data scientist, or have read a few articles about data science, you have probably heard the term 'Markov' somewhere. This post will help you understand the basic concept of the algorithm in layman's terms.

## Who is Markov and why such a big deal?

I think the following video describes how and why Markov theory came about: please watch *Origin of Markov Chains* by Khan Academy.

## Different types of Markov models

- **Markov chain:** used by systems that are autonomous and have fully observable states
- **Hidden Markov model:** used by systems that are autonomous but whose state is only partially observable
- **Markov decision process:** used by controlled systems with a fully observable state
- **Partially observable Markov decision process:** used by controlled systems where the state is partially observable
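As a minimal sketch of the first (and simplest) model above: a Markov chain needs nothing more than a transition matrix, since the next state depends only on the current one. The weather states and probabilities below are illustrative assumptions, not taken from the original post.

```python
import random

# Hypothetical two-state weather chain (values are illustrative).
# transition[s][t] = P(next state t | current state s); each row sums to 1.
transition = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def simulate(start, steps, seed=0):
    """Walk the chain for `steps` transitions, starting from `start`."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        probs = transition[state]
        # Sample the next state using the current state's row of the matrix
        state = rng.choices(list(probs), weights=list(probs.values()))[0]
        path.append(state)
    return path

print(simulate("Sunny", 5))
```

Because the states here are fully observable, the simulated path *is* the data; the hidden-variable machinery below only becomes necessary when we can observe outputs but not the states themselves.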

## What is hidden?

__Original blog here__. With HMMs, we don't directly know which state corresponds to which physical event; instead, each state produces an observable output. We observe the outputs over time to infer the sequence of hidden states.

Example: If you are staying indoors, you will be dressed a certain way. Let's say you want to step outside; depending on the weather, your clothing will change. Over time, as you grow familiar with the area and climate, you observe the weather and make better judgements about what to wear. In an HMM, we similarly observe the outputs over time and infer the sequence of hidden states most likely to have produced them.

Let us consider a situation where you have no view of the outside world while you are in a building. The only way for you to know whether it is raining outside is to see someone carrying an umbrella when they come in. Here, the evidence variable is the *Umbrella*, while the hidden variable is *Rain*. See the probabilities in the diagram above.

Since this is a Markov model, *R(t)* depends only on *R(t-1)*. A number of related inference tasks ask about the probability of one or more of the latent variables, given the model's parameters and a sequence of observations, which in our scenario is the sequence of *Umbrella* observations.
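One such task, filtering, asks for *P(Rain at time t | umbrella observations so far)* and can be sketched with a forward pass: predict through the transition model, then reweight by the evidence. The probabilities below are illustrative assumptions, since the original diagram is not reproduced here.

```python
# Illustrative parameters (assumed, not from the original diagram):
# P(Rain_t | Rain_{t-1}) and P(Rain_t | not Rain_{t-1})
P_RR, P_RN = 0.7, 0.3
# P(Umbrella | Rain) and P(Umbrella | not Rain)
P_UR, P_UN = 0.9, 0.2

def forward(observations, prior=0.5):
    """Return P(Rain_t | umbrella sightings up to t) for each step t."""
    belief = prior  # P(Rain) before any evidence
    beliefs = []
    for saw_umbrella in observations:
        # Predict: push the current belief through the transition model
        pred = P_RR * belief + P_RN * (1 - belief)
        # Update: weight each state by how well it explains the evidence
        if saw_umbrella:
            num, den = P_UR * pred, P_UR * pred + P_UN * (1 - pred)
        else:
            num = (1 - P_UR) * pred
            den = (1 - P_UR) * pred + (1 - P_UN) * (1 - pred)
        belief = num / den
        beliefs.append(belief)
    return beliefs

print(forward([True, True]))  # beliefs rise toward rain: approx [0.818, 0.883]
```

Note how two umbrella sightings in a row push the belief in rain higher each step: the evidence accumulates through the recursion rather than being judged in isolation.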

HMMs allow us to model processes with a hidden state, based on observable parameters. The main problems solved with HMMs include determining how likely it is that a set of observations came from a particular model, and determining the most likely sequence of hidden states. They are a valuable tool in temporal pattern recognition, where HMMs find applications in speech, handwriting and gesture recognition, musical score following, and sonar detection.
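The second problem above, recovering the most likely hidden-state sequence, is typically solved with the Viterbi algorithm. Here is a minimal sketch on a hypothetical two-state umbrella model; all names and probabilities are illustrative, not from the original post.

```python
# Illustrative two-state HMM (all probabilities assumed for the example)
states = ["Rain", "Dry"]
start = {"Rain": 0.5, "Dry": 0.5}
trans = {"Rain": {"Rain": 0.7, "Dry": 0.3},
         "Dry":  {"Rain": 0.3, "Dry": 0.7}}
emit = {"Rain": {"umbrella": 0.9, "no_umbrella": 0.1},
        "Dry":  {"umbrella": 0.2, "no_umbrella": 0.8}}

def viterbi(obs):
    """Return the hidden-state path most likely to produce `obs`."""
    # best[s] = (probability of the best path ending in s, that path)
    best = {s: (start[s] * emit[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        # For each state, keep only the best predecessor path
        best = {
            s: max(
                ((p * trans[prev][s] * emit[s][o], path + [s])
                 for prev, (p, path) in best.items()),
                key=lambda t: t[0],
            )
            for s in states
        }
    return max(best.values(), key=lambda t: t[0])[1]

print(viterbi(["umbrella", "umbrella", "no_umbrella"]))
```

The key design choice is that Viterbi keeps, per state, only the single best path so far, which is what makes the search over exponentially many sequences tractable.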

Other useful sources:

__What is Hidden in the Hidden Markov Model?__

__From "What is a Markov Model" to "Here is how Markov Models Work"__