A Markov process is a stochastic process in which future outcomes can be predicted conditional on the present state alone. This means that knowledge of past events has no bearing whatsoever on the future; this is called the Markov property. Someone taking a multiple-choice test could be thought of as a Markov process. Assume that this person makes random guesses on every question and therefore has a 1/5 probability of getting each question correct. At any point in the test, as long as you know the student's current score, you can make a prediction of the score after one more question, after three more questions, or at the end of the test. Knowing what guesses were made beforehand will not improve your prediction: knowing that the student reached the current score by guessing all A's or by guessing all C's makes no difference.
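The test-taking example can be sketched as a short simulation (a minimal illustration, not tied to any particular library; the function name and trial count are arbitrary). Because the process is Markov, the expected final score depends only on the current score and the number of questions remaining, never on which guesses produced that score.

```python
import random

def expected_final_score(current_score, questions_left, trials=100_000, p=0.2):
    """Estimate the expected final score given ONLY the current score.

    The Markov property means the guessing history (all A's, all C's,
    or anything else) is irrelevant: the current score is a sufficient
    description of the present state.
    """
    total = 0
    for _ in range(trials):
        score = current_score
        for _ in range(questions_left):
            if random.random() < p:  # 1/5 chance of a correct guess
                score += 1
        total += score
    return total / trials

# With 10 questions left and a current score of 5, the estimate
# converges to 5 + 0.2 * 10 = 7, however the 5 points were earned.
```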

An example of a non-Markov process can be outlined as follows. Imagine you have 10 cards labelled 1-10 and draw them out at random without replacement. Your first four cards happen to be 1, 2, 3, and 8, so your current sum is 14. You know the next draw must bring the sum to at least 18, because the lowest card left in the deck is 4. But if you had no knowledge of the past and only knowledge of the present state, a sum of 14, you could not reach the same conclusion. Knowing only the sum, the four cards drawn could have been 2, 3, 4, and 5, in which case there is a possibility you draw the 1 next and get a sum of 15. Here, knowledge of the history of the process changes your prediction of the future, so it is not a Markov process.
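The card example can be made concrete with a few lines of code (an illustrative sketch; the helper name is invented here). Enumerating the sums reachable on the next draw shows that two histories with the same present sum give different predictions, which is exactly the failure of the Markov property.

```python
# Full deck of cards labelled 1 through 10.
CARDS = set(range(1, 11))

def possible_next_sums(drawn):
    """Sums reachable on the next draw, given the full draw history."""
    remaining = CARDS - set(drawn)
    return {sum(drawn) + card for card in remaining}

# Actual history 1, 2, 3, 8: the lowest remaining card is 4,
# so the smallest possible next sum is 14 + 4 = 18.
min(possible_next_sums([1, 2, 3, 8]))

# A different history with the same sum of 14 leaves the 1 in the
# deck, so a next sum of 14 + 1 = 15 is possible.
min(possible_next_sums([2, 3, 4, 5]))
```

Both histories share the present state "sum is 14", yet they predict different futures, so the sum alone is not enough to describe the process.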