Date: Fri, 20 Mar 2015 07:00:00 +0000
<p style="color: #224422; font-family: 'Lucida Bright', Georgia, serif; font-size: medium;"> This episode introduces the idea of a Markov Chain. A Markov Chain has a set of states describing a particular system, and a probability of moving from one state to another along every valid connecting transition. Markov Chains are memoryless, meaning they don't rely on a long history of previous observations: the current state of the system depends only on the previous state and the result of a random outcome.</p>
<p style="color: #224422; font-family: 'Lucida Bright', Georgia, serif; font-size: medium;"> Markov Chains are a useful method for describing non-deterministic systems, capturing the state and transition model of a stochastic system.</p>
<p style="color: #224422; font-family: 'Lucida Bright', Georgia, serif; font-size: medium;"> As examples of Markov Chains, we discuss stop light signals, bowling, and text prediction systems in light of whether or not they can be described with Markov Chains.</p>
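<p style="color: #224422; font-family: 'Lucida Bright', Georgia, serif; font-size: medium;"> As a rough illustration of the idea (not code from the episode), here is a minimal Python sketch of a Markov Chain for a hypothetical stop light. The states and transition probabilities are made up for demonstration; the key point is that the next state is sampled using only the current state.</p>
<pre>
import random

# Hypothetical transition probabilities for a simplified stop light.
# Each state maps to a list of (next_state, probability) pairs; the
# probabilities out of each state sum to 1.
TRANSITIONS = {
    "green":  [("green", 0.7), ("yellow", 0.3)],
    "yellow": [("red", 1.0)],
    "red":    [("red", 0.6), ("green", 0.4)],
}

def step(state):
    """Pick the next state using only the current state (memorylessness)."""
    next_states, probs = zip(*TRANSITIONS[state])
    return random.choices(next_states, weights=probs, k=1)[0]

def simulate(start, n_steps):
    """Walk the chain for n_steps, returning the sequence of states visited."""
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

if __name__ == "__main__":
    print(simulate("green", 10))
</pre>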