Markov models:
1. Draw the transition digraph corresponding to the stochastic matrix

   A = ( 0.6  0.4 )
       ( 0.5  0.5 )
Label the edges with the transition probabilities.
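One way to sanity-check exercise 1 by hand is to verify that A is row-stochastic and enumerate the labeled edges of the digraph. A minimal sketch (the state names "1" and "2" are an assumption; the exercise does not name the states):

```python
# Sketch for exercise 1: check that A is stochastic and list the labeled
# edges of its transition digraph.  State names "1"/"2" are assumed.
states = ["1", "2"]
A = [[0.6, 0.4],
     [0.5, 0.5]]

# Each row of a stochastic matrix must sum to 1.
for row in A:
    assert abs(sum(row) - 1.0) < 1e-12

# Edges of the digraph: (from, to, probability) for every nonzero entry.
edges = [(states[i], states[j], A[i][j])
         for i in range(len(A)) for j in range(len(A[0])) if A[i][j] > 0]
for src, dst, p in edges:
    print(f"{src} -> {dst}  [label={p}]")
```

Every entry of A is positive, so the digraph has all four edges, including the two self-loops.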
2. Compute the probability of the state sequence X = (sunny, cloudy, snowing, rainy). Hint: use the Markov chain factorization P(x1, ..., xn) = P(x1) · P(x2 | x1) · ... · P(xn | xn−1).
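The exercise gives no weather transition matrix, so the numbers below are purely hypothetical; the sketch only illustrates the Markov factorization P(x1, ..., xn) = P(x1) · ∏ P(xt | xt−1):

```python
# Exercise 2 sketch: probability of a state sequence via the Markov
# factorization.  The initial distribution and transition matrix below are
# hypothetical -- the exercise does not specify them.
init = {"sunny": 0.4, "cloudy": 0.3, "snowing": 0.1, "rainy": 0.2}
P = {  # P[s][t] = probability of moving from state s to state t
    "sunny":   {"sunny": 0.5, "cloudy": 0.3, "snowing": 0.0, "rainy": 0.2},
    "cloudy":  {"sunny": 0.2, "cloudy": 0.4, "snowing": 0.1, "rainy": 0.3},
    "snowing": {"sunny": 0.1, "cloudy": 0.4, "snowing": 0.4, "rainy": 0.1},
    "rainy":   {"sunny": 0.2, "cloudy": 0.4, "snowing": 0.1, "rainy": 0.3},
}

def sequence_probability(seq):
    # P(x1) * P(x2|x1) * ... * P(xn|x(n-1))
    prob = init[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        prob *= P[prev][cur]
    return prob

X = ["sunny", "cloudy", "snowing", "rainy"]
print(sequence_probability(X))  # = 0.4 * 0.3 * 0.1 * 0.1, i.e. about 0.0012
```

With real data, `init` and `P` would come from the problem statement; only the factorization itself is the point here.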
3. A particle moves on a circle through points marked 0, 1, 2, 3, 4 (in clockwise order). At each step, it has probability p of moving one point to the right and 1 − p of moving one point to the left. Let Xn denote its location on the circle after the nth step. The process {Xn, n ≥ 0} is a Markov chain. a) Find the transition probability matrix. b) Is this Markov chain irreducible?
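For exercise 3, the transition matrix can be built directly from the clockwise/counterclockwise rule, and irreducibility can be checked by testing mutual reachability over the edges with positive probability. A sketch, with p as a free parameter (here 0 < p < 1):

```python
# Exercise 3 sketch: circular random walk on n = 5 points.
def circle_matrix(p, n=5):
    # P[i][(i+1) % n] = p (one step clockwise),
    # P[i][(i-1) % n] = 1 - p (one step counterclockwise).
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        P[i][(i + 1) % n] = p
        P[i][(i - 1) % n] = 1 - p
    return P

def irreducible(P):
    # A chain is irreducible iff every state is reachable from every other
    # state along edges with positive probability (depth-first search).
    n = len(P)
    for start in range(n):
        seen, stack = {start}, [start]
        while stack:
            i = stack.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        if len(seen) < n:
            return False
    return True

print(irreducible(circle_matrix(p=0.3)))  # True: all states communicate
```

On a circle every state can reach every other by going around, so the reachability check succeeds, which matches the expected answer to part b).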
4. Each morning an individual leaves his house and goes for a run. He is equally likely to leave by either the front or the back door. Upon leaving, he puts on a pair of running shoes (or runs barefoot if there are no shoes at the door from which he departs). On his return he is equally likely to enter by either the front or the back door, and he leaves his running shoes at that door. If he owns a total of k pairs of running shoes, what proportion of the time does he run barefoot? Write the transition probability matrix for this chain.
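One common encoding for exercise 4 (an assumption of this sketch, not stated in the exercise) takes the state to be the number of pairs at the front door, i = 0, ..., k. The transition probabilities follow from the four equally likely (departure door, return door) combinations, and the barefoot fraction can then be checked numerically from the stationary distribution:

```python
# Exercise 4 sketch: state i = number of shoe pairs at the front door.
def shoe_matrix(k):
    n = k + 1
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # Leave by the front door (prob 1/2):
        if i > 0:
            P[i][i - 1] += 0.25  # took a pair; returns by the back door
            P[i][i] += 0.25      # took a pair; returns by the front door
        else:
            P[i][i] += 0.5       # no shoes at the front: runs barefoot
        # Leave by the back door (prob 1/2); back door holds k - i pairs:
        if i < k:
            P[i][i + 1] += 0.25  # took a pair; returns by the front door
            P[i][i] += 0.25      # took a pair; returns by the back door
        else:
            P[i][i] += 0.5       # no shoes at the back: runs barefoot
    return P

def stationary(P, iters=20000):
    # Power iteration; the chain has self-loops, hence is aperiodic,
    # so the iteration converges to the stationary distribution.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

k = 4
pi = stationary(shoe_matrix(k))
# Barefoot when all pairs are at the other door and he leaves from this one.
barefoot = pi[0] * 0.5 + pi[k] * 0.5
print(barefoot)
```

The numerical stationary distribution comes out (approximately) uniform over the k + 1 states, which is the structure the hand derivation should confirm.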
5. Given the transition matrix:
       A    B    C    D
   A   1    0    0    0
   B  .9    0   .1    0
   C   0   .9    0   .1
   D   0    0    0    1
a) Draw the digraph of the Markov chain. Vertices are states; label each edge with its transition probability.
b) If we start in state B, what is the probability of being in state A after 8 transitions? (If you can't do the matrix multiplications, at least write out the matrix expression indicating the answer.)
c) If we start in state B, what is the probability of being in state D after 4 transitions?
d) If we start in state C, what is the probability of being in state A after 16 transitions? What is the probability of being in state D?
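Parts b) to d) all reduce to entries of matrix powers: the answer to "probability of being in state j after m transitions starting from i" is the (i, j) entry of P^m. A sketch computing the four requested entries (pure-Python matrix multiplication, no libraries assumed):

```python
# Exercise 5 sketch: n-step transition probabilities via matrix powers.
P = [[1.0, 0.0, 0.0, 0.0],   # A (absorbing)
     [0.9, 0.0, 0.1, 0.0],   # B
     [0.0, 0.9, 0.0, 0.1],   # C
     [0.0, 0.0, 0.0, 1.0]]   # D (absorbing)

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, m):
    n = len(P)
    R = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    for _ in range(m):
        R = matmul(R, P)
    return R

A, B, C, D = range(4)
print(matpow(P, 8)[B][A])   # b) start in B, in A after 8 transitions
print(matpow(P, 4)[B][D])   # c) start in B, in D after 4 transitions
print(matpow(P, 16)[C][A])  # d) start in C, in A after 16 transitions
print(matpow(P, 16)[C][D])  # d) start in C, in D after 16 transitions
```

For b), the matrix expression asked for in the exercise is the (B, A) entry of P^8; the code simply evaluates it. Since A and D are absorbing, the long-run probabilities in d) are the absorption probabilities from C.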