
Question 4
A Markov chain has the transition probability matrix

    P = | 0.2  0.6  0.2 |
        | 0.5  0.1  0.4 |
        | 0.1  0.7  0.2 |

In the long run, what proportion of time is spent by the Markov chain in state 1?
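The long-run proportion of time spent in state 1 is the first component of the stationary distribution π, which satisfies πP = π together with the normalization Σπᵢ = 1. One way to solve this is as a small linear system; a minimal NumPy sketch (the variable names are illustrative, not from the original question):

```python
import numpy as np

# Transition probability matrix from the question
P = np.array([[0.2, 0.6, 0.2],
              [0.5, 0.1, 0.4],
              [0.1, 0.7, 0.2]])

# The stationary distribution pi solves pi @ P = pi with sum(pi) = 1.
# Transposing gives (P^T - I) pi^T = 0; we append the normalization
# row of ones and solve the (consistent) overdetermined system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)      # stationary distribution over states 1, 2, 3
print(pi[0])   # long-run proportion in state 1: 11/37 ≈ 0.297
```

Solving by hand gives the same result: eliminating π₂ and π₃ from the balance equations yields π₁ = 11/37 ≈ 0.297, so roughly 29.7% of the time is spent in state 1.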