
Question

Hint: The correct answer is 0.18

A Markov chain has the transition probability matrix

    P = | 0.3  0.2  0.5 |
        | 0.5  0.1  0.4 |
        | 0.5  0.2  0.3 |

Given the initial probabilities φ1 = φ2 = 0.2 and φ3 = 0.6, what is Pr(X1 = 3, X2 = 1)?
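The hinted answer can be checked numerically. The sketch below assumes the standard convention that the chain starts from the given initial distribution at time 0, i.e. Pr(X0 = i) = φi, so Pr(X1 = 3, X2 = 1) = (Σi φi · P(i, 3)) · P(3, 1):

```python
import numpy as np

# Transition probability matrix (rows = current state, columns = next state)
P = np.array([[0.3, 0.2, 0.5],
              [0.5, 0.1, 0.4],
              [0.5, 0.2, 0.3]])

# Initial distribution over states 1, 2, 3 at time 0
phi = np.array([0.2, 0.2, 0.6])

# Pr(X1 = 3) = sum_i phi_i * P(i, 3); state 3 is 0-based column index 2
p_x1_is_3 = phi @ P[:, 2]     # 0.2*0.5 + 0.2*0.4 + 0.6*0.3 = 0.36

# Pr(X1 = 3, X2 = 1) = Pr(X1 = 3) * P(3, 1); 0-based entry P[2, 0]
answer = p_x1_is_3 * P[2, 0]  # 0.36 * 0.5 = 0.18
print(round(answer, 2))       # 0.18
```

This confirms the hint: first marginalize over the (unobserved) starting state to get Pr(X1 = 3) = 0.36, then apply the Markov property to multiply by the one-step transition probability P(3, 1) = 0.5.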