
Question
A probability transition matrix P on a state space S is called doubly stochastic if its column sums are all 1, i.e., Σ_{i∈S} P(i, j) = 1 for every j ∈ S.

(a) If S is finite and P is doubly stochastic, show that all states of the Markov chain are positive recurrent.

(b) If, in addition to the assumptions in part (a), P is also irreducible and aperiodic, then deduce that P^n(i, j) → 1/|S| as n → ∞. What does this tell you about equilibrium distributions of the Markov chain?

(c) If S is countably infinite and P is an irreducible, doubly stochastic transition matrix, show that either all states are null-recurrent or all states are transient. What does this tell you about equilibrium distributions of the Markov chain?
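As a minimal numerical sketch of what parts (a) and (b) assert (not a proof), the snippet below checks, for one small doubly stochastic matrix chosen purely for illustration, that the uniform distribution is stationary and that the matrix powers P^n converge entrywise to 1/|S|. The particular matrix P and the use of NumPy are assumptions made for this example only.

```python
import numpy as np

# A small doubly stochastic matrix on S = {0, 1, 2}: every row AND every
# column sums to 1. All entries are positive, so the chain is irreducible
# and aperiodic, and the conclusion of part (b) should apply.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

assert np.allclose(P.sum(axis=1), 1.0)  # row sums: P is stochastic
assert np.allclose(P.sum(axis=0), 1.0)  # column sums: P is doubly stochastic

# The uniform distribution is stationary: because each column of P sums to 1,
# (1/|S|, ..., 1/|S|) P = (1/|S|, ..., 1/|S|).
uniform = np.full(3, 1.0 / 3.0)
assert np.allclose(uniform @ P, uniform)

# P^n(i, j) should approach 1/|S| = 1/3 for every pair (i, j).
Pn = np.linalg.matrix_power(P, 50)
print(Pn)  # every entry is close to 0.3333...
assert np.allclose(Pn, 1.0 / 3.0, atol=1e-6)
```

The observation the exercise turns on is that column sums equal to 1 make the uniform measure invariant; the script above only confirms this numerically for one concrete example.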