As I am taking a Stochastic Processes class at Tulane University, I want to highlight some definitions and results from the textbook Essentials of Stochastic Processes (3rd Edition) by Rick Durrett (Springer, 2016).
Theorem 1.1 Chapman–Kolmogorov Equation
$$p^{m+n}(i,j) = \sum_k p^m(i,k)\,p^n(k,j),$$
where $p^{m+n}(i,j)$ is the $(m+n)$-step transition probability.
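To make this concrete: the $n$-step transition probabilities are the entries of the $n$-th power of the transition matrix, so the Chapman–Kolmogorov equation is just $P^{m+n} = P^m P^n$. A minimal numerical sketch, using a hypothetical 2-state matrix:

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

m, n = 3, 5
Pm = np.linalg.matrix_power(P, m)        # entries p^m(i, j)
Pn = np.linalg.matrix_power(P, n)        # entries p^n(i, j)
Pmn = np.linalg.matrix_power(P, m + n)   # entries p^{m+n}(i, j)

# Chapman-Kolmogorov: p^{m+n}(i, j) = sum_k p^m(i, k) p^n(k, j)
assert np.allclose(Pmn, Pm @ Pn)
```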
Theorem 1.2 Strong Markov Property
Suppose $T$ is a stopping time. Given that $T = n$ and $X_T = y$, any other information about $X_0, X_1, \ldots, X_n$ is irrelevant for predicting the future, and $X_{T+k}, k \ge 0$ behaves like a Markov chain with initial state $y$.
Lemma 1.3
Suppose $P_x(T_y \le k) \ge \alpha > 0$ for all $x$ in the state space $S$ of a Markov chain $X_n$. Then
$$P_x(T_y > nk) \le (1-\alpha)^n.$$
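A quick Monte Carlo sanity check of this bound; the 3-state matrix, target state $y = 0$, and $k = 2$ below are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain; from every x, state 0 is reachable
# within k = 2 steps with probability at least alpha > 0.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.2, 0.5],
              [0.4, 0.1, 0.5]])
y, k, trials = 0, 2, 5_000

def hitting_time(x):
    """T_y = min{n >= 1 : X_n = y}, starting from X_0 = x."""
    state, n = x, 0
    while True:
        state = rng.choice(3, p=P[state])
        n += 1
        if state == y:
            return n

# Estimate alpha = min_x P_x(T_y <= k), then compare tails to (1 - alpha)^n.
alpha = min(np.mean([hitting_time(x) <= k for _ in range(trials)])
            for x in range(3))
for n in (1, 2, 3):
    tail = np.mean([hitting_time(2) > n * k for _ in range(trials)])
    print(n, tail, (1 - alpha) ** n)   # tail should not exceed the bound
```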
Lemma 1.4
If $x \to y$ and $y \to z$, then $x \to z$.
Theorem 1.5
If $\rho_{xy} > 0$ but $\rho_{yx} < 1$, then $x$ is a transient state.
Lemma 1.6
If $x$ is recurrent and $\rho_{xy} > 0$, then $\rho_{yx} = 1$.
Theorem 1.7
If $C$ is a finite, closed, and irreducible set, then all states in $C$ are recurrent.
Theorem 1.8
If the state space $S$ of a Markov chain is finite, then $S$ can be written as the disjoint union $T \cup R_1 \cup \cdots \cup R_k$, where $T$ is the set of transient states and $R_i, 1 \le i \le k$, are closed irreducible sets of recurrent states.
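This decomposition is computable: take the communicating classes (sets of mutually reachable states) and call a class recurrent exactly when it is closed. A minimal sketch on a hypothetical 5-state chain:

```python
import numpy as np

def decompose(P):
    """Split S into transient states and closed irreducible classes."""
    n = len(P)
    reach = (P > 0) | np.eye(n, dtype=bool)
    for k in range(n):                      # Warshall transitive closure
        reach |= reach[:, k:k+1] & reach[k:k+1, :]
    seen, transient, recurrent = set(), set(), []
    for x in range(n):
        if x in seen:
            continue
        cls = {y for y in range(n) if reach[x, y] and reach[y, x]}
        seen |= cls
        # A class is closed iff no positive-probability arrow leaves it.
        closed = all(y in cls for s in cls for y in np.flatnonzero(P[s]))
        if closed:
            recurrent.append(cls)
        else:
            transient |= cls
    return transient, recurrent

# Hypothetical chain: 0 is transient; {1, 2} and {3, 4} are closed
# irreducible recurrent classes.
P = np.array([[0.2, 0.4, 0.0, 0.4, 0.0],
              [0.0, 0.5, 0.5, 0.0, 0.0],
              [0.0, 0.6, 0.4, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.3, 0.7],
              [0.0, 0.0, 0.0, 0.8, 0.2]])
print(decompose(P))   # ({0}, [{1, 2}, {3, 4}])
```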
Lemma 1.9
If $x$ is recurrent and $x \to y$, then $y$ is recurrent.
Lemma 1.10
In a finite closed set, there must be at least one recurrent state.
Lemma 1.11
$$E_x N(y) = \frac{\rho_{xy}}{1 - \rho_{yy}},$$
where $E_x N(y)$ denotes the expected number of visits to state $y$ beginning at state $x$.
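Both sides of Lemma 1.11 can be estimated independently by simulation. A sketch with a hypothetical chain whose state 2 is absorbing, so $y = 1$ is transient and $E_x N(y)$ is finite:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical chain; state 2 is absorbing, so 0 and 1 are transient.
P = np.array([[0.3, 0.5, 0.2],
              [0.4, 0.3, 0.3],
              [0.0, 0.0, 1.0]])

def run(start):
    """Run until absorption; count visits to y = 1 at times n >= 1."""
    visits, state = 0, start
    while state != 2:
        state = rng.choice(3, p=P[state])
        visits += (state == 1)
    return visits

trials = 50_000
from_x = [run(0) for _ in range(trials)]   # start at x = 0
from_y = [run(1) for _ in range(trials)]   # start at y = 1
ExN = np.mean(from_x)                      # E_x N(y)
rho_xy = np.mean([v > 0 for v in from_x])  # P_x(T_y < infinity)
rho_yy = np.mean([v > 0 for v in from_y])  # P_y(return to y)
print(ExN, rho_xy / (1 - rho_yy))          # the two should agree
```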
A transition matrix $P$ is doubly stochastic if its columns sum to 1: $\sum_x p(x,y) = 1$.
Theorem 1.14
If $P$ is a doubly stochastic transition matrix for a Markov chain over $N$ states, then the uniform distribution $\pi(x) = 1/N$, $x = 1, \ldots, N$, is a stationary distribution.
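A quick numerical check, using a hypothetical doubly stochastic matrix (rows and columns both sum to 1):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.4, 0.4],
              [0.3, 0.3, 0.4]])   # hypothetical; columns each sum to 1
pi = np.full(3, 1 / 3)            # uniform distribution pi(x) = 1/N
assert np.allclose(pi @ P, pi)    # pi P = pi, so pi is stationary
```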
Consider sampling from some distribution $\pi$. If we begin in state $x$, we jump to another state $y$ with probability
$$r(x,y) = \min\left\{ \frac{\pi(y)\,q(y,x)}{\pi(x)\,q(x,y)},\ 1 \right\},$$
where $q(x,y)$ is a proposal (jumping) distribution. The transition probability of the chain is thus
$$p(x,y) = q(x,y)\,r(x,y) \quad \text{for } y \ne x.$$
Suppose that $\pi(y)q(y,x) > \pi(x)q(x,y)$; then $r(x,y) = 1$, so
$$\pi(x)p(x,y) = \pi(x)q(x,y)\cdot 1 = \pi(x)q(x,y),$$
and
$$\pi(y)p(y,x) = \pi(y)q(y,x)\,r(y,x) = \pi(y)q(y,x)\cdot\frac{\pi(x)q(x,y)}{\pi(y)q(y,x)} = \pi(x)q(x,y),$$
so we see that the detailed balance condition $\pi(x)p(x,y) = \pi(y)p(y,x)$ is satisfied.
To generate a sample, we can run the chain for a sufficient period of time to reach an equilibrium.
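Here is a minimal Metropolis–Hastings sketch along these lines. The unnormalized target on $\{0, \ldots, 9\}$ and the symmetric nearest-neighbor proposal are hypothetical choices; with a symmetric proposal the $q$ terms cancel in $r(x,y)$:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical unnormalized target pi on {0, ..., 9}; the normalizing
# constant is never needed because r(x, y) only uses ratios of pi.
weights = np.arange(1, 11, dtype=float)

def step(x):
    """One Metropolis-Hastings update with a symmetric +/-1 proposal."""
    y = x + rng.choice([-1, 1])
    if not 0 <= y <= 9:
        return x                                 # off the space: stay put
    r = min(weights[y] / weights[x], 1.0)        # acceptance probability
    return y if rng.random() < r else x

# Burn in to (approximately) reach equilibrium, then collect samples.
x, burn_in, n = 0, 10_000, 100_000
for _ in range(burn_in):
    x = step(x)
samples = np.empty(n, dtype=int)
for i in range(n):
    x = step(x)
    samples[i] = x
print(np.bincount(samples, minlength=10) / n)    # ~ weights / weights.sum()
```

Note that only the ratio $\pi(y)/\pi(x)$ ever enters, which is why the method works with unnormalized targets.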
Consider an irreducible Markov chain with state space $S$. The cycle condition is satisfied if, for any cycle of states $x_0, x_1, \ldots, x_{n-1}, x_n = x_0$ with $p(x_{i-1}, x_i) > 0$ for $1 \le i \le n$, we have
$$\prod_{i=1}^{n} p(x_{i-1}, x_i) = \prod_{i=1}^{n} p(x_i, x_{i-1}).$$
Theorem 1.16
There is a stationary distribution that satisfies the detailed balance condition if and only if the cycle condition holds.
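A brute-force illustration of Theorem 1.16 on a hypothetical 3-state chain (enumerating all simple cycles is only feasible for a handful of states): the chain below passes the cycle condition, and its stationary distribution indeed satisfies detailed balance.

```python
import numpy as np
from itertools import permutations

def cycle_condition(P):
    """Check Kolmogorov's cycle condition over all simple cycles."""
    n = len(P)
    for k in range(2, n + 1):
        for cyc in permutations(range(n), k):
            path = cyc + (cyc[0],)
            fwd = np.prod([P[path[i], path[i + 1]] for i in range(k)])
            bwd = np.prod([P[path[i + 1], path[i]] for i in range(k)])
            if not np.isclose(fwd, bwd):
                return False
    return True

# Hypothetical reversible chain: symmetric random walk on a triangle.
P = np.array([[0.2, 0.4, 0.4],
              [0.4, 0.2, 0.4],
              [0.4, 0.4, 0.2]])
vals, vecs = np.linalg.eig(P.T)                 # left eigenvector for 1
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
flow = pi[:, None] * P                          # pi(x) p(x, y)
print(cycle_condition(P), np.allclose(flow, flow.T))   # True True
```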
If $y$ is a transient state, then $\sum_{n=1}^{\infty} p^n(x,y) < \infty$ for any state $x$ (by Lemma 1.11), so $\lim_{n\to\infty} p^n(x,y) = 0$. We can therefore focus on studying recurrent states.
Lemma 1.17
If $\rho_{xy} > 0$ and $\rho_{yx} > 0$, then $x$ and $y$ have the same period.
Lemma 1.18
If $p(x,x) > 0$, then $x$ has period 1.
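Concretely, the period of $x$ is $\gcd\{n \ge 1 : p^n(x,x) > 0\}$, so Lemma 1.18 is immediate: $p(x,x) > 0$ puts 1 in the set. A rough numerical sketch (scanning powers of $P$ up to a cutoff is a heuristic, not a proof):

```python
import numpy as np
from math import gcd
from functools import reduce

def period(P, x, max_n=100):
    """gcd of the observed return times {n <= max_n : p^n(x, x) > 0}."""
    Q, returns = np.eye(len(P)), []
    for n in range(1, max_n + 1):
        Q = Q @ P
        if Q[x, x] > 0:
            returns.append(n)
    return reduce(gcd, returns)

# Hypothetical deterministic 3-cycle: every state has period 3.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
print([period(P, x) for x in range(3)])   # [3, 3, 3]
```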
Four key assumptions about a chain with transition matrix $P$:
I: $P$ is irreducible
A: All states have period 1 (aperiodic)
R: All states are recurrent
S: There is a stationary distribution $\pi$
Theorem 1.19 Convergence Theorem
Suppose I, A, S hold. Then as $n \to \infty$, $p^n(x,y) \to \pi(y)$.
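Numerically, the theorem says every row of $P^n$ converges to $\pi$. A quick check on a hypothetical chain for which I, A, S hold:

```python
import numpy as np

# Hypothetical irreducible, aperiodic 3-state chain.
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.2, 0.3, 0.5]])
vals, vecs = np.linalg.eig(P.T)                 # pi from the eigenvalue 1
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
Pn = np.linalg.matrix_power(P, 100)
assert np.allclose(Pn, np.tile(pi, (3, 1)))     # every row is ~ pi
print(pi)
```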
Theorem 1.20 Asymptotic Frequency
Suppose I, R hold. Let $N_n(y)$ be the number of visits to $y$ up to time $n$. Then
$$\frac{N_n(y)}{n} \to \frac{1}{E_y T_y}.$$
Theorem 1.21
If I and S hold, then $\pi(y) = \frac{1}{E_y T_y}$, and hence the stationary distribution is unique.
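Theorem 1.21 invites a simulation check: the mean return time to $y$ should be $1/\pi(y)$. A sketch reusing the hypothetical chain from the convergence demo above:

```python
import numpy as np

rng = np.random.default_rng(7)

P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.2, 0.3, 0.5]])

def return_time(y):
    """T_y = min{n >= 1 : X_n = y}, starting from X_0 = y."""
    state, n = y, 0
    while True:
        state = rng.choice(3, p=P[state])
        n += 1
        if state == y:
            return n

trials = 20_000
for y in range(3):
    EyTy = np.mean([return_time(y) for _ in range(trials)])
    print(y, 1 / EyTy)   # should match pi(y) computed above
```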
Theorem 1.22
Suppose I, S hold, and $\sum_x |f(x)|\,\pi(x) < \infty$ for some $f$. Then
$$\frac{1}{n}\sum_{m=1}^{n} f(X_m) \to \sum_x f(x)\,\pi(x).$$
Theorem 1.23 Convergence Theorem
Suppose I, S hold. Then
$$\frac{1}{n}\sum_{m=1}^{n} p^m(x,y) \to \pi(y).$$
Theorem 1.24
Suppose $p$ is irreducible and recurrent. Let $x \in S$ and let $T_x = \inf\{n \ge 1 : X_n = x\}$. Then
$$\mu_x(y) = \sum_{n=0}^{\infty} P_x(X_n = y,\ T_x > n)$$
defines a stationary measure with $0 < \mu_x(y) < \infty$ for all $y$.
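Since $\mu_x(y)$ is the expected number of visits to $y$ at times $0 \le n < T_x$, it can be estimated by direct simulation, and the estimate should satisfy $\mu_x P \approx \mu_x$. A sketch with the same hypothetical chain:

```python
import numpy as np

rng = np.random.default_rng(3)

P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.2, 0.3, 0.5]])

def visits_before_return(x):
    """Count visits to each state at times 0 <= n < T_x, from X_0 = x."""
    counts = np.zeros(3)
    counts[x] = 1                       # the n = 0 term: X_0 = x, T_x > 0
    state = x
    while True:
        state = rng.choice(3, p=P[state])
        if state == x:
            return counts
        counts[state] += 1

trials = 50_000
mu = np.mean([visits_before_return(0) for _ in range(trials)], axis=0)
print(mu, mu @ P)   # mu P ~ mu, and mu(x) = 1 by construction
```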
Consider a Markov chain with state space $S$. Let $A$ and $B$ be subsets of $S$ so that $C = S - (A \cup B)$ is finite. Suppose $h(a) = 1$ for $a \in A$, $h(b) = 0$ for $b \in B$, and that for $x \in C$ we have