Let $\{X_n, n \ge 0\}$ be a Markov chain with states $0, 1, 2, 3$. Suppose this Markov chain is irreducible, with transition probabilities $P_{i,j} > 0$ for $i, j = 0, 1, 2, 3$. Let $N$ be the number of transitions, starting from state $0$, until the pattern $1, 2, 3, 1$ appears. That is,

$$N = \min\{n \ge 4 : X_{n-3} = 1,\ X_{n-2} = 2,\ X_{n-1} = 3,\ X_n = 1\}.$$
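As a minimal sketch of what $N$ measures, the simulation below runs the chain from $X_0 = 0$ until $1, 2, 3, 1$ first appears. The numeric matrix `P` and the function name `sample_N` are hypothetical choices for illustration; any irreducible matrix with all entries positive would do.

```python
import random

# Hypothetical transition matrix P_{i,j}; all entries positive (assumption for the demo).
P = [
    [0.25, 0.25, 0.25, 0.25],
    [0.10, 0.20, 0.30, 0.40],
    [0.40, 0.30, 0.20, 0.10],
    [0.25, 0.25, 0.25, 0.25],
]

def sample_N(P):
    """Run the chain from X_0 = 0 until the pattern 1,2,3,1 appears; return N."""
    pattern = [1, 2, 3, 1]
    history = [0]                                   # X_0 = 0
    while history[-4:] != pattern:                  # stop once the last four states are 1,2,3,1
        i = history[-1]
        history.append(random.choices(range(4), weights=P[i])[0])
    return len(history) - 1                         # number of transitions taken

print(sample_N(P))
```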

Please define a new Markov chain to model this process and find its transition probability matrix in terms of the $P_{i,j}$.
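One possible construction, sketched below, augments the state with how much of the pattern $1, 2, 3, 1$ the recent history currently matches: states $(i, 0)$ for $i \in \{0, 2, 3\}$ (no progress, current state $i$), plus the partial matches $1$, $1,2$, $1,2,3$, and the completed pattern as an absorbing state. The state labels, their ordering, and the helper `pattern_chain_matrix` are illustrative assumptions, not necessarily the intended answer; the entries of the new matrix are read directly from `P[i][j]`, i.e. from $P_{i,j}$.

```python
def pattern_chain_matrix(P):
    """Assemble the transition matrix of the augmented pattern-progress chain from P[i][j]."""
    states = ["(0,0)", "(2,0)", "(3,0)", "1", "1,2", "1,2,3", "1,2,3,1"]
    idx = {s: k for k, s in enumerate(states)}
    Q = [[0.0] * len(states) for _ in states]

    # No progress, current state i: a 1 starts the pattern; other states keep progress at 0.
    for i in (0, 2, 3):
        Q[idx[f"({i},0)"]][idx["1"]] = P[i][1]
        for j in (0, 2, 3):
            Q[idx[f"({i},0)"]][idx[f"({j},0)"]] = P[i][j]

    # Progress "1" (current state 1): a 2 extends the match, another 1 restarts it.
    Q[idx["1"]][idx["1,2"]] = P[1][2]
    Q[idx["1"]][idx["1"]] = P[1][1]
    Q[idx["1"]][idx["(0,0)"]] = P[1][0]
    Q[idx["1"]][idx["(3,0)"]] = P[1][3]

    # Progress "1,2" (current state 2): a 3 extends the match, a 1 restarts it.
    Q[idx["1,2"]][idx["1,2,3"]] = P[2][3]
    Q[idx["1,2"]][idx["1"]] = P[2][1]
    Q[idx["1,2"]][idx["(0,0)"]] = P[2][0]
    Q[idx["1,2"]][idx["(2,0)"]] = P[2][2]

    # Progress "1,2,3" (current state 3): a 1 completes the pattern.
    Q[idx["1,2,3"]][idx["1,2,3,1"]] = P[3][1]
    for j in (0, 2, 3):
        Q[idx["1,2,3"]][idx[f"({j},0)"]] = P[3][j]

    # Completed pattern: absorbing, since the chain stops at time N.
    Q[idx["1,2,3,1"]][idx["1,2,3,1"]] = 1.0
    return states, Q

# Example use with the hypothetical P above:
# states, Q = pattern_chain_matrix(P)
```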