- 1.1 This constitutes a Markov chain on with matrix
- from which the graph is readily deduced. The astronaut can reach any module from any module in a finite number of steps, and hence, the chain is irreducible, and as the state space is finite, this yields that there exists a unique invariant measure . Moreover, and by uniqueness and symmetry, , and hence, . By normalization, we conclude that and .
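In general, the invariant law of a finite irreducible chain can be found numerically by solving the linear system together with the normalization. A minimal sketch, with an assumed 3-state doubly stochastic matrix (the exercise's actual matrix is not reproduced here):

```python
import numpy as np

# Assumed illustrative 3-state transition matrix (not the exercise's):
# irreducible and doubly stochastic, so the invariant law is uniform.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# Solve pi P = pi with sum(pi) = 1, i.e. an overdetermined linear system.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # uniform law (1/3, 1/3, 1/3) here
```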
- 1.2 This constitutes a Markov chain on with matrix
- from which the graph is readily deduced. The mouse can reach any room from any other room in a finite number of steps, and hence, the chain is irreducible, and as the state space is finite, this yields that there exists a unique invariant measure . Solving a simple linear system and normalizing the solution yield .
- 1.3.a The uniform measure is invariant if and only if the matrix is doubly stochastic.
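The "doubly stochastic implies uniform invariant" criterion is easy to check numerically; a small sketch with an assumed doubly stochastic matrix:

```python
import numpy as np

# Assumed doubly stochastic matrix: rows AND columns each sum to 1.
P = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])
n = P.shape[0]
uniform = np.full(n, 1.0 / n)
# Uniform is invariant: each coordinate of uniform @ P is (1/n) * (column sum).
print(np.allclose(uniform @ P, uniform))  # True
```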
- 1.3.b The uniform measure is again invariant for for all .
- 1.3.c Then, , where is the law of the jumps.
- 1.4.a The nonzero terms are and for and ,
- Any wager lost during the game is inscribed on the list and will be wagered again in the future. When the list is empty, the gambler will have won the sum of the terms of the initial list , the other gains cancelling precisely the losses incurred during the game.
- 1.4.b Then, can be written in terms of as
- and the Markov property for yields that the terms of this sum can be written as
- and hence,
- where the nonzero terms of are and for and for .
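The bookkeeping described above can be simulated directly; a sketch assuming the classical Labouchère reading (the stake is the sum of the first and last list entries, a win removes both ends, a loss appends the lost stake), which illustrates that the net gain equals the sum of the initial list whenever the list empties:

```python
import random

def labouchere(initial, p=0.5, max_rounds=10_000, rng=None):
    """Simulate the list-based betting system (assumed reading).
    Returns (net_gain, list_emptied)."""
    rng = rng or random.Random(0)
    lst = list(initial)
    gain = 0
    for _ in range(max_rounds):
        if not lst:
            return gain, True
        stake = lst[0] + lst[-1] if len(lst) > 1 else lst[0]
        if rng.random() < p:                       # wager won
            gain += stake
            lst = lst[1:-1] if len(lst) > 1 else []
        else:                                      # wager lost: append it
            gain -= stake
            lst.append(stake)
    return gain, False

gain, emptied = labouchere([1, 2, 3])
print(gain, emptied)  # gain == 1 + 2 + 3 whenever the list empties
```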
- 1.5.a A natural state space is the set of permutations of , of the form , which has cardinality . By definition
- Clearly, is irreducible. As the state space is finite, this implies existence and uniqueness for the invariant law . Intuition (the matrix is doubly stochastic) or solving a simple linear system shows that is the uniform law, with density .
- 1.5.b A natural state space is the set of cardinality , and
- is clearly irreducible; hence, there is a unique invariant law. The invariant law is the uniform law, with density .
- 1.5.c The characteristic polynomial of is
in which with equality on the left for . Hence, has three distinct roots
Hence, , where
If is even, then , and so that yields .
If is odd, then , and so that yields .
As computing is quite simple, this yields an explicit expression for .
The law of converges to the uniform law at rate , which is maximal for and then takes the value .
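The geometric rate of convergence is governed by the second-largest eigenvalue modulus of the transition matrix; a numerical sketch with an assumed doubly stochastic matrix (the exercise's eigenvalues depend on the elided parameters):

```python
import numpy as np

# Assumed illustrative matrix: P = 0.25*I + 0.25*(all-ones), so its
# eigenvalues are 1 (for the uniform direction) and 0.25 (twice).
P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
eig = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
rate = eig[1]
print(rate)  # 0.25: distance to the invariant law decays like rate**n
```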
- 1.6 The transition matrix is given by
and the graph can easily be deduced from it. Clearly, is irreducible. As the state space is finite, it implies that there is a unique invariant law . This law solves
hence , then . As , normalization yields , and .
The characteristic polynomial of is
Hence, with , , and . Thus, and . The law of converges to at rate .
- 1.7.a States for , , and are wins for Player A, and states for , , and are wins for Player B; these are the absorbing states.
Let and , or and .
Considering all rallies, transitions from to have probability , from to probability , and symmetrically from to probability and from to probability .
Considering only the points scored, transitions from to have probability , from to probability , and symmetrically from to probability and from to probability .
- 1.7.b Straightforward.
- 1.7.c We use the transition for scored points. Player B wins in points if he or she scores first, and we have seen that this happens with probability .
Player B wins in points if Player A scores point and then Player B scores , if Player B scores and then Player A scores and then Player B scores , or if Player B scores points in a row, which happens with probability
Then,
The hyperbola divides the square into two subsets. In the first subset, in which , Player B should go to points (this is the largest subset and contains the diagonal, which is tangent at to the hyperbola). In the other, in which , Player B should go to points.
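The probability that Player B reaches the target score first can be checked by first-step analysis on the scored-points chain; a sketch under a simplified model (assumed for illustration) in which B wins each scored point independently with probability r:

```python
from functools import lru_cache

def prob_B_wins(n, r):
    """Probability that B scores n points before A does, when B wins
    each scored point independently with probability r (assumed model)."""
    @lru_cache(maxsize=None)
    def f(a, b):
        if a == n:
            return 0.0
        if b == n:
            return 1.0
        # one step of the scored-points chain
        return r * f(a, b + 1) + (1 - r) * f(a + 1, b)
    return f(0, 0)

print(prob_B_wins(2, 0.5))  # 0.5 by symmetry
```

Comparing `prob_B_wins` for two different target scores at a given r reproduces the comparison made via the hyperbola above.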
- 1.8.a The microscopic representation yields the macroscopic representation
- 1.8.b Synchronous: the transition from to and the transition from to have probabilities
- Asynchronous: for , the transition from to the vector in which is replaced by and, for , the transition from if to and if to the vector in which the th coordinate is replaced by and the th by , have probabilities
- The absorbing states are the pure states, i.e., populations carrying a single allele.
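Absorption at a pure state can be observed by simulating the synchronous dynamics; a sketch assuming the standard Wright-Fisher reading (each individual of the next generation copies a uniformly chosen current individual):

```python
import random

def wright_fisher(n, x0, rng):
    """Synchronous resampling dynamics (assumed model): x counts the
    individuals carrying allele A among n; each next-generation
    individual carries A with probability x/n, independently."""
    x = x0
    while 0 < x < n:  # 0 and n are the absorbing pure states
        x = sum(rng.random() < x / n for _ in range(n))
    return x

result = wright_fisher(10, 5, random.Random(1))
print(result)  # one of the pure states, 0 or 10
```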
- 1.9.a For instance,
- 1.9.b As then , Theorem 1.2.3 yields that is a Markov chain on with matrix given by and for . This matrix is clearly irreducible, but the state space is infinite, and we cannot yet conclude on existence and uniqueness of an invariant law.
An invariant measure satisfies the equation , which expands into
so that necessarily , and we check that . Moreover, and hence, for , which is a geometric law on .
- 1.9.c As , Theorem 1.2.3 yields that is a Markov chain on , with matrix given by , , and for .
- 1.9.d Then, , and as , . Hence, for ,
- 1.9.e This probability is .
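The geometric invariant law can be verified numerically on a truncation of the state space; a sketch assuming a birth-death form of the matrix (up with probability p, down with probability q = 1 - p, staying at 0 with probability q), which may differ in detail from the exercise's elided matrix:

```python
import numpy as np

p, q = 0.3, 0.7
rho = p / q
N = 200  # truncation level for the numerical check
pi = (1 - rho) * rho ** np.arange(N)   # candidate geometric law

# Compute (pi P)(x) away from the truncation boundary and compare.
piP = np.zeros(N)
piP[0] = pi[0] * q + pi[1] * q
for x in range(1, N - 1):
    piP[x] = pi[x - 1] * p + pi[x + 1] * q
print(np.allclose(piP[:N - 1], pi[:N - 1]))  # True: pi is invariant
```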
- 1.10.a The nonzero terms of are for , , and , where is obtained from by interchanging and . The matrix is clearly irreducible. As the state space is finite, this implies that there exists a unique invariant law . Intuition (the matrix is doubly stochastic) or a simple computation shows that the uniform law with density is invariant.
- 1.10.b The nonzero terms of are, for ,
- The matrix is clearly irreducible. As the state space is finite, this implies that there exists a unique invariant law . A combinatorial computation starting from yields that for . This is a hypergeometric law.
- 1.10.c The Markov property yields that
- and this affine recursion is solved by
- Moreover, and hence,
- Then, at rate .
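The affine recursion above is of the generic form u_{n+1} = a u_n + b with |a| < 1, whose solution converges geometrically at rate |a| to the fixed point b/(1 - a); a quick numerical check with assumed illustrative values:

```python
# Affine recursion u_{n+1} = a*u_n + b with |a| < 1: fixed point
# ell = b / (1 - a), and u_n = ell + a**n * (u_0 - ell).
a, b, u0 = 0.8, 1.0, 10.0
ell = b / (1 - a)

u = u0
for n in range(50):
    u = a * u + b
closed = ell + a ** 50 * (u0 - ell)
print(abs(u - closed) < 1e-9)  # True: iteration matches the closed form
```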
- 1.11.a Theorem 1.2.3 yields that this is a Markov chain.
- 1.11.b Computations, quite similar to those for branching, show that
- 1.11.c Similarly, or by a Taylor expansion,
- which takes the value if or else if .
- 1.12.a Theorem 1.2.3 yields that this is a Markov chain. The irreducibility condition is obvious.
- 1.12.b Then, can be written, using independence, as
- 1.12.c Then, necessarily
- and thus, . For ,
- and identification of the terms yields that .
- 1.12.d As , necessarily , and if , then , and the chain cannot be irreducible (as then ) and thus and as .
- 1.12.e If , then
- and hence, , and using and identifying the terms in in the above-mentioned Taylor expansion yields that .
- 1.13.a By definition and the basic properties of the total variation norm, . For all and , if is such that , then and hence,
- so that .
- 1.13.b Then, for all laws and , all such that , and all , and
- implies that . This yields that . Then, it is a simple matter to obtain that and then that .
- 1.13.c Taking , it holds that , which forms a geometrically convergent series; hence, by a classical argument, is Cauchy, and as the metric space is complete, there is a limit , which is invariant. Then, .
- 1.13.d For all , , and such that , it holds that . It is a simple matter to conclude.
- 1.13.e For all , , and such that , it holds that
- It is a simple matter to conclude.
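The contraction in total variation can be illustrated numerically via the Dobrushin coefficient; a sketch with an assumed 3-state matrix:

```python
import numpy as np

def tv(mu, nu):
    """Total variation distance between two laws on a finite set."""
    return 0.5 * np.abs(mu - nu).sum()

# Dobrushin coefficient c(P) = max over pairs of rows of their TV
# distance; one step of the chain contracts TV by at least this factor.
P = np.array([[0.5, 0.3, 0.2],     # assumed illustrative matrix
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])
c = max(tv(P[x], P[y]) for x in range(3) for y in range(3))

mu = np.array([1.0, 0.0, 0.0])
nu = np.array([0.0, 0.0, 1.0])
print(tv(mu @ P, nu @ P) <= c * tv(mu, nu))  # True by the contraction
```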