A system (for instance a processor) processes jobs (such as computations) in a synchronized manner. A waiting room (buffer) allows jobs to be stored before they are processed. The instants of synchronization are numbered by $n \in \mathbb{N}$, and $X_n$ denotes the number of jobs in the system just after time $n$.
Between time $n$ and time $n+1$, a random number $A_{n+1}$ of new jobs arrive, and up to a random number $D_{n+1}$ of the jobs already there can be processed, so that
$$X_{n+1} = (X_n - D_{n+1})^+ + A_{n+1}.$$
In a simple special case, there is an integer $s \ge 1$ s.t. $D_n = s$ for every $n \ge 1$, and it is said that the queue has $s$ servers.
The r.v. $(A_n, D_n)_{n \ge 1}$ are assumed to be i.i.d. and independent of $X_0$, and then Theorem 1.2.3 yields that $(X_n)_{n \ge 0}$ is a Markov chain on $\mathbb{N}$. We assume that
$$\mathbb{P}(A_1 \ge 1) > 0, \qquad \mathbb{P}(A_1 = 0, D_1 \ge 1) > 0,$$
so that there is a closed irreducible class containing $0$, and we consider the chain restricted to this class, which is irreducible. We further assume that $A_1$ and $D_1$ are integrable, and will see that the behavior of the chain then depends on the joint law of $(A_1, D_1)$ essentially only through $m := \mathbb{E}[A_1] - \mathbb{E}[D_1]$.
For $x \in \mathbb{N}$, we have $(x - D_1)^+ + A_1 - x = A_1 - \min(D_1, x)$ and hence $|(x - D_1)^+ + A_1 - x| \le A_1 + D_1$, and $\min(D_1, x) \to D_1$ a.s. as $x \to \infty$, and $A_1 + D_1$ is integrable, and thus
$$\lim_{x \to \infty} \mathbb{E}\big[(x - D_1)^+ + A_1 - x\big] = \mathbb{E}[A_1] - \mathbb{E}[D_1] = m$$
by dominated convergence (Theorem A.3.5). We deduce a number of facts from this observation.
If $m > 0$, then monotone convergence yields a $K$ s.t. $\mathbb{E}[A_1 \wedge K] - \mathbb{E}[D_1] > 0$, and the chain with arrivals and potential departures given by $(A_n \wedge K, D_n)_{n \ge 1}$ is transient, and as this chain is dominated pathwise by $(X_n)_{n \ge 0}$ in the natural coupling, then $(X_n)_{n \ge 0}$ is transient.
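As an illustration of the role of $m = \mathbb{E}[A_1] - \mathbb{E}[D_1]$, the recursion $X_{n+1} = (X_n - D_{n+1})^+ + A_{n+1}$ is easy to simulate. The following sketch (the Bernoulli arrival and service laws are hypothetical choices) shows a queue repeatedly returning to $0$ when $m < 0$, and a queue drifting to infinity when $m > 0$.

```python
import random

def simulate_queue(n_steps, arrival, service, x0=0, seed=0):
    """Simulate X_{n+1} = (X_n - D_{n+1})^+ + A_{n+1} for i.i.d. (A_n, D_n).

    `arrival` and `service` are callables rng -> A_n and rng -> D_n.
    """
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n_steps):
        a, d = arrival(rng), service(rng)
        x = max(x - d, 0) + a
        path.append(x)
    return path

# m = E[A] - E[D] = 0.4 - 0.6 < 0: the queue keeps returning to 0.
stable = simulate_queue(
    10_000,
    arrival=lambda rng: 1 if rng.random() < 0.4 else 0,
    service=lambda rng: 1 if rng.random() < 0.6 else 0,
)

# m = 0.6 - 0.4 > 0: the queue drifts to infinity.
unstable = simulate_queue(
    10_000,
    arrival=lambda rng: 1 if rng.random() < 0.6 else 0,
    service=lambda rng: 1 if rng.random() < 0.4 else 0,
    seed=1,
)
```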
In Theorems 3.3.3 and 3.3.4, the drift hypotheses are quite natural, but the hypotheses controlling the jump amplitudes cannot simply be dropped (though they can be weakened).
We will see this on an example. Consider the queuing system with
$$\mathbb{P}(A_1 = 1, D_1 = 0) = p, \qquad \mathbb{P}(A_1 = 0, D_1 = 1) = 1 - p, \qquad 0 < p < 1.$$
Then, $(X_n)_{n \ge 0}$ is s.t. $P(0,0) = 1 - p$ and $P(0,1) = p$, and $P(x, x-1) = 1 - p$ and $P(x, x+1) = p$ for $x \ge 1$, and is a random walk on $\mathbb{N}$ reflected at $0$. For $f(x) = \theta^x$ and $\theta \in (0,1)$ with $p\theta + (1-p)\theta^{-1} \ge 1$, it holds that $f$ is bounded and positive and vanishes at infinity, and, for $x \ge 1$,
$$Pf(x) = \big(p\theta + (1 - p)\theta^{-1}\big) f(x) \ge f(x),$$
so that the drift hypothesis of Theorem 3.3.4 is satisfied, and if $p > 1/2$ and $\theta \in \big((1-p)/p, 1\big)$, then
$$Pf(x) = \big(p\theta + (1 - p)\theta^{-1}\big) f(x) \le f(x), \qquad x \ge 1,$$
so that the drift hypothesis of Theorem 3.3.3 is satisfied.
Nevertheless, these drift conditions do not by themselves determine the behavior of the chain: we have seen that the reflected random walk is positive recurrent for $p < 1/2$, null recurrent for $p = 1/2$, and transient for $p > 1/2$.
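A quick numerical check of this classification (a sketch; the value $p = 0.3$ is an arbitrary choice): for the walk on $\mathbb{N}$ reflected at $0$, with up-probability $p < 1/2$ and down-probability $q = 1 - p$, the chain is positive recurrent with geometric invariant law $\pi(x) = (1 - p/q)(p/q)^x$, so the empirical fraction of time spent at $0$ should approach $1 - p/q$.

```python
import random

def occupation_of_zero(p, n_steps, seed=1):
    """Fraction of time spent at 0 by the walk on N reflected at 0:
    up w.p. p, down w.p. q = 1 - p for x >= 1, and P(0, 0) = q."""
    rng = random.Random(seed)
    x, at_zero = 0, 0
    for _ in range(n_steps):
        if rng.random() < p:
            x += 1
        elif x > 0:
            x -= 1
        if x == 0:
            at_zero += 1
    return at_zero / n_steps

p, q = 0.3, 0.7
freq0 = occupation_of_zero(p, 200_000)
# Positive recurrent regime: pi(0) = 1 - p/q = 4/7.
```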
The hypothesis in Theorem 3.3.5 and the hypothesis in Theorem 3.3.6 are also quite natural. The latter is enough to obtain the bound it states, but an assumption such as the former must be made in order to conclude to positive recurrence. For instance, let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathbb{N}$ with matrix $P$ s.t.
$$P(0, 1) = 1, \qquad P(x, x - 1) = P(x, x + 1) = 1/2, \quad x \ge 1.$$
For $f(x) = x$ and $F = \{0\}$, it holds that $Pf(x) = f(x)$ for $x \ge 1$, and the hypothesis of Theorem 3.3.6 holds, but the chain is null recurrent, as we have seen for the symmetric nearest-neighbor random walk reflected at $0$.
Let $(X_n)_{n \ge 0}$ be a Markov chain with matrix $P$ and instantaneous laws $\mu_n = \mathcal{L}(X_n)$, let $N \ge 1$ be an integer, and let $Y_n = X_{N - n}$ for $0 \le n \le N$. Then,
$$\mathbb{P}(Y_{n+1} = y \mid Y_n = x) = \frac{\mu_{N-n-1}(y)\, P(y, x)}{\mu_{N-n}(x)}, \qquad \mu_{N-n}(x) > 0,$$
and $(Y_n)_{0 \le n \le N}$ corresponds to a time-inhomogeneous Markov chain with transition matrices given by this formula.
This chain is time homogeneous if and only if $\mu_0$ is an invariant law, that is, if and only if the chain is at equilibrium.
Let $P$ be a transition matrix with an invariant law $\pi$, and let $\hat P$ be its adjoint with respect to $\pi$, given by $\hat P(x, y) = \pi(y) P(y, x) / \pi(x)$. Let $X_0$ have law $\pi$, let $(X_n)_{n \ge 0}$ and $(\hat X_n)_{n \ge 0}$ be Markov chains of matrices $P$ and $\hat P$ started at $X_0$, which are independent conditional on $X_0$, and let $X_{-n} = \hat X_n$ for $n \ge 0$. Then, $(X_n)_{n \in \mathbb{Z}}$ is a stationary Markov chain in time $\mathbb{Z}$ with transition matrix $P$, called the doubly stationary Markov chain of matrix $P$, which can be imagined to be “started” at $-\infty$.
The equality $\hat P = P$ holds if and only if $\pi$ solves the local balance equations (3.2.5).
Then, the chain and its matrix $P$ are said to be reversible (in equilibrium), and $\pi$ is said to be a reversible law for $P$. The equations (3.2.5) are also called the reversibility equations, and their nonnegative and nonzero solutions the reversible measures.
In equilibrium, the probabilistic evolution of a reversible chain is the same in direct or reverse time, which is natural for many statistical mechanics models such as the Ehrenfest Urn.
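For the Ehrenfest Urn with $N$ balls, where $P(x, x + 1) = (N - x)/N$ and $P(x, x - 1) = x/N$ on $\{0, \ldots, N\}$, the Binomial$(N, 1/2)$ law is reversible, and the local balance equations can be checked mechanically; a minimal sketch:

```python
from math import comb

def ehrenfest_local_balance(N):
    """Check pi(x) P(x, x+1) = pi(x+1) P(x+1, x) for the Ehrenfest Urn on
    {0, ..., N}, with pi the Binomial(N, 1/2) law, P(x, x+1) = (N - x)/N,
    and P(x, x-1) = x/N."""
    pi = [comb(N, x) / 2 ** N for x in range(N + 1)]
    return all(
        abs(pi[x] * (N - x) / N - pi[x + 1] * (x + 1) / N) < 1e-12
        for x in range(N)
    )

ok = ehrenfest_local_balance(10)
```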
We gather some simple facts in the following lemma.
This allows us to understand the relations between Theorems 3.2.3 and 3.3.1. Let $P$ be an irreducible recurrent transition matrix on $\mathcal{V}$ and $\mu$ a canonical invariant measure for $P$.
If $\nu$ is a superinvariant measure for $P$, then the function $h = \nu/\mu$ is nonnegative and superharmonic for the adjoint $\hat P$ of $P$ with respect to $\mu$, which is irreducible recurrent, and Theorem 3.3.1 yields that $h$ is constant, that is, Theorem 3.2.3.
Conversely, if $h$ is nonnegative and superharmonic for $\hat P$, then $\nu = h\mu$ is a superinvariant measure for $P$, and Theorem 3.2.3 yields that $h$ is constant, that is, the “only if” part of Theorem 3.3.1.
The following result can be useful in the (rare) cases in which a matrix is not reversible, but its time reversal in equilibrium can be guessed.
A Markov chain on $\mathbb{N}$ (or on an interval of $\mathbb{Z}$) s.t.
$$P(x, y) = 0 \quad \text{if } |x - y| \ge 2$$
is called a birth-and-death chain. All other terms of $P$ are then zero, and this matrix is determined by the birth and death probabilities
$$p_x = P(x, x + 1), \qquad q_x = P(x, x - 1),$$
which satisfy $p_x + q_x \le 1$, with $P(x, x) = 1 - p_x - q_x$; its graph is linear, every transition joining a state to itself or to one of its two neighbors.
Several of the examples we have examined are birth-and-death chains: nearest-neighbor random walks on $\mathbb{Z}$, nearest-neighbor random walks on $\mathbb{N}$ reflected at $0$, the gambler's ruin, and the macroscopic description of the Ehrenfest Urn on $\{0, \ldots, N\}$.
The chain is irreducible on $\mathbb{N}$ if and only if
$$p_x > 0 \ \ \forall x \ge 0, \qquad q_x > 0 \ \ \forall x \ge 1.$$
Similarly, $\{0, \ldots, N\}$ is a closed irreducible class if and only if
$$p_N = 0, \qquad p_x > 0 \ \ \forall\, 0 \le x < N, \qquad q_x > 0 \ \ \forall\, 1 \le x \le N,$$
and $\{M, M + 1, \ldots\}$ is a closed irreducible class if and only if
$$q_M = 0, \qquad p_x > 0 \ \ \forall x \ge M, \qquad q_x > 0 \ \ \forall x > M.$$
In these cases, the restriction of the chain to the class in question is considered. It can be interpreted as describing the evolution of a population that can only increase or decrease by one individual at each step, according to whether there has been a birth or a death of an individual, hence the terminology.
We now give several helpful results, among which a generalization of the gambler's ruin law.
If the birth-and-death chain is irreducible on $\{0, \ldots, N\}$ for a finite $N$, then it is always positive recurrent, and the invariant law is obtained by replacing $\mathbb{N}$ by $\{0, \ldots, N\}$ in the formula.
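On a finite state space, the invariant law of an irreducible birth-and-death chain can be computed from the product formula $\pi(x) \propto \prod_{k=1}^{x} p_{k-1}/q_k$, which follows from local balance. A sketch (the constant birth and death probabilities are hypothetical choices), together with a direct check that $\pi P = \pi$:

```python
def bd_invariant_law(p, q, N):
    """Invariant law of a birth-and-death chain on {0, ..., N} via the
    product formula pi(x) proportional to prod_{k=1}^{x} p_{k-1} / q_k;
    the callables p and q give the birth and death probabilities."""
    w = [1.0]
    for x in range(1, N + 1):
        w.append(w[-1] * p(x - 1) / q(x))
    total = sum(w)
    return [v / total for v in w]

def check_invariance(pi, p, q, N, tol=1e-12):
    """Verify pi P = pi, where the chain moves up w.p. p(x) (except at N),
    down w.p. q(x) (except at 0), and otherwise stays put."""
    for y in range(N + 1):
        mass = 0.0
        for x in range(N + 1):
            up = p(x) if x < N else 0.0
            down = q(x) if x > 0 else 0.0
            stay = 1.0 - up - down
            if y == x + 1:
                mass += pi[x] * up
            if y == x - 1:
                mass += pi[x] * down
            if y == x:
                mass += pi[x] * stay
        if abs(mass - pi[y]) > tol:
            return False
    return True

# Hypothetical example: constant birth probability 0.3, death probability 0.5.
N = 20
pi = bd_invariant_law(lambda x: 0.3, lambda x: 0.5, N)
ok = check_invariance(pi, lambda x: 0.3, lambda x: 0.5, N)
```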
Several exercises of Chapter 1 involve irreducibility and invariant laws.
3.1 Generating functions and potential matrix Let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathcal{V}$ with matrix $P$. For $x$ and $y$ in $\mathcal{V}$, consider the power series, for $0 \le s < 1$,
$$G_{xy}(s) = \sum_{n \ge 0} P^n(x, y)\, s^n.$$
Compute $G_{xx}(s)$ for the nearest-neighbor random walk on $\mathbb{Z}$ with $P(x, x + 1) = p$ and $P(x, x - 1) = 1 - p$. Compute its limit as $s$ increases to $1$, and then the potential matrix. When is this random walk recurrent?
3.2 Symmetric random walks with independent coordinates Let $P_d$ be the transition matrix of the random walk on $\mathbb{Z}^d$ whose $d$ coordinates perform independent symmetric nearest-neighbor random walks on $\mathbb{Z}$, given by $P_d(x, x + v) = 2^{-d}$ for $v \in \{-1, 1\}^d$. Use the potential matrix criterion to prove that the walks for $d = 1$ and $d = 2$ are recurrent and that the walk for $d = 3$ is transient.
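The potential matrix criterion can be explored numerically for this walk: since the $d$ coordinates are independent, $P^{2n}(0, 0) = u_n^d$ with $u_n = \binom{2n}{n} 4^{-n} \sim (\pi n)^{-1/2}$, so the partial sums of $\sum_n P^n(0, 0)$ grow without bound for $d = 1, 2$ and stay bounded for $d = 3$. A sketch:

```python
def one_dim_return_probs(n_max):
    """u_n = P^(2n)(0,0) = C(2n, n) / 4^n for the symmetric walk on Z,
    computed iteratively via u_n = u_{n-1} * (2n - 1) / (2n)."""
    u, out = 1.0, [1.0]
    for n in range(1, n_max + 1):
        u *= (2 * n - 1) / (2 * n)
        out.append(u)
    return out

# For the walk on Z^d with independent symmetric coordinates, all d
# coordinates must return to 0 simultaneously, so P^(2n)(0,0) = u_n ** d.
u = one_dim_return_probs(2000)
partial_sums = {d: sum(x ** d for x in u[1:]) for d in (1, 2, 3)}
```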
3.3 Lemma 3.1.3, alternate proofs Let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathcal{V}$ with matrix $P$, and $x$ and $y$ in $\mathcal{V}$ be s.t. $x$ is recurrent and $x$ leads to $y$.
Deduce from this that $y$ leads back to $x$, and then that $y$ is recurrent. Conclude that $\mathbb{P}_x(N_y = \infty) = 1$.
3.4 Decomposition Find the transient class and the recurrent classes for the Markov chains with graph (every arrow corresponding to a positive transition probability) and matrices given by
3.5 Genetic models, see Exercise 1.8 Decompose each state space into its transient class and its recurrent classes. Prove without computations that the population will eventually be composed of individuals all having the same allele, a phenomenon called “allele fixation.”
3.6 Balance equations on a subset Let $P$ be a transition matrix on $\mathcal{V}$. Prove that, for any subset $A$ of $\mathcal{V}$, an invariant measure $\mu$ satisfies the balance equation
$$\sum_{x \in A} \sum_{y \in A^c} \mu(x) P(x, y) = \sum_{x \in A^c} \sum_{y \in A} \mu(x) P(x, y).$$
3.7 See Lemma 3.1.1 Let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathcal{V}$ having an invariant law $\pi$. Prove that $\pi(x) = \sum_{y} \pi(y) P^n(y, x)$ for every $n \ge 1$. Deduce from this that $\pi(x) = 0$ or $x$ is recurrent.
3.8 Induced chain, see Exercise 2.3 Let be an irreducible transition matrix.
3.9 Difficult advance Let $(X_n)_{n \ge 0}$ be the Markov chain on $\mathbb{N}$ with matrix $P$ given by $P(x, x + 1) = p_x$ for $x \ge 0$ and $P(x, 0) = 1 - p_x$ for $x \ge 0$, with $0 < p_x < 1$ for every $x$.
3.10 Queue with $s$ servers, 1 The evolution of a queue with $s$ servers is given by a birth-and-death chain on $\mathbb{N}$ s.t. $p_x = p$ for $x \ge 0$ and $q_x = \min(x, s)\, q$ for $x \ge 1$, with $p > 0$ and $q > 0$. Let $\rho = p/(sq)$.
3.11 ALOHA The ALOHA protocol was established in 1970 in order to manage a star-shaped wireless network, linking a large number of computers through a central hub on two frequencies, one for sending and the other for receiving.
A signal regularly emitted by the hub allows emissions to be synchronized by cutting time into timeslots of the same duration, sufficient for sending a certain quantity of bits called a packet. If a single computer tries to emit during a timeslot, it is successful, and the packet is retransmitted by the hub to all computers. If two or more computers attempt transmission in a timeslot, then the packets interfere and the attempt is unsuccessful; this event is called a collision. The only information available to the computers is whether there has been at least one collision or not. When there is a collision, the computers involved attempt to retransmit after a random duration. For a simple Markovian analysis, we assume that each retransmission is attempted with probability $\nu$ in each subsequent timeslot, so that this duration is geometric.
Let $X_0$ be the initial number of packets awaiting retransmission, let $(A_n)_{n \ge 1}$ be i.i.d., where $A_n$ is the number of new transmission attempts in timeslot $n$, and let $(B_n^i)_{n, i \ge 1}$ be i.i.d., where $B_n^i = 1$ with probability $\nu$ if the $i$-th packet awaiting retransmission after timeslot $n - 1$ undergoes a retransmission attempt in timeslot $n$, or else $B_n^i = 0$. All these r.v. are independent. We assume that
$$0 < \nu < 1, \qquad \mathbb{P}(A_1 = 0) > 0, \qquad \mathbb{P}(A_1 \ge 2) > 0,$$
and that the number $X_n$ of packets awaiting retransmission after timeslot $n$ defines an irreducible Markov chain $(X_n)_{n \ge 0}$ on $\mathbb{N}$.
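A simulation suggests the instability of the protocol in this form: with a fixed retransmission probability, collisions eventually dominate and the backlog grows. The following sketch makes hypothetical modeling choices (two sources emitting new packets as Bernoulli r.v., new packets transmitted in their arrival slot, and specific parameter values):

```python
import random

def aloha_backlog(nu, p_src, n_slots, seed=2):
    """Backlog after n_slots of a slotted-ALOHA sketch: two sources each emit
    a new packet w.p. p_src per slot, new packets are transmitted at once,
    each backlogged packet retransmits w.p. nu, and a slot succeeds iff
    exactly one packet is transmitted in it."""
    rng = random.Random(seed)
    x = 0
    for _ in range(n_slots):
        new = (rng.random() < p_src) + (rng.random() < p_src)
        retx = sum(rng.random() < nu for _ in range(x))
        if new + retx == 1:
            x += new - 1  # the lone transmission succeeds
        else:
            x += new      # collision or idle slot: arrivals join the backlog
    return x

backlog = aloha_backlog(nu=0.5, p_src=0.3, n_slots=3000)
```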
3.12 Queuing by sessions Time is discrete, the numbers $A_n$ of jobs arriving at time $n \ge 1$ are i.i.d., and $\mathbb{E}[A_1] < \infty$ and $\mathbb{P}(A_1 = 0) > 0$. Session $k \ge 1$ is devoted to servicing exclusively and exhaustively the jobs that were waiting at its start, and has an integer-valued random duration which, conditional on this number of jobs, is integrable, independent of the rest, and has a law not depending on $k$.
3.13 Big jumps Let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathbb{N}$ with matrix $P$ with only nonzero terms $P(0, y)$ for $y \in \mathbb{N}$ and $P(x, x - 1) = 1$ for $x \ge 1$.
3.14 Quick return Let $(X_n)_{n \ge 0}$ be a Markov chain on $\mathcal{V}$ with matrix $P$. Assume that there exist a nonempty subset $F$ of $\mathcal{V}$, a function $f : \mathcal{V} \to \mathbb{R}_+$, and $\varepsilon > 0$ s.t. if $x \notin F$, then $f(x) \ge \varepsilon$ and $Pf(x) \le f(x) - \varepsilon$. (We may assume that $f$ vanishes on $F$.) Let $T = \inf\{n \ge 0 : X_n \in F\}$.
Prove that there exist $a < \infty$ and $b < \infty$ s.t. $\mathbb{E}_x[T] \le a f(x) + b$ for every $x$. Find a finite subset of $\mathbb{N}$ and a function satisfying the above.
3.15 Adjoints Let $P$ be the transition matrix of the nearest-neighbor random walk on $\mathbb{Z}$, given by $P(x, x + 1) = p$ and $P(x, x - 1) = q := 1 - p$. Give the adjoint transition matrix with respect to the invariant measure $x \mapsto (p/q)^x$ and the adjoint transition matrix with respect to the uniform measure.
3.16 Labouchère system, see Exercises 1.4 and 2.10 Let $(X_n)_{n \ge 0}$ be the random walk on $\mathbb{Z}$ with matrix $P$ given by $P(x, x - 2) = p$ and $P(x, x + 1) = 1 - p$.
3.17 Random walk on a graph A discrete set $\mathcal{V}$ is furnished with a (nonoriented) graph structure as follows. The elements of $\mathcal{V}$ are the nodes of the graph, and the elements of a symmetric subset $\mathcal{E}$ of $\mathcal{V} \times \mathcal{V}$ are the links of the graph. The neighborhood of $x$ is given by $\mathcal{N}(x) = \{y \in \mathcal{V} : (x, y) \in \mathcal{E}\}$, and the degree of $x$ is given by its number of neighbors $d(x) = \operatorname{Card} \mathcal{N}(x)$. It is assumed that $1 \le d(x) < \infty$ for every $x$. The random walk on the graph is defined as the Markov chain on $\mathcal{V}$ s.t. if $X_n = x$, then $X_{n+1}$ is chosen uniformly in $\mathcal{N}(x)$.
Find two (nonproportional) invariant measures for the random walk. Prove that the random walk is transient.
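For the random walk on a graph, the degree measure $x \mapsto d(x)$ is always invariant, since $\sum_x d(x) P(x, y) = \sum_{x \in \mathcal{N}(y)} d(x) \cdot \frac{1}{d(x)} = d(y)$. A sketch checking this on a small (hypothetical) graph:

```python
def degree_measure_is_invariant(adj):
    """Check that mu(x) = d(x) is invariant for the random walk on a graph:
    (mu P)(y) = sum over neighbors x of y of d(x) * (1 / d(x)) = d(y)."""
    for y in adj:
        mass = sum(len(adj[x]) * (1 / len(adj[x])) for x in adj if y in adj[x])
        if abs(mass - len(adj[y])) > 1e-9:
            return False
    return True

# Hypothetical small graph: a path 0 - 1 - 2 - 3 with an extra link 1 - 3.
adj = {0: {1}, 1: {0, 2, 3}, 2: {1, 3}, 3: {1, 2}}
ok = degree_measure_is_invariant(adj)
```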
3.18 Caricature of TCP In a caricature of the transmission control protocol (TCP), which manages the window sizes for data transmission in the Internet, any packet is independently received with probability $p \in (0, 1)$, and the consecutive distinct window sizes (in packets) $(W_n)_{n \ge 0}$ constitute a Markov chain on $\{1, 2, \ldots\}$, with matrix $P$ with nonzero terms $P(w, w + 1)$ and $P(w, \lceil w/2 \rceil)$.
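A plausible instance of such a chain (an assumption for illustration, not necessarily the exercise's exact matrix): the window grows by one packet when all $w$ packets of the current window are received, which happens with probability $p^w$, and is halved otherwise. A simulation sketch of the resulting sawtooth behavior:

```python
import random

def tcp_windows(p, n_steps, w0=1, seed=3):
    """Hypothetical TCP caricature: with window w, all w packets are received
    w.p. p**w and the window grows to w + 1; otherwise it shrinks to
    max(w // 2, 1)."""
    rng = random.Random(seed)
    w, path = w0, [w0]
    for _ in range(n_steps):
        w = w + 1 if rng.random() < p ** w else max(w // 2, 1)
        path.append(w)
    return path

path = tcp_windows(p=0.99, n_steps=10_000)
mean_window = sum(path) / len(path)
```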