Chapter 4

Renewal Models

Renewal theory was developed during the studies carried out on practical problems that arise from the degradation and replacement of components of complex systems. The fundamental theory can be found in [FEL 66], while [COX 62] is a monograph on the topic.

4.1. Fundamental concepts and examples

Note that, very often, one can naturally associate with an arbitrary stochastic process sequences of i.i.d. r.v.

Let X_1, X_2, ... be a sequence of real i.i.d. r.v. with the same d.f. F, F(0−) = 0, F(0) < 1, defined on a probability space (Ω, ℱ, P). The sequence (S_n, n ∈ ℕ), where

[4.1] S_0 = 0, S_n = X_1 + X_2 + ⋯ + X_n, n ≥ 1,

is called a renewal chain (process).

In the applications that led to renewal theory, the r.v. S_1, S_2, … represent the replacement (renewal) times of worn-out components, while X_1, X_2, … are the working times (the times between successive arrivals). The expected value μ = E(X_k), which exists and can possibly be +∞, is called the mean lifetime of the components.

The counting function, i.e. the number of renewals during a time interval [0, t], is an r.v. defined by

[4.2] N(t) = max{n ∈ ℕ_+ | S_{n−1} ≤ t}, t ≥ 0,

and, for n ≥ 1, we have

P(N(t) > n) = P(S_n ≤ t) = F^{*n}(t),

where F^{*n} (the n-fold Stieltjes convolution of F) is the distribution function of the r.v. S_n. More precisely,

F^{*0}(t) = 1 for t ≥ 0, F^{*n}(t) = ∫_0^t F^{*(n−1)}(t − y) dF(y), n ≥ 1.

REMARK 4.1.–

1. In the particular case where F(t) = 1 − e^{−λt}, t ≥ 0, the counting process (N(t) − 1, t ≥ 0) is nothing but a Poisson process.

2.We consider that the time t = 0 is a renewal time, i.e. N(0) = 1, which is not the case for a Poisson process.
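As a quick illustration of [4.1], [4.2], and Remark 4.1 (a simulation sketch, not from the original text; the function name and parameter values are mine): for exponential lifetimes, N(t) − 1 should be Poisson of mean λt, so E(N(t)) = 1 + λt.

```python
import random

def count_renewals(t, sample_lifetime):
    """N(t) of [4.2]: number of renewals in [0, t], counting the one at S_0 = 0."""
    n, s = 1, 0.0                    # N(0) = 1, as in Remark 4.1
    while True:
        s += sample_lifetime()       # S_n = X_1 + ... + X_n, cf. [4.1]
        if s > t:
            return n
        n += 1

random.seed(7)
lam, t, runs = 2.0, 5.0, 20000
mean_n = sum(count_renewals(t, lambda: random.expovariate(lam))
             for _ in range(runs)) / runs
# N(t) - 1 should be Poisson(lam * t), hence E(N(t)) = 1 + lam * t = 11
```

With the seed above, `mean_n` lands within sampling error of 11.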

The expected value of the counting function at time t is called the renewal function and we denote it by U(t). We have

[4.3] U(t) = E(N(t)) = Σ_{n≥0} F^{*n}(t), t ≥ 0.

Obviously, the mean number of renewals during the interval (a, b] is U(b) − U(a).

Note that there exists at least one renewal in the time interval (0, t] if and only if X_1 ≤ t. Consequently, the mean number of renewals in [0, t], given that X_1 = y ≤ t, is 1 plus the mean number of renewals during the interval (y, t]. Therefore,

E(N(t) | X_1 = y) = 1 + U(t − y), 0 ≤ y ≤ t,

which implies that U(t) verifies the equation

[4.4] U(t) = 1 + ∫_0^t U(t − y) dF(y).

This equation is a particular case of what is called a renewal equation, that has the form

[4.5] g(t) = h(t) + ∫_0^t g(t − y) dF(y),

where F is a distribution function on ℝ_+, while h and g are functions that are bounded on finite intervals of ℝ_+, such that g(t) = h(t) = 0, t < 0.

THEOREM 4.2.–

(a) U is a non-decreasing and right-continuous function, U(0) = 1 (see [4.3]) and U(t) < ∞ for all t ∈ ℝ_+;

(b) Let h : ℝ → ℝ, h(t) = 0 for all t < 0, be such that sup_{0≤s≤t} |h(s)| is finite for all t ∈ ℝ_+. Then

g(t) = ∫_[0,t] h(t − y) dU(y)

is the unique solution of the renewal equation [4.5] within the class of functions that vanish on (−∞, 0) and are bounded on [0, t], t > 0.

EXAMPLE 4.3.– Let us assume that F is an exponential distribution Exp(λ). Then, F(dy) = λe^{−λy} dy, y ≥ 0, and, for n ≥ 1, it can be proved by induction or by means of the Laplace transform that

F^{*n}(t) = ∫_0^t (λ^n y^{n−1}/(n − 1)!) e^{−λy} dy (Gamma distribution),

so

Σ_{n≥1} F^{*n}(t) = λt, t ≥ 0.

Thus, we obtain that

U(t) = 1 + λt, t ≥ 0.

We will assume throughout this section that μ = E(X_1) < ∞.

THEOREM 4.4.– (Strong law of large numbers) We have

[4.6] lim_{t→∞} N(t)/t = 1/μ a.s.

THEOREM 4.5.– (Central limit theorem) If σ² = Var(X_1) < ∞, then

[4.7] (N(t) − t/μ)/√(σ²t/μ³) converges in distribution to N(0, 1), as t → ∞.

THEOREM 4.6.– (Elementary renewal theorem) We have

[4.8] lim_{t→∞} U(t)/t = 1/μ.
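The elementary renewal theorem is easy to check numerically (a sketch with parameter values of my choosing, not from the original text): with Uniform(0, 1) lifetimes, μ = 1/2, so U(t)/t should approach 2.

```python
import random

def n_renewals(t, sample):
    n, s = 1, 0.0                    # the renewal at time 0 is counted
    while True:
        s += sample()
        if s > t:
            return n
        n += 1

random.seed(1)
t, runs = 200.0, 2000
mu = 0.5                             # mean of Uniform(0, 1) lifetimes
u_t = sum(n_renewals(t, random.random) for _ in range(runs)) / runs
rate = u_t / t                       # U(t)/t, close to 1/mu = 2 by [4.8]
```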

THEOREM 4.7.– (Blackwell’s renewal theorem) If F is not arithmetic,1 then for all h > 0 we have

[4.9] lim_{t→∞} [U(t + h) − U(t)] = h/μ.

If F is arithmetic on {nd; n ∈ ℕ}, then [4.9] is valid, provided that h is a multiple of d.

Blackwell’s renewal theorem is equivalent to the key renewal theorem for directly Riemann integrable functions introduced by Feller. We shall briefly present the directly Riemann integrable (DRI) functions. Let g be a function defined on [0, +∞). For any a > 0 and positive integer n, let us denote by M_n(a) the supremum and by m_n(a) the infimum of g on [(n − 1)a, na]. The function g is said to be DRI if the sums a Σ_{n≥1} M_n(a) and a Σ_{n≥1} m_n(a) are finite for every a > 0, and

lim_{a→0} a Σ_{n≥1} (M_n(a) − m_n(a)) = 0.

PROPOSITION 4.8.– A sufficient condition for a function g to be DRI is that

a) g(t) ≥ 0 for any t ≥ 0;

b) g is non-increasing;

c) ∫_0^∞ g(t) dt < ∞.

THEOREM 4.9.– (Key renewal theorem) If F is not arithmetic and g : ℝ_+ → ℝ is DRI, then we have

[4.10] lim_{t→∞} ∫_0^t g(t − y) dU(y) = (1/μ) ∫_0^∞ g(y) dy.

In the case where F is arithmetic of period d and Σ_{n≥0} g(t + nd) < ∞, then

[4.11] lim_{n→∞} ∫_0^{t+nd} g(t + nd − y) dU(y) = (d/μ) Σ_{k≥0} g(t + kd).

REMARK 4.10.– Let us consider the case where Z : ℝ_+ → ℝ satisfies the renewal equation

Z(t) = z(t) + ∫_0^t Z(t − y) dL(y),

with z : ℝ_+ → ℝ a given function and L an improper distribution function (i.e. L(∞) < 1), such that L(0) = 0. For this renewal equation, Theorem 4.9 cannot be applied.

Let us assume that there exists a k > 0 such that ∫_0^∞ e^{ky} dL(y) = 1 and let

μ^# = ∫_0^∞ y e^{ky} dL(y); then ([FEL 66], vol. II, p. 362) we have lim_{t→∞} e^{kt} Z(t) = (1/μ^#) ∫_0^∞ e^{ky} z(y) dy, provided that the integral on the right-hand side exists.

The results of renewal theory can be used in order to prove the existence of some “limit probabilities,” that express the equilibrium state of a random system with some specific properties. More precisely, we consider a stochastic process with a countable state space 1, 2,…, so that, with probability 1, there exists a random instant S1 such that the evolution of the process after that moment is identical to the evolution of the process starting from time t = 0. For example, for a Markov process starting from a recurrent state, S1 can be the first return time to the initial state.

From all we have seen before, we obtain that there exists a sequence of r.v.

0 = S_0 < S_1 < S_2 < S_3 < ... such that X_n = S_n − S_{n−1}, n ≥ 1, are i.i.d.

THEOREM 4.11.– Let P_k(t), k ∈ ℕ_+, be the probability that the stochastic process is in state k at time t. If μ = E(X_1) < ∞, then there exist non-negative constants p_k, k ∈ ℕ_+, such that

lim_{t→∞} P_k(t) = p_k, k ∈ ℕ_+.

PROOF.– Let q_k(t) be the probability that S_1 > t and that the process is in state k at time t. If F is the common distribution function of the r.v. X_n, n ∈ ℕ_+, then

Σ_{k≥1} q_k(t) = P(S_1 > t) = 1 − F(t).

It is easy to check that Pk(t) satisfies the renewal equation

P_k(t) = q_k(t) + ∫_0^t P_k(t − y) dF(y)

and, as q_k(t) ≤ 1 − F(t) with 1 − F a non-increasing integrable function, from Theorem 4.9 we obtain

lim_{t→∞} P_k(t) = (1/μ) ∫_0^∞ q_k(y) dy = p_k.

4.2. Waiting times

Let us consider a renewal process (Sn, n ∈ N) and, for an arbitrary fixed time t, let the r.v. X(t), Y (t), and L(t) be defined as follows:

1) X(t) = t − S_{N(t)−1}, called the spent waiting time or age;

2) Y(t) = S_{N(t)} − t, called the residual waiting time or residual lifetime;

3) L(t) = S_{N(t)} − S_{N(t)−1}, that is, the duration between the two successive renewals containing t.

PROPOSITION 4.12.– The distribution functions of these three waiting times are given by

1) P(X(t) ≤ x) = 1 − ∫_[0,t−x] (1 − F(t − y)) dU(y) for 0 ≤ x < t, and P(X(t) ≤ x) = 1 for x ≥ t;

2) P(Y(t) ≤ x) = 1 − ∫_[0,t] (1 − F(t + x − y)) dU(y), x ≥ 0;

3) P(L(t) ≤ x) = 1 − ∫_[0,t] (1 − F(max(x, t − y))) dU(y), x ≥ 0.

COROLLARY 4.13.– (Asymptotic behavior) We have

1) lim_{t→∞} P(X(t) ≤ x) = (1/μ) ∫_0^x (1 − F(y)) dy;

2) lim_{t→∞} P(Y(t) ≤ x) = (1/μ) ∫_0^x (1 − F(y)) dy;

3) lim_{t→∞} P(L(t) ≤ x) = (1/μ) ∫_0^x y dF(y).

EXAMPLE 4.14.– If F is an exponential distribution Exp(λ), we have

1) P(X(t) ≤ x) = 1 − e^{−λx} for 0 ≤ x < t, and P(X(t) ≤ x) = 1 for x ≥ t;

2) P(Y(t) ≤ x) = 1 − e^{−λx}, x ≥ 0;

3) P(L(t) ≤ x) = 1 − (1 + λ min(x, t)) e^{−λx}, x ≥ 0.

EXAMPLE 4.15.– (Waiting time paradox) Let us suppose that metros arrive in a station according to a Poisson process of parameter λ. A person arriving at the station at time t can reason as follows about the time Y(t) he has to wait for the arrival of the next metro:

a) As the exponential distribution “forgets the past,” the mean waiting time for the next metro, i.e. E(Y(t)), should be 1/λ.

b) The moment when the passenger arrives at the station is randomly chosen between two successive arrivals, so, for reasons of symmetry, the mean waiting time should be 1/(2λ).

On the one hand, the explanation of this paradox is that the r.v. L(t), that is, the interval between two successive arrivals containing the instant t, does not follow an exponential distribution, but the distribution given in Example 4.14. The mean value of this distribution tends to 2/λ and not to 1/λ, as was implicitly assumed in (b). On the other hand, the r.v. Y(t) is exponentially distributed (see Example 4.14) and we have E(Y(t)) = 1/λ, so the reasoning in (a) is the correct one.
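The two sides of the paradox can be checked by simulation (a sketch of my own, with arbitrary parameter values): the interval straddling t has mean near 2/λ, while the residual time has mean 1/λ.

```python
import random

random.seed(3)
lam, t, runs = 1.0, 50.0, 20000
sum_L = sum_Y = 0.0
for _ in range(runs):
    s = 0.0
    while True:
        x = random.expovariate(lam)
        if s + x > t:                # the interval (S_{n-1}, S_n] straddles t
            sum_L += x               # L(t): length of the straddling interval
            sum_Y += s + x - t       # Y(t): residual waiting time
            break
        s += x
mean_L, mean_Y = sum_L / runs, sum_Y / runs
# mean_L is near 2/lam = 2 (size-biased interval), mean_Y near 1/lam = 1
```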

The waiting time paradox is a particular case of the paradox of backward recurrence time. Let us assume that a component of a certain type is replaced immediately after it fails. In order to determine the mean duration of good functioning of such a component, we can proceed as follows: at time t we choose a sample of working components, for which we write down the duration of good functioning from the moment they started to work (before time t) up to the first failure. The mean value observed on this sample is considered as the mean duration of good functioning of the component. In fact, this reasoning is not correct, because we determine the mean value of L(t) instead of the mean value of the r.v. X_n, n ∈ ℕ_+, that we are interested in. The relation between these two mean values is given, as t → ∞, by the formula

lim_{t→∞} E(L(t)) = E(X_1²)/E(X_1),

that is a consequence of Corollary 4.13; for an exponential distribution of parameter λ, this limit is 2λ^{−1}.

Another problem that arises when we deal with renewal processes is to establish the distribution of the waiting time T up to the occurrence of the first interval between two successive renewals greater than or equal to a certain given value t_0. To be more specific, T = S_n if X_1, X_2, ..., X_n < t_0 and X_{n+1} ≥ t_0. Let us set T_1 = T + t_0 and let W be the distribution function of the r.v. T_1. Obviously, W(t) = 0 for all t < t_0; we consider the mutually exclusive events X_1 ≥ t_0 and X_1 = y < t_0. Note that T_1 = t_0 in the first case, whereas in the second case the conditional distribution of T_1, given that X_1 = y, is W(t − y). Consequently,

[4.12] W(t) = 1 − F(t_0) + ∫_0^{t_0} W(t − y) dF(y)

for t ≥ t0.

Relation [4.12] can be regarded as a renewal equation of the form

[4.13] W(t) = z(t) + ∫_0^t W(t − y) dL(y),

where

z(t) = 0 for t < t_0, z(t) = 1 − F(t_0) for t ≥ t_0,

and

L(y) = F(y) for y < t_0, L(y) = F(t_0) for y ≥ t_0.

In the particular case where F is an exponential distribution of parameter λ, the problem presented above is known as the gaps problem in a Poisson process.

In order to compute the moments of the distribution W, we consider the r.v. T_1 as the lifetime of a renewal process stopped with probability 1 − F(t_0). Note also that the r.v. T satisfies equation [4.13], with z(t) = 1 − F(t_0) for all t ∈ ℝ_+. Therefore,

E(T_1) = (1/(1 − F(t_0))) ∫_0^{t_0} (1 − F(y)) dy

and, in the Poisson case, we get

E(T) = (e^{λt_0} − 1)/λ − t_0

and

[4.14] E(T_1) = (e^{λt_0} − 1)/λ.

Similarly, we obtain

[4.15] Var(T_1) = (e^{2λt_0} − 2λt_0 e^{λt_0} − 1)/λ².
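The mean in [4.14] is easy to verify by simulating the gaps problem directly (a sketch of my own; function name and parameters are assumptions): wait for the first interarrival gap of length at least t_0 in a Poisson process.

```python
import random, math

def first_long_gap_end(lam, t0, rng):
    """T1 = T + t0: the epoch at which the first gap >= t0 ends."""
    s = 0.0
    while True:
        x = rng.expovariate(lam)
        if x >= t0:
            return s + t0            # gap long enough: done t0 later
        s += x                       # short gap: keep waiting

rng = random.Random(11)
lam, t0, runs = 1.0, 2.0, 40000
est = sum(first_long_gap_end(lam, t0, rng) for _ in range(runs)) / runs
exact = (math.exp(lam * t0) - 1) / lam    # value given by [4.14], about 6.389
```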

EXAMPLE 4.16.– Let us assume that, at a given point on a road, the successive cars moving in a single lane form a Poisson process. In order to cross the road at that given point, a pedestrian needs a time t_0 between two successive passages of the cars. So, he will cross the road during the first gap of length at least t_0 of the Poisson process.

If λt_0 > 1, then there exists a k ∈ (0, λ) such that λe^{(k−λ)t_0} = k and, for large values of t, we have (see Remark 4.10)

[4.16] image

Applying the Laplace transform in [4.12] we can obtain ([FEL 66], vol. II, p. 444)

[4.17] image

where x⁺ = max(0, x), x ∈ ℝ.

REMARK 4.17.– If the cars move in two parallel lanes and if for each lane the successive passages form a Poisson process of parameter λ, then we can consider that the successive passages of both lanes form a single Poisson process of parameter 2λ. In this case, the mean waiting time for crossing both lanes is (e^{4λt_0} − 1)/λ (see [4.14]).

If the road crossing is done with a stop between the two lines, then the mean waiting time for crossing the road will be image.

If λt_0 = 1, then the mean waiting time for crossing will be 30.24 t_0 with a stop between lanes and 53.6 t_0 without a stop.

4.3. Modified renewal processes

Let us consider the case where S_0 > 0 is independent of X_n, n ≥ 1, and has the distribution function F_0 different from F. The process (S_n, n ∈ ℕ) is called a modified renewal process or delayed renewal process. If we denote by U_0 the new renewal function, we can write

U_0(t) = Σ_{n≥0} (F_0 * F^{*n})(t) = ∫_[0,t] U(t − y) dF_0(y).

Therefore, U0 satisfies the renewal equation

[4.18] U_0(t) = F_0(t) + ∫_0^t U_0(t − y) dF(y).

For h > 0 we have

[4.19] U_0(t + h) − U_0(t) = ∫_[0,t+h] [U(t + h − y) − U(t − y)] dF_0(y), with the convention U(s) = 0 for s < 0.

If F is not arithmetic, then Blackwell’s renewal theorem implies

[4.20] lim_{t→∞} [U_0(t + h) − U_0(t)] = h/μ

and, consequently, the mean number of renewals in (t, t + h] tends to the constant h/μ as t → ∞. Note also that the limit renewal rate does not depend on the initial distribution F_0. We would like to find the initial distribution F_0 such that U_0(t) = t/μ for all t, i.e. such that the renewal rate is constant. From [4.18] we see that the only possible choice is

[4.21] F_0(t) = (1/μ) ∫_0^t (1 − F(y)) dy.

In this case, the process is called stationary modified renewal process.
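The stationarity produced by [4.21] can be checked numerically (a sketch under my own choice of F; names and values are assumptions): for Uniform(0, 1) lifetimes, μ = 1/2 and [4.21] gives F_0(t) = 2t − t² on [0, 1], so the delayed renewal function should equal t/μ for every t, not only asymptotically.

```python
import random, math

def stationary_count(t, rng):
    """Renewals in [0, t] for the delayed process: S0 has d.f. F0 of [4.21],
    later lifetimes are Uniform(0, 1), so F0(t) = 2t - t^2 on [0, 1]."""
    s = 1.0 - math.sqrt(1.0 - rng.random())   # inverse sampling of F0
    n = 0
    while s <= t:
        n += 1
        s += rng.random()
    return n

rng = random.Random(5)
mu, runs = 0.5, 40000
u0 = {t: sum(stationary_count(t, rng) for _ in range(runs)) / runs
      for t in (0.7, 3.0)}
# [4.21] makes the renewal rate exactly constant: U0(t) = t/mu for every t
```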

Another generalization can be obtained by assuming that the process can be stopped at each renewal time with probability q ∈ (0, 1), or that it continues with probability p = 1 − q, independently of the evolution of the process up to that moment. Let (T_n, n ≥ 1) be a sequence of i.i.d. r.v., with

P(T_n = 1) = p, P(T_n = 0) = q, n ≥ 1.

The sequences (T_n, n ≥ 1) and (X_n, n ≥ 1) are assumed to be independent. The r.v. T_n, n ≥ 1, can take the values 0 or 1, according to the decision made at the nth renewal to stop or to continue the renewal process. Let

G(t) = P(X_n ≤ t, T_n = 1) = pF(t), t ∈ ℝ_+.

Obviously, we have G(∞) = p.

Consequently, the evolution of the process that can be stopped is described by a sequence of i.i.d. r.v. that can take the value ∞, with distribution function G. If Xn = ∞, then the process is stopped at the nth renewal. The distribution function G is called improper (or defective), that is to say it has an atom at infinity.

The probability that the process will not be stopped during the first n renewals is p^n → 0 as n → ∞. Consequently, the process will be stopped with probability 1 after a finite number of steps. The mean number of steps up to the stopping time is

Σ_{n≥1} n q p^{n−1} = 1/q.

Note that the mean number of renewals on [0, ∞) is U(∞), so we get

U(∞) = Σ_{n≥0} G^{*n}(∞) = Σ_{n≥0} p^n = 1/q.

The same result can be obtained from equation [4.4], replacing F by G.

Note also that the probability that Sn ≤ t and that the process is stopped at the nth renewal is qG*n (t), which implies that the probability that the process will be stopped before t is

q Σ_{n≥0} G^{*n}(t) = qU(t).

Consequently, H(t) = qU(t) is the distribution function of the lifetime of the process. This function satisfies the renewal equation

H(t) = q + ∫_0^t H(t − y) dG(y).

4.4. Replacement models

Let us denote by X1, X2,... the lifetimes of identical components. When a component fails, it is immediately replaced by an identical one. So, the positive r.v. X1,X2,... are assumed to be independent, with the same distribution function F, whose mean value is denoted by μ.

The mean number of replacements during the interval [0, t] is U(t) and, from Theorem 4.7, we obtain

lim_{t→∞} [U(t + h) − U(t)] = h/μ,

which means that, for large values of t, the replacement rate is 1/μ. This implies that any replacement strategy that prescribes the component replacement prior to its failure will use more than 1/μ components per unit of time. Nevertheless, there are situations when a failure in service can cause unacceptable damages (railways, aeronautical equipment, nuclear power plants, etc.).

Let us consider the age-replacement policy that consists of replacing a component either when it reaches an age T or when it fails, if this moment is prior to time T.

For large values of t, we would expect the fraction of components replaced up to time t due to their failure to be F(T), and the corresponding fraction of replacements due to their age to be 1 − F(T). Let

X_n^T = min(X_n, T), n ≥ 1,

and thus

μ_T = E(X_n^T) = ∫_0^T (1 − F(y)) dy.

For large values of t, the replacement rate will be 1/μ_T > 1/μ.

Let Y1, Y2,... denote the durations between replacements due to component failures. It is clear that these random variables are independent and have the same distribution. We have

Y_1 = NT + X_{N+1},

where the r.v. N denotes the number of replacements due to age. Consequently,

P(N = n) = (1 − F(T))^n F(T), n ∈ ℕ,

and

E(Y_1) = T(1 − F(T))/F(T) + (1/F(T)) ∫_0^T y dF(y)

and we obtain

[4.22] μ_1(T) = E(Y_1) = (1/F(T)) ∫_0^T (1 − F(y)) dy = μ_T/F(T).

So, for large values of t, the replacement rate due to failures is

[4.23] 1/μ_1(T) = F(T)/∫_0^T (1 − F(y)) dy.

Relation [4.23] suggests that, for some distribution functions F, the rate 1/μ_1(T) could be lower than 1/μ. This fact is relevant in the cases where the occurrence of failures during service implies damages more expensive than the replacement costs.

It is worth noticing that, if F is an exponential distribution function, we have μ_1(T) = μ and the above strategy does not bring any improvement.
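For an increasing-failure-rate distribution, however, the gain is visible in simulation (a sketch of my own, not from the original text; distribution, age limit, and function name are assumptions): with Weibull lifetimes of shape 2 and age limit T = 0.5, the mean time between in-service failures μ_1(T) far exceeds the mean lifetime μ.

```python
import random, math

def time_between_failures(T, rng):
    """One Y_1 under age replacement: components are replaced at age T or
    at failure; lifetimes are Weibull with shape 2 (IFR), F(t) = 1 - exp(-t^2)."""
    total = 0.0
    while True:
        x = rng.weibullvariate(1.0, 2.0)
        if x < T:
            return total + x         # first failure in service
        total += T                   # preventive replacement at age T

rng = random.Random(41)
T, runs = 0.5, 40000
mu1 = sum(time_between_failures(T, rng) for _ in range(runs)) / runs
mu = math.gamma(1.5)                 # mean lifetime, about 0.886
# mu1 is near 2.085 (= mu_T / F(T)), so 1/mu1 is far below 1/mu
```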

Let S_T(t) = P(Y_1 > t) be the probability that there is no failure up to time t. We can easily see that S_T(t) has the expression

[4.24] S_T(t) = (1 − F(T))^n (1 − F(t − nT)), nT ≤ t < (n + 1)T, n ∈ ℕ.

If F is differentiable with density f, then [4.24] yields, for nT ≤ t < (n + 1)T,

S_T′(t) = −(1 − F(T))^n f(t − nT).

Consequently, S_T(t) ≥ 1 − F(t) for all t ≥ 0 if and only if

[4.25] f(t − nT)/(1 − F(t − nT)) ≤ f(t)/(1 − F(t)), nT ≤ t < (n + 1)T.

The ratio r(t) = f(t)/(1 − F(t)) is called the (instantaneous) failure rate, because r(t)Δt is approximately the probability that the component will fail during the interval (t, t + Δt], given that it worked up to time t. As t − nT ≤ t, relation [4.25] is satisfied for distributions with increasing failure rate r(t).

In conclusion, for distributions with increasing failure rate, the smaller T is, the larger the probability that there will be no failure up to time t.

In addition, [4.22] implies that μ = μ_1(∞). Therefore, in order to have 1/μ_1(T) smaller than 1/μ, it suffices that the function μ_1 is decreasing. We have

[4.26] μ_1′(T) = ((1 − F(T))/F²(T)) [F(T) − r(T) ∫_0^T (1 − F(y)) dy]

and

F(T) = ∫_0^T r(y)(1 − F(y)) dy ≤ r(T) ∫_0^T (1 − F(y)) dy

if F has an increasing failure rate; thus, relation [4.26] shows that μ_1(T) is non-increasing in this case.

Another replacement strategy is that of block replacement, where all the components of a certain type are replaced either simultaneously at times kT, k = 1, 2, ..., or individually when failures occur. On the one hand, the advantage of this strategy compared to the previous one is that it avoids the recording of individual failures. On the other hand, this strategy uses on average more components per time unit than the previous one, because there may be components that are replaced prior to their failure, before reaching age T.

Let us investigate now if by using this strategy we can decrease the average number of replacements due to failures. Let N_B(t) and N_A(t) denote the total number of replacements during the interval [0, t] when using the block-replacement policy and the age-replacement policy, respectively. The total number of replacements due to failures during the interval [0, t], under these two policies, will be denoted by N*_B(t) and N*_A(t), respectively.

THEOREM 4.18.– ([BAR 65], pp. 67-69) For all n ∈ ℕ we have

[4.27] P(N_B(t) ≥ n) ≥ P(N_A(t) ≥ n), t ≥ 0.

If F has an increasing failure rate, then

[4.28] P(N*_B(t) ≥ n) ≤ P(N*_A(t) ≥ n), t ≥ 0.

Inequality [4.27] implies that the average number of replacements during the interval [0, t] is greater for the block-replacement policy than for the age-replacement policy; similarly, if F has an increasing failure rate, relation [4.28] implies that the average number of replacements due to failures during the interval [0, t] is smaller for the block-replacement policy than for the age-replacement policy.

It can be proved ([BAR 65], p. 69) that equality holds in relation [4.28] for F exponential.

There are practical situations when replacements at fixed time moments are not appropriate (or even impossible), as is the case, for instance, for equipment working on random cycles, whose interruption for replacement purposes is not possible. In these cases we need a random-replacement policy. This policy is defined by a sequence of i.i.d. random variables (Y_n, n ≥ 1) of d.f. G. The sequence (Y_n, n ≥ 1) is assumed to be independent of the sequence (X_n, n ≥ 1) of intervals between the successive failures. If we let Z_n = min(X_n, Y_n), n ≥ 1, then Z_n, n ≥ 1, represent the durations between successive replacements, which define a renewal process with the distribution function

P(Z_n ≤ t) = 1 − (1 − F(t))(1 − G(t)), t ∈ ℝ_+,

and we have

E(Z_n) = ∫_0^∞ (1 − F(y))(1 − G(y)) dy.

If NA(t) is the total number of replacements during the interval [0, t], then applying Theorem 4.6 yields

lim_{t→∞} E(N_A(t))/t = [∫_0^∞ (1 − F(y))(1 − G(y)) dy]^{−1}.

For the number of replacements due to failures during the interval [0, t], denoted by N*(t), we can easily obtain

[4.29] lim_{t→∞} E(N*(t))/t = ∫_0^∞ (1 − G(y)) dF(y) / ∫_0^∞ (1 − F(y))(1 − G(y)) dy.

The distribution function K of the time elapsed between two successive failures satisfies the renewal equation

[4.30] K(t) = ∫_0^t (1 − G(y)) dF(y) + ∫_0^t K(t − y)(1 − F(y)) dG(y).

An important function from a practical point of view is the function denoted by R(x, t), representing the probability that a component operates without any replacement during the interval [t, t + x], given it started to work at time t. It can be proven ([BAR 65], p. 74) that R(x, t) satisfies the equation

image

and, from Theorem 4.11, we obtain

image

4.5. Renewal reward processes

Let us assume that when the nth renewal occurs, n ≥ 1, a reward Y_n is given, such that the sequence of random variables ((X_n, Y_n), n ≥ 1) is i.i.d.; note that Y_n may depend on X_n. The sum of the rewards up to time t is

R(t) = Σ_{n=1}^{N(t)} Y_n.

The asymptotic behavior of the r.v. R(t) is given in the following result.

THEOREM 4.19.– ([ROS 70]) If E(|Y_1|) < ∞ and E(X_1) < ∞, then

[4.31] lim_{t→∞} R(t)/t = E(Y_1)/E(X_1) a.s.

and

[4.32] lim_{t→∞} E(R(t))/t = E(Y_1)/E(X_1).

PROOF.– First note that

R(t)/t = (Σ_{n=1}^{N(t)} Y_n / N(t)) · (N(t)/t).

Second, applying the law of large numbers we have

lim_{t→∞} (1/N(t)) Σ_{n=1}^{N(t)} Y_n = E(Y_1) a.s.

and

lim_{t→∞} N(t)/t = 1/E(X_1) a.s.,

so we obtain [4.31].

As N(t) + 1 is a stopping time for the sequence (Yn, n ≥ 1), by Wald’s equation we have

E(Σ_{n=1}^{N(t)+1} Y_n) = E(N(t) + 1) E(Y_1).

Relation [4.32] will follow from Theorem 4.6 if we show that lim_{t→∞} E(Y_{N(t)+1})/t = 0. To this purpose, let

g(t) = E(Y_{N(t)+1}).

It can be easily checked that

g(t) = E(Y_1 1_{X_1 > t}) + ∫_0^t g(t − y) dF(y)

and thus g satisfies the renewal equation

[4.33] g(t) = h(t) + ∫_0^t g(t − y) dF(y),

where h(t) = E(Y_1 1_{X_1 > t}). As E(|Y_1|) < ∞, it follows that lim_{t→∞} h(t) = 0 and c = sup_{t≥0} |h(t)| < ∞.

Note that the solution of equation [4.33] is (see Theorem 4.2)

g(t) = ∫_[0,t] h(t − y) dU(y).

For any ε > 0, there exists T > 0 such that |h(t)| < ε for all t ≥ T. Consequently,

|g(t)| ≤ ε U(t − T) + c [U(t) − U(t − T)].

Therefore

limsup_{t→∞} |g(t)|/t ≤ ε/μ + c lim_{t→∞} [U(t) − U(t − T)]/t = ε/μ

and, as ε is arbitrary, we obtain

lim_{t→∞} g(t)/t = 0.

EXAMPLE 4.20.– The supply of a company with spare parts is done at the moment when there are N demands (N represents the capacity of one transport). It is assumed that the demands form a Poisson process of parameter λ. For every demand that is not met, the company incurs a penalty equal to c per time unit. Let us denote by X_1, X_2, … the intervals between successive replenishment moments (the replenishment time is considered to be zero) and by Y_1, Y_2, … the penalties corresponding to these intervals. It is clear that

the sequence of random variables ((Xn,Yn),n ≥ 1) is a renewal reward process. Therefore, for t large enough, the average penalty per time unit will be image (see Theorem 4.19).

First, note that

Y_1 = c Σ_{i=1}^{N−1} i τ_i,

where τ_i, 1 ≤ i ≤ N − 1, is the time between the ith and the (i + 1)th demand. As τ_i, 1 ≤ i ≤ N − 1, are exponentially distributed with parameter λ, it follows that

E(Y_1) = (c/λ) Σ_{i=1}^{N−1} i = cN(N − 1)/(2λ).

Second, as X_1 = τ_1 + τ_2 + … + τ_N, we have E(X_1) = N/λ. Thus, for t large enough, the total average penalty on the interval [0, t] is c(N − 1)t/2.
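The renewal-reward limit of this example can be checked directly (a simulation sketch of my own; the parameter values are arbitrary): the long-run penalty per unit of time should approach c(N − 1)/2.

```python
import random

random.seed(13)
lam, N, c, cycles = 2.0, 5, 3.0, 20000
total_penalty = total_time = 0.0
for _ in range(cycles):
    demands, s = [], 0.0
    for _ in range(N):
        s += random.expovariate(lam)     # successive demand epochs
        demands.append(s)
    total_time += s                      # the cycle ends at the N-th demand
    # each unmet demand is penalized c per unit time until replenishment at s
    total_penalty += c * sum(s - d for d in demands[:-1])
rate = total_penalty / total_time        # should approach c(N - 1)/2 = 6
```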

4.6. The risk problem of an insurance company

Let N(t) be the number of claims incurred by an insurance company over the time interval [0, t] and let us assume that it is a Poisson process. Assume, further, that the values of the successive claims Y1, Y2,... are i.i.d. r.v. with the same distribution function G. Let us denote by c the total cash inflow of the company per unit of time. If the initial capital of the company is z, then the capital at time t is

Z(t) = z + ct − Σ_{k=1}^{N(t)} Y_k.

We are interested in the probability that the company will not be ruined. More precisely, we will study the function

R(z) = P(z + ct − Σ_{k=1}^{N(t)} Y_k ≥ 0 for all t ≥ 0).

Obviously,

image

where X1 is the time of the first claim. Note that we have

[4.34] image

and

image

Consequently, the function R satisfies the equation

R(z) = ∫_0^∞ λe^{−λx} [∫_0^{z+cx} R(z + cx − y) dG(y)] dx

or, by change of variable t = z + cx,

R(z) = (λ/c) e^{λz/c} ∫_z^∞ e^{−λt/c} [∫_0^t R(t − y) dG(y)] dt.

Differentiating this relation we obtain

R′(z) = (λ/c) R(z) − (λ/c) ∫_0^z R(z − y) dG(y).

Integrating both sides with respect to z on the interval [0, w] we obtain

R(w) − R(0) = (λ/c) ∫_0^w R(z) dz − (λ/c) ∫_0^w ∫_0^z R(z − y) dG(y) dz.

Interchanging the orders of integration and the change of variable s = z — y yields

[4.35] R(w) = R(0) + (λ/c) ∫_0^w R(w − y)(1 − G(y)) dy.

If we let g(y) = (λ/c)(1 − G(y)), y ≥ 0, then from [4.35] we obtain the renewal equation

[4.36] R(z) = R(0) + ∫_0^z R(z − y) g(y) dy.

Equation [4.36] is a renewal equation corresponding to the improper density g. Note that this probability density is improper because ∫_0^∞ g(y) dy = λμ/c could have a value ≠ 1, where μ = E(Y_1).

If λμ > c, then R(z) = 0, since the average claim per unit of time exceeds the income rate c.

If λμ < c, then we have

R(0) = 1 − λμ/c

and

lim_{z→∞} e^{kz} (1 − R(z)) = (c − λμ)/(ckμ*),

where the constant k satisfies equation

(λ/c) ∫_0^∞ e^{ky} (1 − G(y)) dy = 1

and μ* is given by

μ* = (λ/c) ∫_0^∞ y e^{ky} (1 − G(y)) dy.
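The classical starting value of the survival function can be checked by simulation (a sketch of my own, with exponential claims and arbitrary parameters; the finite horizon is an approximation of "for all t"): with zero initial capital, the ruin probability is λμ/c for any claim distribution with mean μ.

```python
import random

def ruined(z, lam, mu, c, horizon, rng):
    """One path of Z(t) = z + ct - claims; ruin can only occur at claim epochs."""
    t, cap = 0.0, z
    while t < horizon:
        w = rng.expovariate(lam)         # time until the next claim
        t += w
        cap += c * w                     # premiums accrue at rate c
        cap -= rng.expovariate(1.0 / mu) # exponential claim with mean mu
        if cap < 0:
            return True
    return False

rng = random.Random(17)
lam, mu, c, runs, horizon = 1.0, 1.0, 2.0, 5000, 200.0
psi0 = sum(ruined(0.0, lam, mu, c, horizon, rng) for _ in range(runs)) / runs
# with zero initial capital the ruin probability is 1 - R(0) = lam*mu/c = 0.5
```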

4.7. Counter models

We consider a counter (or a recording mechanism) for the detection of the presence of a radio-active material. The impulses arrive at the counter, but some of them will not be recorded due to the counter inertia. This period of inertia is called locked time or dead time. Assume that an arriving impulse is recorded at time t = 0 and blocks the counter for a certain period. All the impulses that arrive during this period will not be recorded. The first impulse arriving after this locked period will be recorded, blocking in turn the counter, etc.

If the impulses that arrive at the counter during a locked period do not influence this locked time, then the counter is said to be of type I. On the contrary, if each arriving impulse has an influence on the locking time of the counter (even if it arrives at a moment when the counter is already locked), then the counter is said to be of type II.

4.7.1. Type I counters

Let us assume that the impulses arrive at the counter at times S0 = 0, S1, S2,… and that at time t = 0 the counter is not locked. Let Xn = Sn — Sn-1, n ≥ 1, be the successive interarrival times of the particles and Y0 , Y1, Y2,… be the successive locking times determined by the recorded particles. We make the following assumptions:

(i) The sequence (Xn,n ≥ 1) is a sequence of i.i.d. r.v., with common d.f. F,with F(0) = 0.

(ii) The sequence (Yn,n ≥ 1) is a sequence of i.i.d. r.v., with common d.f. G,with G(0) = 0.

(iii) The sequences of r.v. (Xn,n ≥ 1) and (Yn,n ≥ 1) are independent.

Let U(t) = E(N(t)). Obviously, N(t) + 1 is the number of particles arrived within the interval [0, t], t > 0.

Let us now introduce the sequence of non-negative integer numbers (n_j, j ∈ ℕ), defined by the relations

n_0 = 0, n_j = min{n > n_{j−1} | S_n − S_{n_{j−1}} ≥ Y_{j−1}}, j ≥ 1.

Consequently, the successive registering times are S_{n_j}, j ∈ ℕ, and the elapsed times between successive registrations are

Z_j = S_{n_j} − S_{n_{j−1}}, j ≥ 1.

From the definition of the sequence of r.v. (n_j, j ∈ ℕ) we see that (n_j − n_{j−1}, j ∈ ℕ_+) is a sequence of independent r.v. with the same distribution. Consequently, the sequence (Z_j, j ∈ ℕ_+) is also a sequence of independent r.v. with the same distribution. Let us also define T_n = Z_1 + … + Z_n, n ≥ 1, M(t) = max{n | T_n ≤ t}, and V(t) = E(M(t)). Obviously, M(t) is the number of impulses registered within the interval [0, t].

It is worth noticing here that the ratio B(t) = (1 + U(t))/(1 + V(t)) is an indicator of the counter’s quality.

Let us denote by W_n the time interval after the nth registration during which the counter is not locked, i.e. W_n = Z_n − Y_{n−1}. The sequence (W_n, n ≥ 1) is a sequence of random variables with the same distribution. The following result concerns their common distribution function K.

THEOREM 4.21.– ([PRA 65b], p. 177) We have

[4.37] K(x) = ∫_0^∞ [F(t + x) − ∫_0^t (1 − F(t + x − y)) dU(y)] dG(t).

PROOF.– For Y_0 = t (fixed) we have W_1 = S_{N(t)+1} − t, whose conditional distribution function, given Y_0 = t, is (see Proposition 4.12)

[4.38] P(W_1 ≤ x | Y_0 = t) = F(t + x) − ∫_0^t (1 − F(t + x − y)) dU(y).

Integrating this relation with respect to the distribution of Y0 we obtain [4.37].

From the definition of the variables Zj, j ≥ 1, we see that they can take only finite values. The following theorem presents some properties of these variables.

THEOREM 4.22.– ([PRA 65b], p. 178) We have:

[4.39] H(x) = P(Z_j ≤ x) = ∫_0^x [F(x) − ∫_0^t (1 − F(x − y)) dU(y)] dG(t);

[4.40] E(Z_j) = μ ∫_0^∞ (1 + U(t)) dG(t).

PROOF.– From the relation Z_j = W_j + Y_{j−1}, the following equalities follow

image

and relation [4.39] is proved.

On the one hand, from [4.39], for θ > 0 we obtain

image

where F̂(θ) = ∫_0^∞ e^{−θt} dF(t) is the Laplace transform of F. On the other hand, from the renewal equation [4.5] it follows

image

and

[4.41] image

Using the properties of the Laplace transform, from [4.41] we obtain

image

since Û(θ) = F̂(θ)/(1 − F̂(θ)).

In order to prove relation [4.40], note first that max(Y_0, X_1) ≤ Z_1 ≤ Y_0 + X_{n_1} and that E(Z_1) < ∞ if and only if μ, ν < ∞, where ν = E(Y_0). Second, from the definition of n_1 we have E(n_1 | Y_0 = t) = 1 + U(t). Using Wald's identity it follows that

E(Z_1 | Y_0 = t) = μ(1 + U(t)).

Therefore,

E(Z_1) = μ ∫_0^∞ (1 + U(t)) dG(t).

If μ < ∞, note that

[4.42] E(W_1) = μ ∫_0^∞ (1 + U(t)) dG(t) − ν.

Indeed,

E(W_1) = E(Z_1 − Y_0) = E(Z_1) − ν,

and we obtain [4.42] taking into account [4.40].

Let P(t) be the probability that at time t the counter is locked. The following result gives an expression of P and its asymptotic behavior.

THEOREM 4.23.– We have

[4.43] P(t) = 1 − G(t) + ∫_0^t (1 − G(t − y)) dV(y)

and, if ν < ∞, then

[4.44] lim_{t→∞} P(t) = ν/E(Z_1).

PROOF.– From the definition of P(t) we have

P(t) = Σ_{n≥0} P(T_n ≤ t, Y_n > t − T_n) = 1 − G(t) + ∫_0^t (1 − G(t − y)) dV(y), where T_0 = 0.

Since the function 1 − G(t), t ∈ ℝ_+, is non-negative, non-increasing, and integrable over ℝ_+ (because ν < ∞), relation [4.44] is obtained from [4.43] by applying Theorem 4.9.

In the following, we will illustrate the results presented above using some particular cases.

1. Poisson arrivals. In this case, F(x) = 1 − e^{−λx}, λ > 0, x ∈ ℝ_+, and we obtain from [4.38]

P(W_1 ≤ x | Y_0 = t) = 1 − e^{−λx}, x ≥ 0,

as well as

K(x) = 1 − e^{−λx}, x ≥ 0,

H(x) = P(Z_1 ≤ x) = ∫_0^x (1 − e^{−λ(x−t)}) dG(t)

and

E(Z_1) = ν + 1/λ.

The Laplace transform of the function H is

Ĥ(θ) = λĜ(θ)/(λ + θ).

Since V(t) = Σ_{n≥1} H^{*n}(t), we get

V̂(θ) = Ĥ(θ)/(1 − Ĥ(θ))

and, using the fact that F is exponential, we have

image

which yields

image

that is

image

Consequently,

image

and thus

image

Note that in this case U(t) = λt, which implies

image

as well as

image

and

image

To conclude, relation [4.44] becomes

lim_{t→∞} P(t) = ν/(ν + 1/λ) = λν/(1 + λν).
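This limiting locked probability can be checked by a direct simulation of a type I counter (a sketch of my own with Poisson arrivals and a constant dead time, so ν = d; all names and values are assumptions):

```python
import random

random.seed(19)
lam, d, horizon = 2.0, 1.5, 100000.0     # Poisson arrivals, constant dead time d
t, registrations, unlock_at = 0.0, 1, d  # the particle at t = 0 is recorded
while True:
    t += random.expovariate(lam)
    if t > horizon:
        break
    if t >= unlock_at:                   # counter free: particle recorded
        registrations += 1
        unlock_at = t + d                # type I: arrivals during the lock are ignored
locked_fraction = registrations * d / horizon
# nu = d here, so P(locked) tends to lam*nu/(1 + lam*nu) = 0.75
```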

2. Counters with constant locking times. Let us assume that the locking time has a constant value d; in this case, the distribution function G has the form

G(t) = 0 for t < d, G(t) = 1 for t ≥ d.

From relation [4.41] we obtain

image

which implies

image

and, since image, we obtain

image

Moreover, for μ < ∞, relation [4.42] leads to

E(W_1) = μ(1 + U(d)) − d,

and [4.43] and [4.44] become, respectively,

P(t) = 1 − G(t) + V(t) − V(t − d), with V(s) = 0 for s < 0,

and

lim_{t→∞} P(t) = d/(μ(1 + U(d))).

3. Counters with exponential locking times. By assuming that G(t) = 1 − e^{−βt}, β > 0, t ∈ ℝ_+, relation [4.41] becomes

image

and relation [4.42] leads to

E(W_1) = μ(1 + β ∫_0^∞ U(t) e^{−βt} dt) − 1/β.

4.7.2. Type II counters

Let us assume now that the particles arrive at the counter following a Poisson process with rate λ; let X_1, X_2, … be the successive interarrival times and Y_1, Y_2, … be the successive locking times determined by the arriving particles. The assumption of Poisson arrivals implies that the r.v. X_1, X_2, … are independent and exponentially distributed of parameter λ. We will also make the following assumptions:

(i) The sequence (Y_n, n ≥ 1) is a sequence of i.i.d. r.v. with common d.f. G.

(ii) The sequences of r.v. (Xn, n ≥ 1) and (Yn, n ≥ 1) are independent.

(iii) A particle arrives and it is recorded at time t = 0.

As opposed to type I counters, where the particles arriving during a locked period of the counter have no influence on the counter, for the type II counters a particle arriving when the counter is locked causes a new locking of the counter. For instance, the particle arrived at time t = 0 locks the counter for a time Y1; if there is another particle that arrives during the interval [0, Y1], which means that Y1 > X1, then this second particle prompts a new locking of the counter of duration Y2, such that the counter is locked for a period equal to max(Y1,X1 + Y2) .

As we did for type I counters, let us denote by P(t) the probability that the counter is locked at time t. We will prove relation

[4.45] P(t) = 1 − G(t) exp(−λ ∫_0^t (1 − G(y)) dy).

Note that 1 − P(t) = Σ_{n≥0} P[the counter is not locked at time t | n particles arrived during the interval (0, t)] × (λt)^n e^{−λt}/n!. If we work under the assumption that there were n particles that arrived during the time interval (0, t), then the common distribution of the arrival times is the same as the common distribution of the order statistics for a sample of size n from the uniform distribution over [0, t] (see Corollary 2.11). Consequently, the probability that all the locking times have ended by time t is

G(t) ((1/t) ∫_0^t G(y) dy)^n,

the factor G(t) accounting for the particle recorded at time t = 0.

Therefore,

1 − P(t) = G(t) e^{−λt} Σ_{n≥0} (λ ∫_0^t G(y) dy)^n / n! = G(t) exp(−λ ∫_0^t (1 − G(y)) dy).

Moreover, we note that

[4.46] lim_{t→∞} (1 − P(t)) = e^{−λν}, where ν = E(Y_1) < ∞.

If we denote by M(t) the mean number of particles registered during the interval (0, t], we will prove the relation

[4.47] M(t) = λ ∫_0^t G(s) exp(−λ ∫_0^s (1 − G(y)) dy) ds.

First, note that the probability that a particle arrives during the interval (t, t + Δt] is λΔt + o(Δt); second, the probability that this particle does not find the counter locked is 1 − P(t) + o(Δt). Consequently, the probability of a recording during the time interval (t, t + Δt] is λΔt(1 − P(t)) + o(Δt) and thus

M(t + Δt) = M(t) + λΔt(1 − P(t)) + o(Δt),

which yields the differential equation

[4.48] M′(t) = λ(1 − P(t)), M(0) = 0.

Integrating [4.48] we obtain [4.47].
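The limit in [4.46] can be checked by simulating a type II counter (a sketch of my own, with exponential locking times; every arrival, recorded or not, extends the locked period, so the counter behaves like a busy M/G/∞ system):

```python
import random, math

random.seed(23)
lam, beta, horizon = 1.0, 0.5, 200000.0  # locking times Exp(beta), nu = 1/beta = 2
t, unlock_at, free_time = 0.0, random.expovariate(beta), 0.0
while t < horizon:
    arrive = t + random.expovariate(lam) # next particle
    if unlock_at < arrive:               # counter frees up before the arrival
        free_time += arrive - unlock_at
    # type II: every particle prolongs the locked period
    unlock_at = max(unlock_at, arrive + random.expovariate(beta))
    t = arrive
free_fraction = free_time / horizon      # 1 - P(t) tends to exp(-lam*nu)
```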

4.8. Alternating renewal processes

Let us consider a system whose state space contains two states, denoted by 0 and 1; assume that these states represent the working state and the repair state of the system, respectively. Initially, the system is in state 0 and it stays there a random time X_1; then it switches to state 1 where it stays for a time Y_1; afterward it moves back to state 0 where it remains for a time X_2, etc. Let us make the following assumptions:

a) (Xn, n ≥ 1) is a sequence of i.i.d. r.v. with common d.f. F.

b) (Yn, n ≥ 1) is a sequence of i.i.d. r.v. with common d.f. G.

c) The sequences (Xn,n ≥ 1) and (Yn, n ≥ 1) are independent.

It is clear that (Xn + Yn, n ≥ 1) is a sequence of i.i.d. r.v. with common d.f. H = F ∗ G. If we assume that H is not arithmetic, then

[4.49] lim_{t→∞} P(t) = E(X1)/(E(X1) + E(Y1)),

where P(t) is the probability that the system is in state 0 at time t. To see that this relation holds, note first that

P(t) = P(the system is in state 0 at time t, X1 + Y1 > t) + P(the system is in state 0 at time t, X1 + Y1 ≤ t).

Second, we have

P(the system is in state 0 at time t, X1 + Y1 ≤ t) = ∫_0^t P(t − s) dH(s),

because the system regenerates at time X1 + Y1, and we obtain

P(the system is in state 0 at time t, X1 + Y1 > t) = P(X1 > t) = 1 − F(t),

since Y1 ≥ 0 a.s. Therefore, the function P(t) satisfies the renewal equation

P(t) = 1 − F(t) + ∫_0^t P(t − s) dH(s),

whose solution is

[4.50] P(t) = 1 − F(t) + ∫_0^t (1 − F(t − x)) dU(x),

with U(x) = Σ_{n≥1} H^{*n}(x). Applying Theorem 4.9 to [4.50], we get [4.49].
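
The limit [4.49] can be checked by simulating an alternating renewal process over a long horizon. A minimal sketch, where the uniform working and repair distributions are arbitrary illustrative choices:

```python
import random

def availability(sample_x, sample_y, horizon, rng):
    """Fraction of [0, horizon] that the alternating process spends in
    state 0 (working); by [4.49], it approaches E(X1)/(E(X1) + E(Y1))."""
    t, up = 0.0, 0.0
    while t < horizon:
        x = sample_x(rng)                 # working period
        up += min(x, horizon - t)         # count only time inside the window
        t += x
        if t >= horizon:
            break
        t += sample_y(rng)                # repair period
    return up / horizon

rng = random.Random(1)
# working times Uniform(0, 4) (mean 2), repair times Uniform(0, 2) (mean 1)
est = availability(lambda r: r.uniform(0, 4), lambda r: r.uniform(0, 2),
                   horizon=200000.0, rng=rng)
```

With these means the estimate should approach 2/(2 + 1) = 2/3.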

Let us present now how this result can be used for estimating the parameter λ of the Poisson process describing the particles’ arrival at type I and type II counters. For this purpose, let us note that the succession of locked times Bn and non-locked times Ln, n = 1, 2, …, forms an alternating renewal process, whose second component Ln has an exponential distribution of parameter λ (due to the memoryless property of the exponential distribution). If N_Z(t) denotes the number of renewals of the process Zn = Bn + Ln, n ≥ 1, during the interval [0, t], we have (see Theorem 4.5)

[4.51] lim_{t→∞} N_Z(t)/t = 1/E(Z1) = 1/(E(Y1) + 1/λ) a.s.,

where Y1 is the locking time generated by the first particle.

Thus, for large values of t, relation [4.51] allows us to estimate the parameter λ as follows:

λ̂ = N_Z(t)/(t − N_Z(t) E(Y1)).
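
This estimation procedure can be tried on a simulated type I counter. In the sketch below all numerical values are arbitrary: particles arrive as a Poisson process of (unknown) rate λ, each registered particle locks the counter for an exponential time of known mean E(Y1), particles arriving while the counter is locked are lost, and λ is recovered from the number of registrations via the estimator λ̂ = N_Z(t)/(t − N_Z(t) E(Y1)) suggested by [4.51]:

```python
import random

def registrations_type1(lam, mean_lock, t_end, rng):
    """Type I counter: Poisson(lam) arrivals; each registered particle
    locks the counter for an Exp(1/mean_lock) time; particles arriving
    while the counter is locked are lost and do not re-lock it.
    Returns the number of registrations in (0, t_end]."""
    t, unlocked_at, n_reg = 0.0, 0.0, 0
    while True:
        t += rng.expovariate(lam)
        if t > t_end:
            return n_reg
        if t >= unlocked_at:              # counter free: register and lock
            n_reg += 1
            unlocked_at = t + rng.expovariate(1.0 / mean_lock)

rng = random.Random(2)
lam, mean_lock, t_end = 2.0, 0.5, 100000.0
n_reg = registrations_type1(lam, mean_lock, t_end, rng)
lam_hat = n_reg / (t_end - n_reg * mean_lock)   # estimator based on [4.51]
```

For this run λ̂ should be close to the true value λ = 2.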

4.9. Superposition of renewal processes

For every n ≥ 1, let us consider kn ∈ ℕ₊ sequences of i.i.d. r.v. Additionally, let us assume that these kn sequences are independent and have the distribution functions Fni, i = 1, 2, …, kn. If we denote by Nni(t), i = 1, 2, …, kn, the number of renewals within the interval [0, t] for the ith sequence, then the superposition Nn of these renewal processes is defined by

Nn(t) = Σ_{i=1}^{kn} Nni(t), t ∈ ℝ₊,

which is an r.v. that counts the total number of renewals up to time t over all the kn renewal processes.

Note that, in general, there does not exist a renewal process whose number of renewals during the interval [0,t] is Nn(t), because, in the general case, the intervals between two successive renewals when considering all the kn processes do not follow the same distribution and are not independent. Nevertheless, in some particular cases that we will present in the following two theorems, Nn(t) is a renewal process.

Let us consider N1(t) and N2(t), t ∈ ℝ₊, two independent renewal processes with the same d.f. F of the intervals between two successive renewals.

THEOREM 4.24.– If image and if N(t) = N1(t) + N2(t), t ∈ ℝ₊, is a renewal process, then N1(t), N2(t), and N(t) are Poisson processes.

PROOF.– Let us denote by H the distribution function of the intervals between two successive renewals for the renewal process N(t). Then

[4.52] 1 − H(t) = (1 − F(t))^2.

Let Y1(t), Y2(t), and Y (t), t ∈ R+, be the processes defined in section 4.2, corresponding to the processes N1(t), N2(t), and N(t), respectively. Since Y (t) = min(Y1(t), Y2(t)), we have

P(Y(t) > x) = P(Y1(t) > x) P(Y2(t) > x).

From Corollary 4.13, for t → ∞, we obtain

[4.53] (1/μ_H) ∫_x^∞ (1 − H(u)) du = ((1/μ) ∫_x^∞ (1 − F(u)) du)^2,

where μ = ∫_0^∞ (1 − F(u)) du and μ_H = ∫_0^∞ (1 − H(u)) du. Differentiating2 [4.53] with respect to x we obtain

(1 − H(x))/μ_H = (2/μ^2) (1 − F(x)) ∫_x^∞ (1 − F(u)) du,

so, taking into account [4.52],

[4.54] (μ^2/(2μ_H)) (1 − F(x)) = ∫_x^∞ (1 − F(u)) du.

If we set G(x) = 1 − F(x) and we differentiate [4.54], we obtain the differential equation

[4.55] G′(x) = −(2μ_H/μ^2) G(x),

with the initial condition G(0) = 1. The solution of [4.55] is

G(x) = e^{−λx}, x ∈ ℝ₊,

where λ = 2μ_H/μ^2; consequently, F(t) = 1 − e^{−λt} and, by Remark 4.1, N1(t), N2(t), and N(t) are Poisson processes.

Let us assume now that (N1(t), t ∈ ℝ₊) is a Poisson process of parameter λ and that (N2(t), t ∈ ℝ₊) is a renewal process independent of (N1(t), t ∈ ℝ₊), with the distribution function of the time between two successive renewals denoted by G.

THEOREM 4.25.– If image and if N(t) = N1(t) + N2(t), t ∈ ℝ₊, is a renewal process, then (N2(t), t ∈ ℝ₊) and (N(t), t ∈ ℝ₊) are Poisson processes.

PROOF.– Let H be the distribution function of the inter-renewal times of the process (N(t), timage+). We have

[4.56] 1 − H(t) = e^{−λt}(1 − G(t))

and, following the same procedure as that used to obtain relation [4.53], we get

[4.57] (1/ν) ∫_x^∞ (1 − H(u)) du = e^{−λx} (1/μ) ∫_x^∞ (1 − G(u)) du,

where ν = ∫_0^∞ (1 − H(u)) du and μ = ∫_0^∞ (1 − G(u)) du. Differentiating relation [4.57] yields

(1 − H(x))/ν = λ e^{−λx} (1/μ) ∫_x^∞ (1 − G(u)) du + e^{−λx} (1 − G(x))/μ

and, taking into account [4.56], it follows that

[4.58] (μ/ν − 1)(1 − G(x)) = λ ∫_x^∞ (1 − G(u)) du.

If we let λ1 = λ/(μ/ν − 1) and F(x) = 1 − G(x), x ∈ ℝ₊, differentiating relation [4.58] we obtain the differential equation

F′(x) = −λ1 F(x), F(0) = 1,

whose solution is F(x) = e^{−λ1 x}, x ∈ ℝ₊, and we finally obtain

G(x) = 1 − e^{−λ1 x} and H(x) = 1 − e^{−(λ + λ1) x}, x ∈ ℝ₊,

i.e. (N2(t), t ∈ ℝ₊) and (N(t), t ∈ ℝ₊) are Poisson processes of parameters λ1 and λ + λ1, respectively.
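
The direct counterpart of this result — the superposition of two independent Poisson processes is again a Poisson process, of parameter equal to the sum of the two parameters — is elementary and easy to check numerically. In the sketch below (rates and horizon chosen arbitrarily), the mean and the variance of the merged counts should both be close to the product of the total rate and the horizon, as a Poisson distribution requires:

```python
import random

def poisson_count(lam, t_end, rng):
    """Number of arrivals of a Poisson process of rate lam in (0, t_end]."""
    n, t = 0, 0.0
    while True:
        t += rng.expovariate(lam)
        if t > t_end:
            return n
        n += 1

rng = random.Random(5)
lam1, lam2, t_end, reps = 1.0, 0.5, 10.0, 20000
counts = [poisson_count(lam1, t_end, rng) + poisson_count(lam2, t_end, rng)
          for _ in range(reps)]
mean = sum(counts) / reps
var = sum((c - mean) ** 2 for c in counts) / reps
# For a Poisson((lam1 + lam2) * t_end) count, mean and variance are both 15
```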

The superposition of a large number of renewal processes can be estimated by a Poisson process, under some conditions that are detailed in the next result.

THEOREM 4.26.– ([KAR 75], p. 223) For any n ∈ ℕ₊, let us consider the independent renewal processes (Nni(t), t ∈ ℝ₊), with the corresponding distribution functions Fni, i = 1, 2, …, kn. We assume that

lim_{n→∞} max_{1≤i≤kn} Fni(t) = 0 for all t ≥ 0.

Then

lim_{n→∞} P(Nn(t) = j) = e^{−λt}(λt)^j/j!, j ∈ ℕ, t ≥ 0,

if and only if

lim_{n→∞} Σ_{i=1}^{kn} Fni(t) = λt for all t ≥ 0.

Note that the hypotheses of this theorem are satisfied in the particular case Fni(t) = F(t/n), kn = n, where F is a distribution function with the properties F(0) = 0 and F′(0) = λ > 0. Indeed,

lim_{n→∞} max_{1≤i≤n} Fni(t) = lim_{n→∞} F(t/n) = 0

and

lim_{n→∞} Σ_{i=1}^{n} Fni(t) = lim_{n→∞} n F(t/n) = lim_{n→∞} ((F(t/n) − F(0))/(t/n)) t = F′(0) t = λt.
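
This particular case is easy to explore numerically. The sketch below takes F uniform over [0, 1], so that λ = F′(0) = 1, together with illustrative values of n and t; it simulates the superposition Nn(t) of n renewal processes with inter-renewal d.f. F(t/n), i.e. inter-renewal times uniform on [0, n], and compares the empirical counts with the Poisson distribution of parameter λt:

```python
import math
import random

def superposed_count(n, t_end, rng):
    """Superposition N_n(t_end) of n independent renewal processes whose
    inter-renewal times are Uniform(0, n), i.e. F_ni(t) = F(t/n) with
    F = Uniform(0, 1); here k_n = n and lambda = F'(0) = 1."""
    total = 0
    for _ in range(n):
        t = rng.uniform(0, n)             # first renewal of process i
        while t <= t_end:
            total += 1
            t += rng.uniform(0, n)        # next renewal of process i
    return total

rng = random.Random(3)
n, t_end, reps = 200, 1.0, 10000
counts = [superposed_count(n, t_end, rng) for _ in range(reps)]
mean = sum(counts) / reps              # should be close to lambda * t_end = 1
p0 = counts.count(0) / reps            # should be close to exp(-1)
```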

EXAMPLE 4.27.– Let us consider a system whose components satisfy the following assumptions:

1)The uninterrupted working times of the components are independent r.v.

2)The failure of a component causes the failure of the whole system (series system).

3)Each component that breaks down is instantaneously replaced.

We are interested in the distribution of the time between two successive failures of the system. If the number of components is large enough and the distributions of the uninterrupted working times of the components satisfy the conditions of Theorem 4.26, then we can consider that the successive failures of the system form a Poisson process.

4.10. Regenerative processes

Let us consider a stochastic process (X(t), t ∈ ℝ₊) taking values in ℝ (we can also consider the case image) and a renewal process (Sn, n ∈ ℕ). We suppose that at the instants Sn, the process (X(t), t ∈ ℝ₊) regenerates (probabilistically, it starts anew), i.e. it behaves exactly as after time t = 0. To be more specific, we assume the following:

1) After each Sn, the process (X(t), t ∈ ℝ₊) has the same finite-dimensional distributions as those of the entire process, i.e. for all n, k ∈ ℕ₊ and 0 < t1 < ⋯ < tk, ti ∈ ℝ₊, 1 ≤ i ≤ k, the random vectors (X(Sn + ti), 1 ≤ i ≤ k) and (X(ti), 1 ≤ i ≤ k) have the same distribution.

2) For each Sn, the process (X(t + Sn), t ∈ ℝ₊) is independent of (S0, S1, ..., Sn).

The part (X(t), t ∈ [Sn, Sn+1)) of the sample path is considered as the nth cycle of the process.

EXAMPLE 4.28.– Let us consider a Markov chain (Xn, n ∈ ℕ) with values in ℕ. For any j ∈ ℕ, the return times to state j are regeneration moments. If the chain is recurrent, then the renewal process does not terminate, i.e. there are infinitely many regeneration moments.

Let F be the distribution function of S1 (S0 = 0) and, for A ∈ 𝔅(ℝ), let Z(t) = P(X(t) ∈ A) and K(t, A) = P(X(t) ∈ A, S1 > t). We have

[4.59] Z(t) = P(X(t) ∈ A, S1 > t) + P(X(t) ∈ A, S1 ≤ t).

The second term of the right-hand member of [4.59] becomes

P(X(t) ∈ A, S1 ≤ t) = ∫_0^t P(X(t) ∈ A | S1 = s) dF(s) = ∫_0^t Z(t − s) dF(s),

using the regeneration of the process at time S1. Thus [4.59] shows that Z(t) satisfies the renewal equation

[4.60] Z(t) = K(t, A) + ∫_0^t Z(t − s) dF(s),

with the solution

Z(t) = K(t, A) + ∫_0^t K(t − s, A) dU(s), where U(s) = Σ_{n≥1} F^{*n}(s).

THEOREM 4.29.– ([SMI 55]) Let A ∈ 𝔅(ℝ) be fixed and assume that the function K(·, A) is directly Riemann integrable. Then, if we set μ = E(S1), we have

lim_{t→∞} P(X(t) ∈ A) = (1/μ) ∫_0^∞ K(s, A) ds,

with the convention 1/∞ = 0.

PROOF.– We apply Theorem 4.9 to the renewal equation [4.60] and obtain

lim_{t→∞} Z(t) = (1/μ) ∫_0^∞ K(s, A) ds.

EXAMPLE 4.30.– (M/G/1 queue) Assume that the arrival process is Poisson with parameter (intensity) λ. The service discipline is “first come, first served” and the service time is a non-negative r.v. with distribution function G. A client arriving at time t = 0 initiates a busy period that continues up to the moment when this first client and all his descendants are served, i.e. up to the moment when there is no client left in the queue and the service station can take a break. The next client arrives after a random time exponentially distributed with parameter λ, and the service process starts again. If Q(t) denotes the number of clients in the system at time t, then (Q(t), t ∈ ℝ₊) is a regenerative process, where the regeneration epochs (Sn) are the moments when busy periods begin. From Theorem 4.29 we see that

lim_{t→∞} P(Q(t) = 0) = E(time the system is empty during a cycle)/E(length of a cycle).

The length of a cycle is the sum of the busy period (BP) and the break time (exponential of parameter λ); the time spent in state 0 during a cycle is just the break time. Therefore, we obtain

lim_{t→∞} P(Q(t) = 0) = (1/λ)/(E(BP) + 1/λ) = 1/(1 + λ E(BP)).
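
This limit can be compared with a simulation of the queue itself. The sketch below runs an M/M/1 instance of the M/G/1 model (illustrative values: λ = 0.5, exponential service times of mean 1) and measures the long-run fraction of time the system is empty; this fraction equals the mean break time divided by the mean cycle length, which for these parameters is 1 − λ · E(service time) = 0.5:

```python
import random

def idle_fraction(lam, sample_service, t_end, rng):
    """M/G/1 queue, FCFS: fraction of [0, t_end] with no client present.
    Departure recursion: d_i = max(a_i, d_{i-1}) + s_i; the system is
    empty exactly on the gaps between a departure that clears the queue
    and the next arrival."""
    t_arr, d_prev, idle = 0.0, 0.0, 0.0
    while True:
        t_arr += rng.expovariate(lam)          # next arrival
        if t_arr > t_end:
            idle += max(0.0, t_end - d_prev)   # idle tail of the window
            return idle / t_end
        idle += max(0.0, t_arr - d_prev)       # idle gap before this arrival
        d_prev = max(t_arr, d_prev) + sample_service(rng)

rng = random.Random(4)
lam, mean_service = 0.5, 1.0                   # traffic intensity rho = 0.5
frac = idle_fraction(lam, lambda r: r.expovariate(1.0 / mean_service),
                     t_end=200000.0, rng=rng)
```

The measured fraction should be close to 1 − ρ = 0.5.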


1. A function F: ℝ → ℝ, with F(t) = 0 for all t < 0, is said to be arithmetic if there exists a constant d > 0 such that F is constant on any open interval of the type (nd, (n + 1)d) (lattice of period d).

2. In order to simplify our presentation, we assume that F is differentiable, but the result holds also in the general case.
