4
MULTIPLE RANDOM VARIABLES

4.1 INTRODUCTION

In many experiments an observation is expressible, not as a single numerical quantity, but as a family of several separate numerical quantities. Thus, for example, if a pair of distinguishable dice is tossed, the outcome is a pair (x, y), where x denotes the face value on the first die, and y, the face value on the second die. Similarly, to record the height and weight of every person in a certain community we need a pair (x, y), where the components represent, respectively, the height and weight of a particular individual. To be able to describe such experiments mathematically we must study multidimensional random variables.

In Section 4.2 we introduce the basic notations involved and study joint, marginal, and conditional distributions. In Section 4.3 we examine independent random variables and investigate some consequences of independence. Section 4.4 deals with functions of several random variables and their induced distributions. Section 4.5 considers moments, covariance, and correlation, and in Section 4.6 we study conditional expectation. The last section deals with ordered observations.

4.2 MULTIPLE RANDOM VARIABLES

In this section we study multidimensional RVs. Let (Ω, images, P) be a fixed but otherwise arbitrary probability space.

From now on we will restrict attention mainly to two-dimensional random variables. The discussion for the n-dimensional case is similar except where indicated. The development follows closely that of the one-dimensional case.


Fig. 1

Then F satisfies both (i) and (ii) above. However, F is not a DF since

images

Let images and images. We have

images

for all pairs (x1,y1), (x2,y2) with images (see Fig. 2).

The “if” part of the theorem has already been established. The “only if” part will not be proved here (see Tucker [114, p. 26]).

Theorem 2 can be generalized to the n-dimensional case in the following manner.


Fig. 2 images

We restrict ourselves here to two-dimensional RVs of the discrete or the continuous type, which we now define.


Fig. 3 images.

Let (X, Y) be a two-dimensional RV with PMF

P{X = xi, Y = yj} = pij,  i, j = 1, 2, ….

Then

(6) P{X = xi} = Σj pij,  i = 1, 2, …,

and

(7) P{Y = yj} = Σi pij,  j = 1, 2, ….

Let us write

(8) pi· = Σj pij  and  p·j = Σi pij.

Then pi· ≥ 0 and Σi pi· = 1, p·j ≥ 0 and Σj p·j = 1, and {pi·}, {p·j} represent PMFs.
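The marginal computation in (8) is mechanical enough to sketch in code. The joint PMF below is invented purely for illustration (it is not an example from the text):

```python
# Sketch: marginal PMFs p_{i.} = sum_j p_{ij} and p_{.j} = sum_i p_{ij}
# computed from a (made-up) finite joint PMF stored as a dict.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def marginal_x(joint):
    """p_{i.}: sum the joint PMF over the second coordinate."""
    out = {}
    for (i, j), p in joint.items():
        out[i] = out.get(i, 0.0) + p
    return out

def marginal_y(joint):
    """p_{.j}: sum the joint PMF over the first coordinate."""
    out = {}
    for (i, j), p in joint.items():
        out[j] = out.get(j, 0.0) + p
    return out

px = marginal_x(joint)   # {0: 0.3, 1: 0.7}
py = marginal_y(joint)   # {0: 0.4, 1: 0.6}
```

Each marginal is nonnegative and sums to 1, as (8) requires.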

If (X, Y) is an RV of the continuous type with PDF f, then

f1(x) = ∫ from −∞ to ∞ f(x, y) dy

and

f2(y) = ∫ from −∞ to ∞ f(x, y) dx

satisfy f1(x) ≥ 0, f2(y) ≥ 0, and ∫ from −∞ to ∞ f1(x) dx = ∫ from −∞ to ∞ f2(y) dy = 1. It follows that f1(x) and f2(y) are PDFs.


Fig. 4 images.

In general, given a DF F(x1, x2, …, xn) of an n-dimensional RV (X1, X2, …, Xn), one can obtain any k-dimensional (1 ≤ k < n) marginal DF from it. Thus the marginal DF of (Xi1, Xi2, …, Xik), where 1 ≤ i1 < i2 < ⋯ < ik ≤ n, is given by

F(xi1, xi2, …, xik) = lim F(x1, x2, …, xn), the limit being taken as xj → ∞ for every j not in {i1, i2, …, ik}.

We now consider the concept of conditional distributions. Let (X, Y) be an RV of the discrete type with PMF P{X = xi, Y = yj} = pij. The marginal PMFs are pi· and p·j. Recall that, if A and B are events with P(B) > 0, the conditional probability of A, given B, is defined by

P{A | B} = P(A ∩ B)/P(B).

Take A = {X = xi} and B = {Y = yj}, and assume that p·j > 0. Then

P{X = xi | Y = yj} = pij/p·j.

For fixed j, the function P{X = xi | Y = yj} ≥ 0 and Σi P{X = xi | Y = yj} = 1. Thus P{X = xi | Y = yj}, for fixed j, defines a PMF.
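The conditional PMF is just the appropriate slice of the joint table rescaled by the marginal. A minimal sketch, again on a made-up joint PMF:

```python
# Sketch: conditional PMF P{X = x | Y = y} = p_{xy} / p_{.y}.
# The joint table is illustrative only.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional_x_given_y(joint, y):
    # p_{.y}: marginal probability of the conditioning event {Y = y}
    p_y = sum(p for (i, j), p in joint.items() if j == y)
    if p_y == 0:
        raise ValueError("conditioning event has probability 0")
    # rescale the slice j == y so it sums to 1
    return {i: p / p_y for (i, j), p in joint.items() if j == y}

cond = conditional_x_given_y(joint, 1)   # {0: 1/3, 1: 2/3}
```

For fixed y the result is itself a PMF: nonnegative and summing to 1.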

Next suppose that (X, Y) is an RV of the continuous type with joint PDF f. Since P{X = x, Y = y} = 0 for any x, y, a conditional probability given X = x or given Y = y is not defined as a ratio. Let ε > 0, and suppose that P{y − ε < Y ≤ y + ε} > 0. For every x and every interval (y − ε, y + ε], consider the conditional probability of the event {X ≤ x}, given that Y ∈ (y − ε, y + ε]. We have

P{X ≤ x | y − ε < Y ≤ y + ε} = P{X ≤ x, y − ε < Y ≤ y + ε}/P{y − ε < Y ≤ y + ε}.

For any fixed interval (y − ε, y + ε], the above expression defines the conditional DF of X, given that Y ∈ (y − ε, y + ε], provided that P{y − ε < Y ≤ y + ε} > 0. We shall be interested in the case where the limit

lim(ε → 0+) P{X ≤ x | y − ε < Y ≤ y + ε}

exists.

Suppose that (X, Y) is an RV of the continuous type with PDF f. At every point (x, y) where f is continuous and the marginal PDF f2(y) > 0 and is continuous, we have

P{X ≤ x | y − ε < Y ≤ y + ε} = [∫ from y−ε to y+ε (∫ from −∞ to x f(u, v) du) dv] / [∫ from y−ε to y+ε f2(v) dv].

Dividing numerator and denominator by 2ε and passing to the limit as ε → 0+, we have

lim(ε → 0+) P{X ≤ x | y − ε < Y ≤ y + ε} = ∫ from −∞ to x [f(u, y)/f2(y)] du.

It follows that there exists a conditional PDF of X, given Y = y, that is expressed by

fX|Y(x | y) = f(x, y)/f2(y).

We have thus proved the following theorem.

It is clear that similar definitions may be made for the conditional DF and conditional PDF of the RV Y, given X = x, and an analog of Theorem 6 holds.
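As a numerical sketch of the conditional PDF fX|Y(x | y) = f(x, y)/f2(y), take the density f(x, y) = x + y on the unit square (our own illustrative choice, not an example from the text), with the marginal computed by the midpoint rule:

```python
# Sketch: conditional PDF f(x | y) = f(x, y) / f2(y) for the illustrative
# joint density f(x, y) = x + y on the unit square.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def f2(y, n=2000):
    # marginal PDF of Y: integrate f(x, y) over x in [0, 1] (midpoint rule)
    h = 1.0 / n
    return sum(f((k + 0.5) * h, y) for k in range(n)) * h

def cond_pdf(x, y):
    return f(x, y) / f2(y)

# closed form here is (x + y) / (y + 1/2)
val = cond_pdf(0.25, 0.5)                                     # -> 0.75
total = sum(cond_pdf((k + 0.5) / 200, 0.5) for k in range(200)) / 200
```

For fixed y the conditional PDF integrates to 1 in x, which `total` confirms numerically.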

In the general case, let (X1, X2, …, Xn) be an n-dimensional RV of the continuous type with PDF images. Also, let {i1 < i2 < images < ik, j1 < j2 < images < jl} be a subset of {1, 2, …, n}. Then

(17)images

provided that the denominator exceeds 0. Here images is the joint marginal PDF of images. The conditional densities are obtained in a similar manner.

The case in which (X1, X2, … , Xn) is of the discrete type is similarly treated.

We conclude this section with a discussion of a technique called truncation. We consider several types of truncation, each with a different objective. In probabilistic modeling we use truncated distributions when sampling from an incomplete population.

If X is a discrete RV with PMF P{X = xj} = pj, the truncated distribution of X is given by

(18) P{X = xj | X ∈ T} = pj / P{X ∈ T},  xj ∈ T,

provided that P{X ∈ T} > 0. If X is of the continuous type with PDF f, then

(19) P{X ≤ x | X ∈ T} = [∫ over (−∞, x] ∩ T of f(u) du] / [∫ over T of f(u) du],

provided that ∫ over T of f(u) du > 0. The PDF of the truncated distribution is given by

(20) fT(x) = f(x) / ∫ over T of f(u) du  if x ∈ T,  and = 0 otherwise.

Here T is not necessarily a bounded set of real numbers. If we write Y for the RV with the distribution function (19), then Y has support T.
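A sketch of the truncated PMF, using a zero-truncated Poisson as the incomplete-population example (the choice of Poisson and of the rate are ours, for illustration only):

```python
# Sketch: truncated PMF p_T(x) = p(x) / P{X in T}, illustrated with a
# zero-truncated Poisson (the value 0 is assumed unobservable).
import math

lam = 2.0   # arbitrary illustrative rate

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def truncated_pmf(k, T):
    if k not in T:
        return 0.0
    pT = sum(poisson_pmf(j) for j in T)   # P{X in T}
    return poisson_pmf(k) / pT

T = range(1, 50)            # truncate away the unobservable value 0
p1 = truncated_pmf(1, T)    # close to poisson_pmf(1) / (1 - e^{-lam})
```

The truncated PMF is supported on T and sums to 1 over T, in agreement with (18).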

The second type of truncation is very useful in probability limit theory, especially when the DF F in question does not have a finite mean. Let a < b be finite real numbers. Define the RV X* by

X* = X if a ≤ X ≤ b, and X* = 0 otherwise.

This method produces an RV for which |X*| ≤ max(|a|, |b|), so that X* has moments of all orders. The special case when a = −c and b = c (c > 0) is quite useful in probability limit theory when we wish to approximate X through bounded RVs. We say that Xc is X truncated at c if Xc = X for |X| ≤ c and Xc = 0 otherwise. Then |Xc| ≤ c. Moreover,

P{Xc ≠ X} = P{|X| > c},

so that c can be selected sufficiently large to make P{Xc ≠ X} arbitrarily small. For example, if E|X|^r < ∞ for some r > 0, then by Markov's inequality

P{|X| > c} ≤ E|X|^r / c^r,

and given ε > 0, we can choose c such that P{|X| > c} < ε.

The distribution of Xc is no longer the truncated distribution images. In fact,

images

where F is the DF of X and Fc, that of Xc.
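Truncation at c is easy to see by simulation. The sketch below (our own illustration) truncates a standard Cauchy sample, a natural choice here because the Cauchy has no finite mean, while the truncated variable is bounded and hence has moments of all orders:

```python
# Sketch: X_c = X if |X| <= c, else 0, for a standard Cauchy X.
# Sample size, seed, and c are arbitrary illustrative choices.
import math
import random

random.seed(0)
n, c = 100_000, 10.0

# inverse-CDF sampling: tan(pi (U - 1/2)) is standard Cauchy for U ~ U(0,1)
xs = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
xc = [x if abs(x) <= c else 0.0 for x in xs]

frac_changed = sum(x != y for x, y in zip(xs, xc)) / n   # estimates P{|X| > c}
mean_xc = sum(xc) / n                                    # finite; near 0 by symmetry
p_exceed = 1.0 - (2.0 / math.pi) * math.atan(c)          # exact P{|X| > c}
```

The fraction of sample points altered by truncation matches P{|X| > c}, which can be made as small as we please by enlarging c.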

A third type of truncation, sometimes called Winsorization, sets

X* = a if X < a,  X* = X if a ≤ X ≤ b,  and X* = b if X > b.

This method also produces an RV for which |X*| ≤ max(|a|, |b|), so that moments of all orders of X* exist, but its DF is given by

F*(x) = 0 for x < a,  F*(x) = F(x) for a ≤ x < b,  and F*(x) = 1 for x ≥ b.

PROBLEMS 4.2

  1. Let images if images, if images. Does F define a DF in the plane?
  2. Let T be a closed triangle in the plane with vertices images, and images. Let F(x,y) denote the elementary area of the intersection of T with images. Show that F defines a DF in the plane, and find its marginal DFs.
  3. Let (X, Y) have the joint PDF f defined by images inside the square with corners at the points (1, 0), (0, 1), (−1, 0), and (0, −1) in the (x, y)-plane, and = 0 otherwise. Find the marginal PDFs of X and Y and the two conditional PDFs.
  4. Let images otherwise, be the joint PDF of (X, Y, Z). Compute images and P{X = Y < Z}.
  5. Let (X, Y) have the joint PDF images, and = 0 otherwise. Find images.
  6. For DFs F, F1, F2,…,Fn show that
    images

    for all real numbers x1,x2,…,xn if and only if Fi,’s are marginal DFs of F.

  7. For the bivariate negative binomial distribution
    images

    where images is an integer, images, find the marginal PMFs of X and Y and the conditional distributions.

    In Problems 8−10 the bivariate distributions considered are not unique generalizations of the corresponding univariate distributions.

  8. For the bivariate Cauchy RV (X, Y) with PDF
    images

    find the marginal PDFs of X and Y. Find the conditional PDF of Y given images.

    images

  9. For the bivariate beta RV (X, Y) with PDF
    images

    where p1 , p2, p3 are positive real numbers, find the marginal PDFs of X and Y and the conditional PDFs. Find also the conditional PDF of images, given X = x.

  10. For the bivariate gamma RV (X, Y) with PDF
    images

    find the marginal PDFs of X and Y and the conditional PDFs. Also, find the conditional PDF of images, given images, and the conditional distribution of X/Y, given images.

  11. For the bivariate hypergeometric RV (X, Y) with PMF
    images

    where images, N,n integers with images, and images so that images, find the marginal PMFs of X and Y and the conditional PMFs.

  12. Let X be an RV with PDF images if images, and = 0 otherwise. Let images. Find the PDF of the truncated distribution of X, its means, and its variance.
  13. Let X be an RV with PMF
    images

    Suppose that the value images cannot be observed. Find the PMF of the truncated RV, its mean, and its variance.

  14. Is the function
    images

    a joint density function? If so, find images, where (X, Y, Z, U) is a random variable with density f.

  15. Show that the function defined by
    images

    and 0 elsewhere is a joint density function.

    1. Find images.
    2. Find images.
  16. Let (X, Y) have joint density function f and joint distribution function F. Suppose that
    images

    holds for images and images. Show that

    images
  17. Suppose (X, Y, Z) are jointly distributed with density
    images

    Find images. Hence find the probability that images. (Here g is a density function on images.)

4.3 INDEPENDENT RANDOM VARIABLES

We recall that the joint distribution of a multiple RV uniquely determines the marginal distributions of the component random variables, but, in general, knowledge of marginal distributions is not enough to determine the joint distribution. Indeed, it is quite possible to have an infinite collection of joint densities fα with given marginal densities.

In this section we deal with a very special class of distributions in which the marginal distributions uniquely determine the joint distribution of a multiple RV. First we consider the bivariate case.

Let F(x, y) and F1(x), F2(y), respectively, be the joint DF of (X, Y) and the marginal DFs of X and Y.

Note that Φ(X2) and ψ(Y2) are independent where Φ and ψ are Borel–measurable functions. But X is not a Borel-measurable function of X2.

It is clear that an analog of Theorem 1 holds, but we leave the reader to construct it.

The following result is easy to prove.

Remark 1. It is quite possible for RVs X1, X2, …, Xn to be pairwise independent without being mutually independent. Let (X, Y, Z) have the joint PMF defined by

images

Clearly, X, Y, Z are not independent (why?). We have

images

It follows that X and Y, Y and Z, and X and Z are pairwise independent.
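The pairwise computation can be checked mechanically. The sketch below uses Bernstein's standard example of this phenomenon, with mass 1/4 on each of (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1); this is our illustration and may differ in detail from the PMF intended in the text:

```python
# Sketch: pairwise but not mutual independence (Bernstein's example).
from itertools import product

pmf = {(1, 0, 0): 0.25, (0, 1, 0): 0.25, (0, 0, 1): 0.25, (1, 1, 1): 0.25}

def prob(pred):
    """Probability of the event described by the predicate pred."""
    return sum(p for xyz, p in pmf.items() if pred(xyz))

# pairwise independence of X and Y: P{X=i, Y=j} = P{X=i} P{Y=j} for all i, j
# (the YZ and XZ pairs factor the same way, by symmetry of the table)
pairwise = all(
    abs(prob(lambda t: t[0] == i and t[1] == j)
        - prob(lambda t: t[0] == i) * prob(lambda t: t[1] == j)) < 1e-12
    for i, j in product((0, 1), repeat=2)
)

# mutual independence would force P{X=1, Y=1, Z=1} = (1/2)^3 = 1/8,
# but the table assigns it 1/4
mutual = abs(pmf[(1, 1, 1)] - 0.125) < 1e-12
```

So every pair factors, yet the triple does not.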

Similarly, one can speak of an independent family of RVs.

According to Definition 4, X and Y are identically distributed if and only if they have the same distribution. It does not follow that X = Y with probability 1 (see Problem 7). If P{X = Y} = 1, we say that X and Y are equivalent RVs. All Definition 4 says is that X and Y are identically distributed if and only if

P{X ∈ A} = P{Y ∈ A}  for every Borel set A.

Nothing is said about the equality of the events {ω: X(ω) ∈ A} and {ω: Y(ω) ∈ A}.

Of course, the independence of X = (X1, X2, …, Xm) and Y = (Y1, Y2, …, Yn) does not imply the independence of the components X1, X2, …, Xm of X or of the components Y1, Y2, …, Yn of Y.

Remark 2. It is possible that an RV X may be independent of Y and also of Z, but X may not be independent of the random vector (Y,Z). See the example in Remark 1.

Let X1, X2, …, Xn be independent and identically distributed RVs with common DF F. Then the joint DF G of (X1, X2, …, Xn) is given by

G(x1, x2, …, xn) = Π from i = 1 to n of F(xi).

We note that for any of the n! permutations (xi1, xi2, …, xin) of (x1, x2, …, xn),

G(xi1, xi2, …, xin) = G(x1, x2, …, xn),

so that G is a symmetric function of x1, x2, …, xn. Thus (Xi1, Xi2, …, Xin) and (X1, X2, …, Xn) are identically distributed.

Clearly, if X1, X2, …, Xn are exchangeable, then the Xi are identically distributed but not necessarily independent.

In view of Theorem 6, Xs is symmetric about 0 so that

images

If E|X| < ∞, then EXs exists, and EXs = 0.

The technique of symmetrization is an important tool in the study of probability limit theorems. We will need the following result later. The proof is left to the reader.
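A quick simulation makes the two basic facts visible. This sketch assumes the usual definition Xs = X − X′, with X′ an independent copy of X; the exponential choice for X, the sample size, and the seed are ours:

```python
# Sketch of symmetrization: X_s = X - X' with X' an independent copy of X.
# Illustrated by Monte Carlo for X ~ exponential(1).
import random

random.seed(1)
n = 200_000
xs = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(n)]

mean_s = sum(xs) / n                  # should be near E X_s = 0
p_pos = sum(x > 0 for x in xs) / n    # symmetry about 0: P{X_s > 0} = 1/2
```

Even though X itself is skewed, Xs is symmetric about 0 and has mean 0.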

PROBLEMS 4.3

  1. Let A be a set of k numbers, and Ω be the set of all ordered samples of size n from A with replacement. Also, let images be the set of all subsets of Ω, and P be a probability defined on images. Let X1, X2,…, Xn be RVs defined on (Ω, images, P) by setting
    images

    Show that X1, X2,… ,Xn are independent if and only if each sample point is equally likely.

  2. Let X1, X2 be iid RVs with common PMF
    images

    Write images. Show that X1, X2, X3 are pairwise independent but not independent.

  3. Let (X1, X2, X3) be an RV with joint PMF
    images

    where

    images

    Are X1,X2,X3 independent? Are X1,X2,X3 pairwise independent? Are images and X3 independent?

    No; Yes; No.

  4. Let X and Y be independent RVs such that XY is degenerate at images. That is, images. Show that X and Y are also degenerate.
  5. Let (Ω, images, P) be a probability space and A, B ∈ images. Define X and Y so that
    images

    Show that X and Y are independent if and only if A and B are independent.

  6. Let X1,X2,… ,Xn be a set of exchangeable RVs. Then
    images
  7. Let X and Y be identically distributed. Construct an example to show that X and Y need not be equal, that is, images need not equal 1.
  8. Prove Lemma 1.
  9. Let X1,X2,… ,Xn be RVs with joint PDF f, and let fj be the marginal PDF of images. Show that X1, X2,… ,Xn are independent if and only if
    images
  10. Suppose two buses, A and B, operate on a route. A person arrives at a certain bus stop on this route at time 0. Let X and Y be the arrival times of buses A and B, respectively, at this bus stop. Suppose X and Y are independent and have density functions given, respectively, by
    images

    What is the probability that bus A will arrive before bus B?

  11. Consider two batteries, one of Brand A and the other of Brand B. Brand A batteries have a length of life with density function
    images

    whereas Brand B batteries have a length of life with density function given by

    images

    Brand A and Brand B batteries operate independently and are put to a test. What is the probability that Brand B battery will outlast Brand A? In particular, what is the probability if images?

    1. Let (X, Y) have joint density f. Show that X and Y are independent if and only if for some constant images and nonnegative functions f1 and f2
      images
      for all images.
    2. Let images, and fX, fY are marginal densities of X and Y, respectively. Show that if X and Y are independent then images.
  12. If Φ is the CF of X, show that the CF of Xs is real and even.
  13. Let X, Y be jointly distributed with PDF images otherwise. Show that images and images has a symmetric distribution.

4.4 FUNCTIONS OF SEVERAL RANDOM VARIABLES

Let X1, X2, …, Xn be RVs defined on a probability space (Ω, images, P). In practice we deal with functions of X1, X2, …, Xn such as images, min(X1, …, Xn), and so on. Are these also RVs? If so, how do we compute their distributions, given the joint distribution of X1, X2, …, Xn?

What functions of (X1, X2,… ,Xn) are RVs?

In particular, if g: Rn → R is a continuous function, then g(X1, X2, …, Xn) is an RV.

How do we compute the distribution of g(X1,X2,… ,Xn)? There are several ways to go about it. We first consider the method of distribution functions. Suppose that images is real-valued, and let images. Then

images

where in the continuous case f is the joint PDF of (X1,…, Xn).

In the continuous case we can obtain the PDF of images by differentiating the DF images with respect to y provided that Y is also of the continuous type. In the discrete case it is easier to compute images.

We take a few examples.


Fig. 1 (a) images and (b) images.

There are two cases to consider according to whether images or images (Fig. 1a and 1b). In the former case,

images

and in the latter case,

images

Hence the density function of Y is given by

images
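Whatever the particular example, the distribution-function method itself is easy to check numerically. The sketch below is our own illustration, not the text's example verbatim: for X1, X2 iid uniform on (0, 1) and Y = X1 + X2, the two cases y ≤ 1 and y > 1 give the familiar triangular law.

```python
# Sketch of the distribution-function method for Y = X1 + X2, with X1, X2
# iid uniform on (0, 1): F_Y(y) = P{X1 + X2 <= y}, computed by numerical
# integration over the unit square and compared with the closed form.
def F_Y_numeric(y, n=400):
    h = 1.0 / n
    pts = [(k + 0.5) * h for k in range(n)]          # midpoint grid
    return sum(h * h for x1 in pts for x2 in pts if x1 + x2 <= y)

def F_Y_exact(y):
    if y <= 0.0:
        return 0.0
    if y <= 1.0:
        return y * y / 2.0                            # case y <= 1
    if y <= 2.0:
        return 1.0 - (2.0 - y) ** 2 / 2.0             # case 1 < y <= 2
    return 1.0

err = max(abs(F_Y_numeric(y) - F_Y_exact(y)) for y in (0.3, 0.8, 1.2, 1.7))
```

Differentiating F_Y gives the triangular density, in line with the case analysis above.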

The method of distribution functions can also be used in the case when g takes values in Rm, 1 ≤ m ≤ n, but the integration becomes more involved.

Let images and images. Then the joint distribution of (Y1, Y2) is given by

images

where images Clearly, images so that the set A is as shown in Fig. 2. It follows that


Fig. 2 images.

images

Hence the joint density of Y1, Y2 is given by

images

The marginal densities of Y1, Y2 are easily obtained as

images

We next consider the method of transformations. Let (X1,…, Xn) be jointly distributed with continuous PDF f(x1, x2,…,xn), and let images, where

images

be a mapping of Rn into Rn. Then

images

where images. Let us choose B to be the n-dimensional interval

images

Then the joint DF of Y is given by

images

and (if GY is absolutely continuous) the PDF of Y is given by

images

at every continuity point y of w. Under certain conditions it is possible to write w in terms of f by making a change of variable in the multiple integral.
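A concrete sketch of this change of variable, with a transformation of our own choosing: for X1, X2 iid N(0, 1), take (Y1, Y2) = (X1 + X2, X1 − X2). The inverse map is x1 = (y1 + y2)/2, x2 = (y1 − y2)/2, with Jacobian 1/2, and the resulting w factors as a product of N(0, 2) densities, so Y1 and Y2 are independent N(0, 2):

```python
# Sketch: Jacobian change-of-variable for (Y1, Y2) = (X1 + X2, X1 - X2),
# X1, X2 iid N(0, 1).  w(y1, y2) = f((y1+y2)/2, (y1-y2)/2) * |J|, |J| = 1/2.
import math

def phi(x, var=1.0):
    """N(0, var) density."""
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def f(x1, x2):
    """Joint PDF of (X1, X2)."""
    return phi(x1) * phi(x2)

def w(y1, y2):
    """PDF of (Y1, Y2) via the Jacobian formula."""
    return f((y1 + y2) / 2.0, (y1 - y2) / 2.0) * 0.5

# w should agree exactly with the product of two N(0, 2) densities
err = max(abs(w(a, b) - phi(a, 2.0) * phi(b, 2.0))
          for a in (-1.0, 0.0, 0.7) for b in (-0.4, 1.3))
```

The agreement is exact (up to floating-point rounding), since the identity holds algebraically.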

Then (Y1, Y2,… ,Yn) has a joint absolutely continuous DF with PDF given by

Proof. For (y1, y2, …, yn) ∈ Rn, let

images

Then

images

and

images

Result (1) now follows on differentiation of DF GY.

Remark 1. In actual applications we will not know the mapping from x1, x2, …, xn to y1, y2, …, yn completely; rather, one or more of the functions gi will be known. If only k (k < n) of the gi's are known, we introduce arbitrarily n − k additional functions such that the conditions of the theorem are satisfied. To find the joint marginal density of these k variables, we simply integrate the w function over all the n − k variables that were arbitrarily introduced.

Remark 2. An analog of Theorem 2.5.4 holds, which we state without proof.

Let X = (X1, X2, …, Xn) be an RV of the continuous type with joint PDF f, and let y = g(x) be a mapping of Rn into itself. Suppose that for each y the transformation g has a finite number k of inverses. Suppose further that Rn can be partitioned into k disjoint sets A1, A2, …, Ak, such that the transformation g from Aj (j = 1, 2, …, k) into Rn is one-to-one with inverse transformation

images

Suppose that the first partial derivatives are continuous and that each Jacobian

images

is different from 0 in the range of the transformation. Then the joint PDF of Y is given by

images

Fig. 3 images.

In Example 6 the transformation used is orthogonal and is known as Helmert’s transformation. In fact, we will show in Section 6.5 that under orthogonal transformations iid RVs with PDF f defined above are transformed into iid RVs with the same PDF.

It is easily verified that

images

We have therefore proved that images is independent of images. This is a very important result in mathematical statistics, and we will return to it in Section 7.7.

An important application of the result in Remark 2 will appear in Theorem 4.7.2.

Finally, we consider a technique based on MGF or CF which can be used in certain situations to determine the distribution of a function g(X1, X2,… ,Xn) of X1, X2,… ,Xn.

Let (X1, X2, …, Xn) be an n-variate RV, and g be a Borel-measurable function from Rn to R.

Let images, and let h(y) be its PDF. If images then

images

An analog of Theorem 3.2.1 holds. That is,

images

in the sense that if either integral exists so does the other and the two are equal. The result also holds in the discrete case.

Some special functions of interest are images, where k1, k2,… ,kn are non-negative integers, images, where t1,t2,… ,tn are real numbers, and images, where images.

We will mostly deal with the MGF, even though the condition that it exist in a neighborhood of the origin restricts its application considerably. The multivariate MGF (CF) has properties similar to those of the univariate MGF discussed earlier. We state some of these without proof. For notational convenience we restrict ourselves to the bivariate case.

A formal definition of moments in the multivariate case will be given in Section 4.5.

The MGF technique uses the uniqueness property of Theorem 4. In order to find the distribution (DF, PDF, or PMF) of Y = g(X1, X2, …, Xn), we compute the MGF of Y using the definition. If this MGF is of a known kind, then Y must have that kind of distribution. Although the technique applies when Y is an m-dimensional RV, m ≥ 1, we will mostly use it in the case m = 1.

The following result has many applications as we will see. Example 9 is a special case.

From these examples it is clear that to use this technique effectively one must be able to recognize the MGF of the function under consideration. In Chapter 5 we will study a number of commonly occurring probability distributions and derive their MGFs (whenever they exist). We will have occasion to use Theorem 7 quite frequently.
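A standard instance of the generating-function idea (our illustration, with arbitrary rates): for independent X ~ Poisson(a) and Y ~ Poisson(b), the MGFs multiply to the MGF of a Poisson(a + b), so X + Y must be Poisson(a + b). The sketch verifies the conclusion by direct convolution of the PMFs:

```python
# Sketch: the MGF/PGF argument says Poisson(a) + Poisson(b) = Poisson(a + b)
# for independent summands; here we confirm the conclusion by convolution.
import math

def pois(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

a, b = 1.5, 2.5   # arbitrary illustrative rates

def pmf_sum(k):
    """PMF of X + Y by convolving the two Poisson PMFs."""
    return sum(pois(j, a) * pois(k - j, b) for j in range(k + 1))

err = max(abs(pmf_sum(k) - pois(k, a + b)) for k in range(20))
```

The agreement is exact term by term, which is what recognizing the MGF of the sum predicts.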

For integer-valued RVs one can sometimes use PGFs to compute the distribution of certain functions of a multiple RV.

We emphasize the fact that a CF always exists and analogs of Theorems 4–7 can be stated in terms of CFs.

PROBLEMS 4.4

  1. Let F be a DF and ε be a positive real number. Show that
    images

    and

    images

    are also distribution functions.

  2. Let X, Y be iid RVs with common PDF
    images
    1. Find the PDF of RVs images, X – Y, XY, X/Y, min{X, Y}, max{X, Y}, min{X, Y}/max{X, Y}, and images
    2. Let images and images. Find the conditional PDF of V, given images, for some fixed images.
    3. Show that U and images are independent.
  3. Let X and Y be independent RVs defined on the space (Ω, images, P). Let X be uniformly distributed on (–a, a), images, and Y be an RV of the continuous type with density f, where f is continuous and positive on images. Let F be the DF of Y. If u0 ∈ (–a, a) is a fixed number, show that
    images

    where images is the conditional density function of Y, given images.

  4. Let X and Y be iid RVs with common PDF
    images

    Find the PDFs of RVs XY, X/Y, min {X, Y}, max {X, Y}, min {X, Y}/max {X, Y}.

  5. Let X1, X2, X3 be iid RVs with common density function
    images

    Show that the PDF of images is given by

    images

    An extension to the n-variate case holds.

  6. Let X and Y be independent RVs with common geometric PMF
    images

    Also, let images. Find the joint distribution of M and X, the marginal distribution of M, and the conditional distribution of X, given M.

  7. Let X be a nonnegative RV of the continuous type. The integral part, Y, of X is distributed with PMF images, and the fractional part, Z, of X has PDF images if images, and = 0 otherwise. Find the PDF of X, assuming that Y and Z are independent.
  8. Let X and Y be independent RVs. If at least one of X and Y is of the continuous type, show that images is also continuous. What if X and Y are not independent?
  8. Let X and Y be independent integer-valued RVs. Show that
    images

    where P, PX, and PY, respectively, are the PGFs of images, X, and Y.

  10. Let X and Y be independent nonnegative RVs of the continuous type with PDFs f and g, respectively. Let images, and images if images and let g be arbitrary. Show that the MGF M (t) of Y, which is assumed to exist, has the property that the DF of X/Y is images.
  11. Let X, Y, Z have the joint PDF
    images

    Find the PDF of images.

    images.

  12. Let X and Y be iid RVs with common PDF
    images

    Find the PDF of images.

  13. Let X and Y be iid RVs with common PDF f defined in Example 8. Find the joint PDF of U and V in the following cases:
    1. images, images, images
    2. images, images
  14. Construct an example to show that even when the MGF of images can be written as a product of the MGF of X and the MGF of Y, X and Y need not be independent.
  15. Let X1, X2,…, Xn be iid with common PDF
    images

    Using the distribution function technique show that

    1. The joint PDF of images, and images is given by
      images
      and = 0 otherwise.
    2. The PDF of X(n) is given by
      images
      and that of X(1) by
      images
  16. Let X1, X2 be iid with common Poisson PMF
    images

    where images is a constant. Let images and images. Find the PMF of X(2).

  17. Let X have the binomial PMF
    images

    Let Y be independent of X and images. Find the PMF of images and images.

4.5 COVARIANCE, CORRELATION AND MOMENTS

Let X and Y be jointly distributed on (Ω, images, P). In Section 4.4 we defined Eg(X, Y) for Borel functions g on R2. Functions of the form g(x, y) = x^j y^k, where j and k are nonnegative integers, are of interest in probability and statistics.

Recall (Theorem 3.2.8) that E(Y − c)^2 is minimized when we choose c = EY, so that EY may be interpreted as the best constant predictor of Y. If, instead, we choose to predict Y by a linear function of X, say aX + b, and measure the error in this prediction by E(Y − aX − b)^2, then we should choose a and b to minimize this so-called mean square error. Clearly, E(Y − aX − b)^2 is minimized, for any a, by choosing b = EY − aEX. With this choice of b, we find a such that

E{(Y − EY) − a(X − EX)}^2

is minimum. An easy computation shows that the minimum occurs if we choose

(7) a = cov(X, Y)/var(X),

provided that var(X) > 0. Moreover,

(8) min over a, b of E(Y − aX − b)^2 = var(Y)[1 − (cov(X, Y))^2/(var(X) var(Y))].

Let us write

(9) ρ = cov(X, Y)/√(var(X) var(Y)).

Then (8) shows that predicting Y by a linear function of X reduces the prediction error from var(Y) to var(Y)(1 − ρ^2). We may therefore think of ρ as a measure of the linear dependence between the RVs X and Y.

If X and Y are independent, then from (5) cov(X, Y) = 0, so ρ = 0 and X and Y are uncorrelated. If, however, ρ = 0, then X and Y may not necessarily be independent.

Let us now study some properties of the correlation coefficient. From the definition we see that ρ (and also cov (X, Y)) is symmetric in X and Y.

Equality in (11) holds if and only if |ρ| = 1, or equivalently, P{Y = aX + b} = 1 holds for some real a and b. Here a ≠ 0.

Remark 1. From (7) and (9) we note that the signs of a and ρ are the same, so if ρ = 1 then P{Y = aX + b} = 1 with a > 0, and if ρ = −1 then P{Y = aX + b} = 1 with a < 0.
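The best linear predictor is a short computation. A sketch using the standard formulas a = cov(X, Y)/var(X), b = EY − aEX, ρ = cov(X, Y)/(σX σY), on a joint PMF invented for illustration:

```python
# Sketch: best linear predictor of Y from X and the reduction of the
# mean square error from var(Y) to var(Y)(1 - rho^2).
pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 1): 0.4, (2, 1): 0.1, (2, 2): 0.2}

def E(g):
    """Expectation of g(X, Y) under the finite joint PMF."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
varX = E(lambda x, y: (x - EX) ** 2)
varY = E(lambda x, y: (y - EY) ** 2)
cov = E(lambda x, y: (x - EX) * (y - EY))
rho = cov / (varX * varY) ** 0.5

a = cov / varX              # slope of the best linear predictor
b = EY - a * EX             # intercept
mse = E(lambda x, y: (y - a * x - b) ** 2)
```

Here `mse` equals varY(1 − ρ^2), the minimum mean square error of any linear predictor.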

The existence of ES follows easily by replacing each aj by |aj| and each xij by |xij| and remembering that images. The case of continuous type (X1, X2,…,Xn) is similarly treated.

Let X and Y be independent, and g1(·) and g2(·) be Borel-measurable functions. Then we know (Theorem 4.3.3) that g1(X) and g2(Y) are independent. If E{g1(X)}, E{g2(Y)}, and E{g1(X)g2(Y)} exist, it follows from Theorem 4 that

(14) E{g1(X)g2(Y)} = E{g1(X)} E{g2(Y)}.

Conversely, if for any Borel sets A1 and A2 we take g1(X) = 1 if X ∈ A1, and = 0 otherwise, and g2(Y) = 1 if Y ∈ A2, and = 0 otherwise, then

E{g1(X)g2(Y)} = P{X ∈ A1, Y ∈ A2},

and E{g1(X)} = P{X ∈ A1}, E{g2(Y)} = P{Y ∈ A2}. Relation (14) implies that for any Borel sets A1 and A2 of real numbers

P{X ∈ A1, Y ∈ A2} = P{X ∈ A1} P{Y ∈ A2}.

It follows that X and Y are independent if (14) holds. We have thus proved the following theorem.

Note that the result holds if we replace independence by the condition that Xi’s are exchangeable and uncorrelated.

We conclude this section with some important moment inequalities. We begin with the simple inequality

(20) |a + b|^r ≤ cr(|a|^r + |b|^r),

where cr = 1 for 0 < r ≤ 1 and cr = 2^(r−1) for r > 1. For a = 0 or b = 0, (20) is trivially true.

First note that it is sufficient to prove (20) when images. Let images, and write images. Then

images

Writing images, we see that

images

where images. It follows that images, and < 0 if r < 1. Thus

images

while

images

Note that images is trivially true since

images

images An immediate application of (20) is the following result.

Corollary. Taking p = q = 2 in (22), we obtain the Cauchy–Schwarz inequality,

E|XY| ≤ (EX^2)^(1/2)(EY^2)^(1/2).

The final result of this section is an inequality due to Minkowski.

PROBLEMS 4.5

  1. Suppose that the RV (X, Y) is uniformly distributed over the region images Find the covariance between X and Y.
  2. Let (X, Y) have the joint PDF given by
    images

    Find all moments of order 2.

  3. Let (X, Y) be distributed with joint density
    images

    Find the MGF of (X, Y). Are X, Y independent? If not, find the covariance between X and Y.

    images dependent.

  4. For a positive RV X with finite first moment show that (1) images and (2) images.
  5. If X is a nondegenerate RV with finite expectation and such that images, then
    images

    (Kruskal [56])

  6. Show that for images
    images

    and hence that

    images
  7. Given a PDF f that is nondecreasing in the interval images, show that for any s>0
    images

    with the inequality reversed if f is nonincreasing.

  8. Derive the Lyapunov inequality (Theorem 3.4.3)
    images

    from Hölder’s inequality (22).

  9. Let X be an RV with images for images. Show that the function log E|X|r is a convex function of r.
  10. Show with the help of an example that Theorem 9 is not true for images
  11. Show that the converse of Theorem 8 also holds for independent RVs, that is, if images for some images and X and Y are independent, then images.

    [Hint: Without loss of generality assume that the median of both X and Y is 0. Show that, for any images, images. Now use the remarks preceding Lemma 3.2.2 to conclude that images.]

  12. Let (Ω, images, P) be a probability space, and A1, A2,…,An be events in images such that images. Show that
    images

    (Chung and Erdös [14])

    [Hint: Let Xk be the indicator function of Ak, images. Use the Cauchy–Schwarz inequality.]

  13. Let (Ω, images, P) be a probability space, and A, B ∈ images with images, images. Define ρ(A, B) as the correlation coefficient between the RVs IA and IB, where IA, IB are the indicator functions of A and B, respectively. Express ρ(A, B) in terms of P(A), P(B), and P(A ∩ B), and conclude that images if and only if A and B are independent. What happens if images or if images?
    1. Show that
      images
      and
      images
    2. Show that
      images
  14. Let X1, X2,…,Xn be iid RVs and define
    images

    Suppose that the common distribution is symmetric. Assuming the existence of moments of appropriate order, show that images.

  15. Let X,Y be iid RVs with common standard normal density
    images

    Let images and images. Find the MGF of the random variable (U, V). Also, find the correlation coefficient between U and V. Are U and V independent?

  16. Let X and Y be two discrete RVs:
    images

    and

    images

    Show that X and Y are independent if and only if the correlation coefficient between X and Y is 0.

  17. Let X and Y be dependent RVs with common means 0, variance 1, and correlation coefficient ρ. Show that
    images
  18. Let X1, X2 be independent normal RVs with density functions
    images

    Also let

    images

    Find the correlation coefficient between Z and W and show that

    images

    where ρ denotes the correlation coefficient between Z and W.

  19. Let (X1, X2,…,Xn) be an RV such that the correlation coefficient between each pair Xi, Xj, images, is ρ. Show that images.
  20. Let X1, X2,…,Xm+n be iid RVs with finite second moment. Let images. Find the correlation coefficient between Sn and images, where images.
  21. Let f be the PDF of a positive RV, and write
    images

    Show that g is a density function in the plane. If the mth moment of f exists for some positive integer m, find EXm. Compute the means and variances of X and Y and the correlation coefficient between X and Y in terms of moments of f. (Adapted from Feller [26, p. 100].)

    If U has PDF f, then images for images; images.

  22. A die is thrown n + 2 times. After each throw a + sign is recorded for 4, 5, or 6, and a − sign for 1, 2, or 3, the signs forming an ordered sequence. Each sign, except the first and the last, is attached to a characteristic RV that assumes the value 1 if both the neighboring signs differ from the one between them and 0 otherwise. Let X1, X2, …, Xn be these characteristic RVs, where Xi corresponds to the (i + 1)st sign (i = 1, 2, …, n) in the sequence. Show that
    images
  23. Let (X, Y) be jointly distributed with PDF f defined by images inside the square with corners at the points (0, 1), (1, 0), (–1, 0), (0,–1) in the (x, y)-plane, and images otherwise. Are X, Y independent? Are they uncorrelated?

4.6 CONDITIONAL EXPECTATION

In Section 4.2 we defined the conditional distribution of an RV X, given Y. We showed that, if (X, Y) is of the discrete type, the conditional PMF of X, given Y = yj, where P{Y = yj} > 0, is a PMF when considered as a function of the xi's (for fixed yj). Similarly, if (X, Y) is an RV of the continuous type with PDF f(x, y) and marginal densities f1 and f2, respectively, then, at every point (x, y) at which f is continuous and at which f2(y) > 0 and is continuous, a conditional density function of X, given Y, exists and may be defined by

fX|Y(x | y) = f(x, y)/f2(y).

We also showed that f(x | y), for fixed y, when considered as a function of x is a PDF in its own right. Therefore, we can (and do) consider the moments of this conditional distribution.
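In the discrete case the same idea is a finite weighted sum. The following sketch (a hypothetical joint PMF, not one from the text) computes the conditional PMF of X given Y = y and the conditional expectation E{X | Y = y}, and verifies that each conditional PMF sums to 1:

```python
from fractions import Fraction as F

# Hypothetical joint PMF P{X = x, Y = y} on a small support; values sum to 1.
joint = {
    (0, 0): F(1, 8), (0, 1): F(1, 8),
    (1, 0): F(1, 4), (1, 1): F(1, 2),
}

def marginal_y(y):
    """P{Y = y}: sum the joint PMF over x."""
    return sum(p for (_, yy), p in joint.items() if yy == y)

def cond_exp(y):
    """E{X | Y = y} = sum over x of x * P{X = x | Y = y}."""
    py = marginal_y(y)
    return sum(x * p / py for (x, yy), p in joint.items() if yy == y)

# For each fixed y the conditional PMF is itself a PMF (it sums to 1).
for y in (0, 1):
    assert sum(p / marginal_y(y) for (_, yy), p in joint.items() if yy == y) == 1

print(cond_exp(0), cond_exp(1))  # 2/3 4/5
```

Exact rational arithmetic (`fractions.Fraction`) is used so the identities hold exactly rather than up to floating-point error.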

Needless to say, a similar definition may be given for the conditional expectation E{h(Y) | X}.

It is immediate that E{h(X) | Y} satisfies the usual properties of an expectation, provided we remember that E{h(X) | Y} is not a constant but an RV. The following results are easy to prove. We assume the existence of all indicated expectations.

(2) E{[g1(X) + g2(X)] | Y} = E{g1(X) | Y} + E{g2(X) | Y}

for any Borel functions g1, g2.

The statements in (3), (4), and (5) should be understood to hold with probability 1.

(6) E{X | Y} = EX

for independent RVs X and Y.

If ϕ(X, Y) is a function of X and Y, then

(7) Eϕ(X, Y) = E[E{ϕ(X, Y) | Y}]

and

(8) E{ψ(Y)ϕ(X, Y) | Y} = ψ(Y)E{ϕ(X, Y) | Y}

for any Borel functions ψ and ϕ.

Again (8) should be understood as holding with probability 1. Relation (7) is useful as a computational device. See Example 3 below.
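Relation (7) is the iterated-expectation identity: in the discrete case, Eϕ(X, Y) is recovered by averaging the conditional expectations E{ϕ(X, Y) | Y = y} against the marginal PMF of Y. The following sketch checks this exactly on a hypothetical joint PMF (not one from the text):

```python
from fractions import Fraction as F

# Hypothetical discrete joint PMF; values sum to 1.
joint = {(0, 0): F(1, 6), (0, 1): F(1, 3), (1, 0): F(1, 6), (1, 1): F(1, 3)}
phi = lambda x, y: x * y + x          # an arbitrary Borel function of (x, y)

def p_y(y):                           # marginal PMF of Y
    return sum(p for (_, yy), p in joint.items() if yy == y)

def cond_exp_phi(y):                  # E{phi(X, Y) | Y = y}
    return sum(phi(x, y) * p / p_y(y) for (x, yy), p in joint.items() if yy == y)

direct = sum(phi(x, y) * p for (x, y), p in joint.items())                 # E phi(X, Y)
iterated = sum(cond_exp_phi(y) * p_y(y) for y in {yy for _, yy in joint})  # E[E{phi | Y}]
assert direct == iterated
print(direct)  # 5/6
```

Conditioning is often the easier route in practice: E{ϕ(X, Y) | Y = y} may have a simple closed form even when the joint distribution does not.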

The moments of a conditional distribution are defined in the usual manner. Thus, for integral r ≥ 1, E{X^r | y} defines the rth moment of the conditional distribution. We can define the central moments of the conditional distribution and, in particular, the variance. There is no difficulty in generalizing these concepts to n-dimensional distributions when n > 2. We leave the reader to furnish the details.

Theorem 1 is quite useful in computation of Eh(X) in many applications.

Equation (11) follows immediately from (10). The equality in (11) holds if and only if

E{var(X | Y)} = 0,

which holds if and only if, with probability 1,

(12) X = E{X | Y}.
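Two standard facts about the conditional mean can be checked exactly on a small hypothetical joint PMF: var X is at least the variance of the RV E{X | Y}, and g(Y) = E{X | Y} minimizes the mean squared error E[X − g(Y)]^2 among predictors based on Y. A sketch (distribution invented for illustration):

```python
from fractions import Fraction as F

# Hypothetical joint PMF; values sum to 1.
joint = {(0, 0): F(1, 8), (1, 0): F(1, 4), (0, 1): F(1, 8), (2, 1): F(1, 2)}
ys = {y for (_, y) in joint}

def p_y(y):
    return sum(p for (_, yy), p in joint.items() if yy == y)

def cond_mean(y):                      # E{X | Y = y}
    return sum(x * p / p_y(y) for (x, yy), p in joint.items() if yy == y)

ex = sum(x * p for (x, _), p in joint.items())                  # EX
var_x = sum((x - ex) ** 2 * p for (x, _), p in joint.items())   # var X
var_cm = sum((cond_mean(y) - ex) ** 2 * p_y(y) for y in ys)     # var E{X | Y}
assert var_x >= var_cm                 # equality would force X = E{X | Y} w.p. 1

def mse(g):                            # E[X - g(Y)]^2 for a predictor g(Y)
    return sum((x - g(y)) ** 2 * p for (x, y), p in joint.items())

assert mse(cond_mean) <= mse(lambda y: ex)   # beats the constant predictor EX
assert mse(cond_mean) <= mse(lambda y: y)    # beats the identity predictor
```

Here var X strictly exceeds var E{X | Y}, since X is not a function of Y alone.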

PROBLEMS 4.6

  1. Let X be an RV with PDF given by
    images

    Find images, where a and b are constants.

    images where Φ is the standard normal DF.

    1. Let (X, Y) be jointly distributed with density
      images
      Find E{Y | X}.
    2. Do the same for the joint density
      images
  2. Let (X, Y) be jointly distributed with bivariate normal density
    images

    Find images and images. (Here, images, and images.)

  3. Find images.
  4. Show that E[X − g(Y)]^2 is minimized by choosing g(Y) = E{X | Y}.
  5. Let X have PMF
    P{X = k | λ} = e^(−λ) λ^k/k!,  k = 0, 1, 2,…,

    and suppose that λ is a realization of an RV Λ with PDF

    images

    Find images.

  6. Find E(XY) by conditioning on X or Y for the following cases:
    1. images.
    2. images.
  7. Suppose X has uniform PDF images, images and 0 otherwise. Let Y be chosen from the interval (0, X] according to the PDF
    images

    Find images and EY^k for any fixed constant images.

4.7 ORDER STATISTICS AND THEIR DISTRIBUTIONS

Let (X1, X2,…, Xn) be an n-dimensional random variable and (x1, x2,…, xn) an n-tuple assumed by (X1, X2,…, Xn). Arrange (x1, x2,…, xn) in increasing order of magnitude so that

x(1) ≤ x(2) ≤ ⋯ ≤ x(n),

where x(1) = min(x1, x2,…, xn), x(2) is the second smallest value in x1, x2,…, xn, and so on, and x(n) = max(x1, x2,…, xn). If any two xi, xj are equal, their order does not matter.

Statistical considerations such as sufficiency, completeness, invariance, and ancillarity (Chapter 8) lead to the consideration of order statistics in problems of statistical inference. Order statistics are particularly useful in nonparametric statistics (Chapter 13) where, for example, many test procedures are based on ranks of observations. Many of these methods require the distribution of the ordered observations which we now study.

In the following we assume that X1, X2,… , Xn are iid RVs. In the discrete case there is no magic formula to compute the distribution of any X(j) or any of the joint distributions. A direct computation is the best course of action.

In the following we assume that X1, X2,…, Xn are iid RVs of the continuous type with PDF f. Let {X(1), X(2),…, X(n)} be the set of order statistics for X1, X2,…, Xn. Since the Xi are all of the continuous type, it follows that, with probability 1,

X(1) < X(2) < ⋯ < X(n).

It follows (see Remark 2) that

images

The procedure for computing the marginal PDF of X(r), the rth order statistic of X1, X2,…, Xn, is similar. The following theorem summarizes the result.
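As a numerical check on the marginal distribution of X(r): for n iid U(0, 1) RVs, the rth order statistic has a Beta(r, n − r + 1) marginal, so EX(r) = r/(n + 1) (a standard consequence, stated here as an assumption rather than quoted from the text). Sorting each simulated sample produces all the order statistics at once:

```python
import random

# Monte Carlo sketch: sample n iid U(0, 1) variates, sort to get the order
# statistics, and compare the empirical mean of X(r) with r/(n + 1).
random.seed(0)
n, trials = 5, 200_000
sums = [0.0] * n
for _ in range(trials):
    for r, value in enumerate(sorted(random.random() for _ in range(n))):
        sums[r] += value

means = [s / trials for s in sums]
for r in range(1, n + 1):
    assert abs(means[r - 1] - r / (n + 1)) < 0.01
print([round(m, 3) for m in means])  # close to 1/6, 2/6, ..., 5/6
```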

We now compute the joint PDF of X(j) and X(k), 1 ≤ j < k ≤ n.

In a similar manner we can show that the joint PDF of X(1), X(2),…, X(k), 1 ≤ k ≤ n, is given by

images

for y1 < y2 < ⋯ < yk, and = 0 otherwise.

The joint PDF of X(1) and X(n) is given by

f(x, y) = n(n − 1)[F(y) − F(x)]^(n−2) f(x)f(y) for x < y, and = 0 otherwise,

and that of the range Rn = X(n) − X(1) by

fRn(r) = n(n − 1) ∫ [F(x + r) − F(x)]^(n−2) f(x)f(x + r) dx for r > 0, and = 0 otherwise,

where the integral extends over the whole real line.
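The DFs of the extremes themselves are immediate from independence: P{X(n) ≤ x} = [F(x)]^n and P{X(1) > x} = [1 − F(x)]^n. A quick simulation sketch, with exponential(1) samples chosen arbitrarily for illustration:

```python
import math
import random

# Check P{max <= x} = F(x)^n and P{min > x} = (1 - F(x))^n at one point x,
# for n iid exponential(1) RVs.
random.seed(1)
n, trials, x = 4, 200_000, 1.0
F = lambda t: 1.0 - math.exp(-t)       # exponential(1) DF

max_le = min_gt = 0
for _ in range(trials):
    sample = [random.expovariate(1.0) for _ in range(n)]
    max_le += max(sample) <= x
    min_gt += min(sample) > x

assert abs(max_le / trials - F(x) ** n) < 0.01
assert abs(min_gt / trials - (1.0 - F(x)) ** n) < 0.01
```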

Finally, we consider the moments, namely, the means, variances, and covariances of order statistics. Suppose X1, X2,…, Xn are iid RVs with common DF F. Let g be a Borel function on the real line such that E|g(X)| < ∞, where X has DF F. Then for 1 ≤ r ≤ n

images

and we write

images

for images. The converse also holds. Suppose images for images. Then,

images

for images and hence

images

Moreover, it also follows that

images

As a consequence of the above remarks we note that if images for some r, images, then images and conversely, if images for some r, images.

PROBLEMS 4.7

  1. Let X(1), X(2),…,X(n) be the set of order statistics of independent RVs X1, X2,…,Xn with common PDF
    images
    1. Show that X(r) and X(s) – X(r) are independent for any images.
    2. Find the PDF of images.
    3. Let images. Show that (Z1, Z2,…,Zn) and (X1, X2,…,Xn) are identically distributed.
  2. Let X1, X2,…,Xn be iid RVs with PMF
    images

    Find the marginal distributions of X(1), X(n), and their joint PMF.

  3. Let X1, X2,…,Xn be iid with a DF
    images

    Show that X(i)/X(n), images, and X(n) are independent.

  4. Let X1, X2,…,Xn be iid RVs with common Pareto DF F(x) = 1 − (σ/x)^α for x ≥ σ, where σ > 0 and α > 0. Show that
    1. X(1) and (X(2)/X(1), …, X(n)/X(1)) are independent,
    2. X(1) has Pareto(σ, nα) distribution, and
    3. images has PDF
      images
  5. Let X1, X2,…,Xn be iid nonnegative RVs of the continuous type with common DF F. If EX1 < ∞, show that EX(n) < ∞. Write Mn = X(n). Show that
    EMn = ∫0^∞ [1 − (F(x))^n] dx.

    Find EMn in each of the following cases:

    1. Xi have the common DF
      images
    2. Xi have the common DF
      images
  6. Let X(1), X(2),…,X(n) be the order statistics of n independent RVs X1, X2,…,Xn with common PDF images if images, and = 0 otherwise. Show that images, and images are independent. Find the PDF of Y1, Y2,…,Yn.
  7. For the PDF in Problem 4 find EX(r).
  8. An urn contains N identical marbles numbered 1 through N. From the urn n marbles are drawn; let X(n) be the largest number drawn. Show that images, and images.
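As a sanity check for Problem 8 with small values (hypothetical N = 10, n = 3, not from the text): every n-subset of {1,…, N} is equally likely, and exhaustive enumeration gives the exact mean of the largest number drawn, which matches the standard identity EX(n) = n(N + 1)/(n + 1) for sampling without replacement:

```python
from fractions import Fraction
from itertools import combinations

# Enumerate all equally likely unordered draws of n marbles from {1,..., N}
# and compute the exact expected maximum.
N, n = 10, 3
draws = list(combinations(range(1, N + 1), n))
e_max = Fraction(sum(max(d) for d in draws), len(draws))

assert e_max == Fraction(n * (N + 1), n + 1)
print(e_max)  # 33/4, i.e., 8.25
```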