7
BASIC ASYMPTOTICS: LARGE SAMPLE THEORY

7.1 INTRODUCTION

In Chapter 6 we described some methods of finding exact distributions of sample statistics and their moments. While these methods work in some cases, such as sampling from a normal population when the sample statistic of interest is X̄ or S2, often the statistic of interest, say Tn, is either too complicated or its exact distribution is not simple to work with. In such cases we are interested in the convergence properties of Tn: we want to know what happens when the sample size is large. What is the limiting distribution of Tn? When the exact distribution of Tn (and its moments) is unknown or too complicated, we will often use asymptotic approximations for large n.

In this chapter, we discuss some basic elements of statistical asymptotics. In Section 7.2 we discuss various modes of convergence of a sequence of random variables. In Sections 7.3 and 7.4 the laws of large numbers are discussed. Section 7.5 deals with limiting moment generating functions, and in Section 7.6 we discuss one of the most fundamental theorems of classical statistics, the central limit theorem. In Section 7.7 we consider some statistical applications of these methods.

The reader may find some parts of this chapter a bit difficult on first reading. Such a discussion has been so indicated.

7.2 MODES OF CONVERGENCE

In this section we consider several modes of convergence and investigate their interrelationships. We begin with the weakest mode of convergence.

It must be remembered that it is quite possible for a given sequence of DFs to converge to a function that is not a DF.

We next give an example to show that weak convergence of distribution functions does not imply the convergence of the corresponding PMFs or PDFs.

The following result is easy to prove.

In the continuous case we state the following result of Scheffé [100] without proof.

The following result is easy to establish.

A slightly stronger concept of convergence is defined by convergence in probability.

Remark 1. We emphasize that the definition says nothing about the convergence of the RVs Xn to the RV X in the sense in which it is understood in real analysis. Thus images does not imply that, given images, we can find an N such that images. Definition 2 speaks only of the convergence of the sequence of probabilities images to 0.
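Convergence in probability can be checked numerically. The sketch below is an illustration only; the choice Xn = X + Zn/n, with X and Zn independent N(0, 1), is ours, not the text's. It estimates P{|Xn − X| > ε} by simulation and shows the probability decreasing to 0 as n grows, which is precisely what Definition 2 asserts.

```python
import random

random.seed(0)

def prob_exceeds(n, eps=0.1, trials=20000):
    """Estimate P{|X_n - X| > eps} where X_n = X + Z_n/n,
    with X and Z_n independent N(0, 1)."""
    count = 0
    for _ in range(trials):
        x = random.gauss(0.0, 1.0)
        z = random.gauss(0.0, 1.0)
        if abs((x + z / n) - x) > eps:
            count += 1
    return count / trials

# The estimated probabilities shrink toward 0 as n grows,
# which is what X_n -> X in probability asserts.
p1, p10, p100 = prob_exceeds(1), prob_exceeds(10), prob_exceeds(100)
```

Note that nothing is claimed about any individual sample path; only the sequence of probabilities goes to 0.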

The following statements can be verified.

  1. images.
  2. images, for images, and it follows that images for every images.
  3. images as images, for
    images
  4. images.
  5. images.
  6. images.
  7. images, for
    images
  8. images, for
    images
    and each of the three terms on the right goes to 0 as images.
  9. images.
  10. images and Y an images.

    Note that Y is an RV so that, given images, there exists a images such that images. Thus

    images
  11. images, for
    images
    The result now follows on multiplication, using result 10. It also follows that images.

We remark that a more general result than Theorem 4 is true; we state it without proof (see Rao [88, p. 124]): images and g continuous on images.
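Rao's continuous mapping result can also be observed in simulation. The choices below (Xn = 2 + Zn/n, which converges in probability to 2, and g = exp) are our own illustrative ones: the probability that g(Xn) lands far from g(2) shrinks as n grows.

```python
import math
import random

random.seed(1)

def frac_far(n, eps=0.05, trials=20000):
    """Estimate P{|g(X_n) - g(2)| > eps} for X_n = 2 + Z/n and g = exp."""
    target = math.exp(2.0)
    count = 0
    for _ in range(trials):
        xn = 2.0 + random.gauss(0.0, 1.0) / n
        if abs(math.exp(xn) - target) > eps:
            count += 1
    return count / trials

# X_n -> 2 in probability, so exp(X_n) -> exp(2) in probability.
far_small_n, far_large_n = frac_far(5), frac_far(500)
```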

The following two theorems explain the relationship between weak convergence and convergence in probability.

Remark 2. We emphasize that we cannot improve the above result by replacing k by an RV; that is, images in general does not imply images. For let X, X1, X2,… be identically distributed RVs, and let the joint distribution of (Xn, X) be as follows:

[table of the joint distribution of (Xn, X) omitted]

Clearly, images. But

images

Hence, images, but images.
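The counterexample underlying Remark 2 is easy to verify numerically with a concrete choice (ours, for illustration): take X ~ N(0, 1) and Xn = −X for every n. Each Xn then has the same distribution as X, so Xn converges to X in law trivially, yet |Xn − X| = 2|X| never shrinks.

```python
import random

random.seed(2)
trials = 20000
xs = [random.gauss(0.0, 1.0) for _ in range(trials)]

# X_n = -X has the same N(0, 1) law as X for every n, so the DFs of
# X_n agree with (and hence trivially converge to) the DF of X ...
frac_xn_below_1 = sum(1 for x in xs if -x <= 1.0) / trials
frac_x_below_1 = sum(1 for x in xs if x <= 1.0) / trials

# ... but |X_n - X| = 2|X| does not shrink with n, so there is no
# convergence in probability.
frac_far = sum(1 for x in xs if abs(-x - x) > 0.5) / trials
```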

Remark 3. Example 3 shows that images does not imply images for any images, k integral.

We get, in addition, that images implies images.

Proof. The proof is left to the reader.

As a simple consequence of Theorem 8 and its corollary we see that images together imply images.

Remark 4. Clearly the converse to Theorem 10 cannot hold, since images does not imply images.

Remark 5. In view of Theorem 9, it follows that images for images.

The following result elucidates Definition 4.

Remark 6. Thus images means that, for images arbitrary, we can find an n0 such that

(5)images

Indeed, we can write, equivalently, that

That the converse of Theorem 12 does not hold is shown in the following example.

Remark 7. In Theorem 7.4.3 we prove a result which is sometimes useful in proving a.s. convergence of a sequence of RVs.

Since images is arbitrary and x is a continuity point of images, we get the result by letting images.

Later on we will see that the condition that the Xi’s be images(0, 1) is not needed. All we need is that images.

PROBLEMS 7.2

  1. Let X1, X2,… be a sequence of RVs with corresponding DFs given by images if images. Does Fn converge to a DF?
  2. Let X1, X2,… be iid images(0, 1) RVs. Consider the sequence of RVs images, where images. Let Fn be the DF of images. Find images. Is this limit a DF?
  3. Let X1, X2,… be iid U(0, θ) RVs. Let images, and consider the sequence images. Does Yn converge in distribution to some RV Y? If so, find the DF of RV Y.
  4. Let X1, X2,… be iid RVs with common absolutely continuous DF F. Let images, and consider the sequence of RVs images. Find the limiting DF of Yn.
  5. Let X1, X2,… be a sequence of iid RVs with common PDF images. Write images.
    1. Show that images.
    2. Show that images.
  6. Let X1, X2,… be iid U[0, θ] RVs. Show that images.
  7. Let {Xn} be a sequence of RVs such that images. Let an be a sequence of positive constants such that images. Show that images.
  8. Let {Xn} be a sequence of RVs such that images for all n and some constant images. Suppose that images. Show that images for any images.
  9. Let X1, X2,…, X2n be iid images(0, 1) RVs. Define
    images

    Find the limiting distribution of Zn.

  10. Let {Xn} be a sequence of geometric RVs with parameter images. Also, let images. Show that images as images.

    (Prochaska [82]).

  11. Let Xn be a sequence of RVs such that images, and let cn be a sequence of real numbers such that images as images. Show that images.
  12. Does convergence almost surely imply convergence of moments?
  13. Let X1, X2,… be a sequence of iid RVs with common DF F, and write images.
    1. For images, images. Find the limiting distribution of images. Also, find the PDF corresponding to the limiting DF and compute its moments.
    2. If F satisfies
      images
      find the limiting DF of images and compute the corresponding PDF and the MGF.
    3. If Xi is bounded above by x0 with probability 1, and for some images
      images
      find the limiting distribution of images, the corresponding PDF, and the moments of the limiting distribution.

    (The above remarkable result, due to Gnedenko [36], exhausts all limiting distributions of X(n) with suitable norming and centering.)

  14. Let {Fn} be a sequence of DFs that converges weakly to a DF F which is continuous everywhere. Show that Fn(x) converges to F(x) uniformly.
  15. Prove Theorem 1.
  16. Prove Theorem 6.
  17. Prove Theorem 13.
  18. Prove Corollary 1 to Theorem 8.
  19. Let V be the class of all random variables defined on a probability space with finite expectations, and for images define
    images

    Show the following:

    1. images.
    2. images is a distance function on V (assuming that we identify RVs that are a.s. equal).
    3. images.
  20. For the following sequences of RVs {Xn}, investigate convergence in probability and convergence in rth mean.
    1. images.
    2. images.

7.3 WEAK LAW OF LARGE NUMBERS

Let {Xn} be a sequence of RVs. Write images. In this section we answer the following question in the affirmative: Do there exist sequences of constants An and images such that the sequence of RVs images converges in probability to 0 as images?

Remark 1. Since condition (1) applies not to the individual variables but to their sum, Theorem 2 is of limited use. We note, however, that all weak laws of large numbers obtained as corollaries to Theorem 1 follow easily from Theorem 2 (Problem 6).

Let X1, X2,… be an arbitrary sequence of RVs, and let images. Let us truncate each Xi at images, that is, let

images

Write

images

Inequality (6) yields the following important theorem.

We emphasize that in Theorem 3 we require only that images; nothing is said about the variance. Theorem 3 is due to Khintchine.
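Khintchine's weak law is easy to watch in a simulation. The sketch below uses our own illustrative choice of iid exponential RVs with mean 1 (which certainly have a finite mean, so the theorem applies) and estimates P{|X̄n − 1| > ε} for a small and a large n.

```python
import random

random.seed(3)

def mean_dev_prob(n, eps=0.1, trials=2000):
    """Estimate P{|X_bar_n - 1| > eps} for X_bar_n the mean of n
    iid exponential RVs with mean 1 (finite mean, so the WLLN applies)."""
    count = 0
    for _ in range(trials):
        xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
        if abs(xbar - 1.0) > eps:
            count += 1
    return count / trials

p_n10, p_n1000 = mean_dev_prob(10), mean_dev_prob(1000)
```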

PROBLEMS 7.3

  1. Let X1, X2,… be a sequence of iid RVs with common uniform distribution on [0, 1]. Also, let images be the geometric mean of X1, X2,…,Xn, images. Show that images, where c is some constant. Find c.
  2. Let X1, X2,… be iid RVs with finite second moment. Let
    images

    Show that images.

  3. Let X1, X2,… be a sequence of iid RVs with images and images. Let images. Does the sequence Sk obey the WLLN in the sense of Definition 1? If so, find the centering and the norming constants.

    Yes; images

  4. Let {Xn} be a sequence of RVs for which images for all n and images. Show that the WLLN holds.
  5. For the following sequences of independent RVs does the WLLN hold?
    1. images.
    2. images.
    3. images.
    4. images.
    5. images.
  6. Let X1, X2,… be a sequence of independent RVs such that images for images, and images as images. Prove the WLLN, using Theorem 2.
  7. Let Xn be a sequence of RVs with common finite variance σ2. Suppose that the correlation coefficient between Xi and Xj is < 0 for all images. Show that the WLLN holds for the sequence {Xn}.
  8. Let {Xn} be a sequence of RVs such that Xk is independent of Xj for images or images. If images for all k, where C is some constant, the WLLN holds for {Xk}.
  9. For any sequence of RVs {Xn} show that
    images
  10. Let X1, X2,… be iid images(1, 0) RVs. Use Theorem 2 to show that the weak law of large numbers does not hold. That is, show that
    images
  11. Let {Xn} be a sequence of iid RVs with images. Let images. Suppose {an} is a sequence of constants such that images. Show that
    1. images as images and (b) images.

7.4 STRONG LAW OF LARGE NUMBERS

In this section we obtain a stronger form of the law of large numbers discussed in Section 7.3. Let X1, X2,… be a sequence of RVs defined on some probability space (Ω, images, P).

We will obtain sufficient conditions for a sequence {Xn} to obey the SLLN. In what follows, we will be interested mainly in the case images. Indeed, when we speak of the SLLN we will assume that we are speaking of the norming constants images, unless specified otherwise.

We start with the Borel-Cantelli lemma. Let {Aj} be any sequence of events in images. We recall that

(2)images

We will write images. Note that A is the event that infinitely many of the An occur. We will sometimes write

images

where “i.o.” stands for “infinitely often.” In view of Theorem 7.2.11 and Remark 7.2.6 we have images if and only if images for all images.

We next prove some important lemmas that we will need subsequently.

As a corollary we get a version of the SLLN for nonidentically distributed RVs which subsumes Theorem 2.

Remark 1. Kolmogorov’s SLLN is much stronger than Corollaries 1 and 4 to Theorem 4. It states that if {Xn} is a sequence of iid RVs then

images

and then images. The proof requires more work and will not be given here. We refer the reader to Billingsley [6], Chung [15], Feller [26], or Laha and Rohatgi [58].
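Kolmogorov's SLLN can be observed along a single sample path: the running means of one long iid sequence settle down near the population mean. The U(0, 1) sequence below is our illustrative choice; almost sure convergence says this behavior occurs for almost every path, not just on average.

```python
import random

random.seed(4)

# One long sample path of iid U(0, 1) RVs (mean 1/2); record the
# running mean every 1000 steps.
N = 200000
total = 0.0
running_means = []
for i in range(1, N + 1):
    total += random.random()
    if i % 1000 == 0:
        running_means.append(total / i)

# Beyond n = 51,000 the running means on this path stay close to 1/2,
# as almost sure convergence predicts for almost every path.
max_late_dev = max(abs(m - 0.5) for m in running_means[50:])
```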

PROBLEMS 7.4

  1. For the following sequences of independent RVs does the SLLN hold?
    1. images.
    2. images.
    3. images.
  2. Let X1, X2,… be a sequence of independent RVs with images. Show that
    images

    Does the converse also hold?

  3. For what values of α does the SLLN hold for the sequence
    images
  4. Let images be a sequence of real numbers such that images. Show that there exists a sequence of independent RVs {Xk} with images, such that images does not converge to 0 almost surely.

    [Hint: Let images, and images. Apply the Borel-Cantelli lemma to images.]

  5. Let Xn be a sequence of iid RVs with images. Show that, for every positive number images and images.
  6. Construct an example to show that the converse of Theorem 1(a) does not hold.
  7. Investigate a.s. convergence of {Xn} to 0 in each case.
    1. images.
    2. images.

    (Xn’s are independent in each case.)

7.5 LIMITING MOMENT GENERATING FUNCTIONS

Let X1,X2,… be a sequence of RVs. Let Fn be the DF of images, and suppose that the MGF Mn(t) of Fn exists. What happens to Mn(t) as images? If it converges, does it always converge to an MGF?

Next suppose that Xn has MGF Mn and images, where X is an RV with MGF M. Does images? The answer to this question is in the negative.

The following result is a weaker version of the continuity theorem due to Lévy and Cramér. We refer the reader to Lukacs [69, p. 47], or Curtiss [19], for details of the proof.

Remark 1. The following notation on orders of magnitude is quite useful. We write xn = o(rn) if, given ε > 0, there exists an N such that |xn/rn| < ε for all n ≥ N, and xn = O(rn) if there exist an N and a constant K > 0 such that |xn/rn| ≤ K for all n ≥ N. We write xn = O(1) to express the fact that xn is bounded for large n, and xn = o(1) to mean that xn → 0 as n → ∞.

This notation is extended to RVs in an obvious manner. Thus Xn = op(rn) if, for every ε > 0 and δ > 0, there exists an N such that P{|Xn/rn| > ε} < δ for n ≥ N, and Xn = Op(rn) if, for every δ > 0, there exist a K > 0 and an N such that P{|Xn/rn| > K} < δ for n ≥ N. We write Xn = op(1) to mean images. This notation can be easily extended to the case where rn itself is an RV.

The following lemma is quite useful in applications of Theorem 1.

For more examples see Section 7.6.

Remark 2. As pointed out earlier working with MGFs has the disadvantage that the existence of MGFs is a very strong condition. Working with CFs which always exist, on the other hand, permits a much wider application of the continuity theorem. Let ϕn be the CF of Fn. Then images if and only if images as images on images, where ϕ is continuous at images. In this case ϕ, the limit function, is the CF of the limit DF F.
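The continuity theorem can be checked numerically in a simple case. For Sn ~ b(n, p) the MGF of the standardized sum (Sn − np)/√(np(1 − p)) has a closed form, and it converges pointwise to exp(t2/2), the MGF of N(0, 1). The choice p = 1/2 below is our illustrative one.

```python
import math

def mgf_standardized_binomial(t, n, p=0.5):
    """MGF of (S_n - np)/sqrt(np(1-p)) for S_n ~ b(n, p):
    exp(-t*np/s) * (1 - p + p*exp(t/s))**n with s = sqrt(np(1-p))."""
    s = math.sqrt(n * p * (1.0 - p))
    return math.exp(-t * n * p / s) * (1.0 - p + p * math.exp(t / s)) ** n

t = 1.0
limit = math.exp(t * t / 2.0)            # MGF of N(0, 1) at t
m_small = mgf_standardized_binomial(t, 10)
m_large = mgf_standardized_binomial(t, 100000)
```

For p = 1/2 the error at fixed t is of order 1/n, so m_large is already very close to exp(1/2).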

PROBLEMS 7.5

  1. Let images. Show that
    images

    Here images.

  2. Let images. Show that images as images, in such a way that images, where images.
  3. Let X1, X2… be independent RVs with PMF given by images. Let images. Show that images, where images.
  4. Let {Xn} be a sequence of RVs with images where images is a constant (independent of n). Find the limiting distribution of Xn/n.
  5. Let images. Find the limiting distribution of Xn/n2.
  6. Let X1, X2,…, Xn be jointly normal with images for all i and images. What is the limiting distribution of n⁻¹Sn, where images?

7.6 CENTRAL LIMIT THEOREM

Let X1, X2,… be a sequence of RVs, and let images. In Sections 7.3 and 7.4 we investigated the convergence of the sequence of RVs images to the degenerate RV. In this section we examine the convergence of images to a nondegenerate RV. Suppose that, for a suitable choice of constants An and images, the RVs images. What are the properties of this limit RV Y? The question as posed is far too general and is not of much interest unless the RVs Xi are suitably restricted. For example, if we take X1 with DF F and X2, X3,… to be 0 with probability 1, choosing images and images leads to F as the limit DF.

We recall (Example 7.5.6) that, if X1, X2,…, Xn are iid RVs with common law images(1, 0), then images is also images(1,0). Again, if X1, X2,…, Xn are iid images(0, 1) RVs then images is also images(0, 1) (Corollary 2 to Theorem 5.3.22). We note thus that for certain sequences of RVs there exist sequences An and images, such that images. In the Cauchy case images, images, and in the normal case images. Moreover, we see that Cauchy and normal distributions appear as limiting distributions—in these two cases, because of the reproductive nature of the distributions. Cauchy and normal distributions are examples of stable distributions.

Let X1, X2,… be iid RVs with common DF F. We remark without proof (see Loève [66, p. 339]) that only stable distributions occur as limits. To make this statement more precise we make the following definition.

In view of the statement after Definition 1, we see that only stable distributions possess domains of attraction. From Definition 1 we also note that each stable law belongs to its own domain of attraction. The study of stable distributions is beyond the scope of this book. We shall restrict ourselves to seeking conditions under which the limit law V is the normal distribution. The importance of the normal distribution in statistics is due largely to the fact that a wide class of distributions F belongs to the domain of attraction of the normal law. Let us consider some examples.

These examples suggest that if we take iid RVs with finite variance and take images, then images, where Z is images(0, 1). This is the central limit result, which we now prove. The reader should note that in both Examples 1 and 2 we used more than just the existence of E|X|2. Indeed, the MGF exists and hence moments of all orders exist. The existence of the MGF is not a necessary condition.

Remark 1. In the proof above we could have used the Taylor series expansion of M to arrive at the same result.

Remark 2. Even though we proved Theorem 1 for the case when the MGF of Xn’s exists, we will use the result whenever images. The use of CFs would have provided a complete proof of Theorem 1. Let ϕ be the CF of Xn. Assuming again, without loss of generality, that images, we can write

images

Thus the CF of images is

images

which converges to images, which is the CF of a images(0, 1) RV. The devil is in the details of the proof.

The following converse to Theorem 1 holds.

For nonidentically distributed RVs we state, without proof, the following result due to Lindeberg.

Feller [24] has shown that condition (2) is necessary as well in the following sense. For independent RVs {Xk} for which (3) holds and

images

(2) holds for every images.

If images, then images, say, as images. For fixed k, we can find εk such that images and then images. For images, we have

images

so that the Lindeberg condition does not hold. Indeed, if X1, X2,…. are independent RVs such that there exists a constant A with images for all n, the Lindeberg condition (2) is satisfied if images. To see this, suppose that images. Since the Xk’s are uniformly bounded, so are the RVs XkEXk. It follows that for every images we can find an Nε such that, for images. The Lindeberg condition follows immediately. The converse also holds, for, if images and the Lindeberg condition holds, there exists a constant images such that images. For any fixed j, we can find an images such that images. Then, for images,

images

and the Lindeberg condition does not hold. This contradiction shows that images is also a necessary condition. That is, for a sequence of uniformly bounded independent RVs, a necessary and sufficient condition for the central limit theorem to hold is images as images.

Remark 3. Both the central limit theorem (CLT) and the (weak) law of large numbers (WLLN) hold for a large class of sequences of RVs {Xn}. If the {Xn} are independent uniformly bounded RVs, that is, if images, the WLLN (Theorem 7.3.1) holds; the CLT holds provided that images (Example 5).

If the RVs {Xn} are iid, then the CLT is a stronger result than the WLLN in that the former provides an estimate of the probability images. Indeed,

images

where Z is images(0, 1), and the law of large numbers follows. On the other hand, we note that the WLLN does not require the existence of a second moment.
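The CLT-based estimate of P{|X̄n − μ| > ε}, namely 2[1 − Φ(ε√n/σ)], can be compared directly with simulation. The U(0, 1) summands below are our illustrative choice (μ = 1/2, σ2 = 1/12).

```python
import math
import random

random.seed(5)

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, eps, trials = 100, 0.05, 20000
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and sd of U(0, 1)

# CLT-based estimate: P{|X_bar - mu| > eps} ~ 2[1 - Phi(eps*sqrt(n)/sigma)]
estimate = 2.0 * (1.0 - phi(eps * math.sqrt(n) / sigma))

hits = 0
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n
    if abs(xbar - mu) > eps:
        hits += 1
simulated = hits / trials
```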

Remark 4. If {Xn} are independent RVs, it is quite possible that the CLT may apply to the Xn’s, but not the WLLN.

We conclude this section with some remarks concerning the application of the CLT. Let X1, X2,… be iid RVs with common mean μ and variance σ2. Let us write

images

and let images1, images2 be two arbitrary real numbers with images. If Fn is the DF of Zn, then

images

that is,

It follows that the RV Sn is asymptotically normally distributed with mean nμ and variance nσ2. Equivalently, the RV X̄ is asymptotically N(μ, σ2/n). This result is of great importance in statistics.

In Fig. 1 we show the distribution of images in sampling from P(λ) and G(1, 1). We have also superimposed, in each case, the graph of the corresponding normal approximation.


Fig. 1 (a) Distribution of images for Poisson RV with mean 3 and normal approximation and (b) distribution of images for exponential RV with mean 1 and normal approximation.

How large should n be before we apply approximation (4)? Unfortunately the answer is not simple. Much depends on the underlying distribution, the corresponding speed of convergence, and the accuracy one desires. There is a vast amount of literature on the speed of convergence and error bounds. We will content ourselves with some examples. The reader is referred to Rohatgi [90] for a detailed discussion.

In the discrete case when the underlying distribution is integer-valued, approximation (4) is improved by applying the continuity correction. If X is integer-valued, then for integers x1, x2

images

which amounts to making the discrete space of values of X continuous by considering intervals of length 1 with midpoints at integers.

Next suppose that images. Then from binomial tables, images. Using the normal approximation without continuity correction,

images

and with continuity correction

images

The rule of thumb is to use the continuity correction, to use the normal approximation whenever images, and to use the Poisson approximation with images for images.
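The effect of the continuity correction is easy to verify numerically. The parameters below (n = 25, p = 0.4, and the probability P{X ≤ 12}) are our own illustrative choice, not the example from the text; the corrected approximation lands much closer to the exact binomial probability.

```python
import math

def binom_cdf(k, n, p):
    """Exact P{X <= k} for X ~ b(n, p)."""
    return sum(math.comb(n, j) * p**j * (1.0 - p) ** (n - j)
               for j in range(k + 1))

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p, k = 25, 0.4, 12                    # hypothetical example values
mu, sd = n * p, math.sqrt(n * p * (1.0 - p))

exact = binom_cdf(k, n, p)
without_cc = phi((k - mu) / sd)          # P{X <= k} ~ Phi((k - mu)/sd)
with_cc = phi((k + 0.5 - mu) / sd)       # continuity-corrected version
```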

PROBLEMS 7.6

  1. Let {Xn} be a sequence of independent RVs with the following distributions. In each case, does the Lindeberg condition hold?
    1. images.
    2. images.
    3. images.
    4. {Xn} is a sequence of independent Poisson RVs with parameter images, such that images.
    5. images.
  2. Let X1, X2,… be iid RVs with mean 0, variance 1, and images. Find the limiting distribution of
    images
  3. Let X1, X2,… be iid RVs with mean α and variance σ2, and let Y1,Y2,… be iid RVs with mean images and variance τ2. Find the limiting distribution of images, where images and images.
  4. Let images. Use the CLT to find n such that images. In particular, let images and images. Calculate n, satisfying images.
  5. Let X1, X2,… be a sequence of iid RVs with common mean μ and variance σ2. Also, let images and images. Show that images where images.
  6. Let X1, X2,…, X100 be iid RVs with mean 75 and variance 225. Use Chebychev’s inequality to calculate the probability that the sample mean will not differ from the population mean by more than 6. Then use the CLT to calculate the same probability and compare your results.
  7. Let X1, X2,…,X100 be iid P(λ) RVs, where images. Let images. Use the central limit result to evaluate images and compare your result to the exact probability of the event images.
  8. Let X1, X2,…,X81 be iid RVs with mean 54 and variance 225. Use Chebychev’s inequality to find the possible difference between the sample mean and the population mean with a probability of at least 0.75. Also use the CLT to do the same.

    0.0926; 1.92

  9. Use the CLT applied to a Poisson RV to show that images for images if images, and 0 if images .
  10. Let X1,X2,… be a sequence of iid RVs with mean μ and variance σ2, and assume that images. Write images. Find the centering and norming constants An and Bn such that images, where Z is images(0, 1).
  11. From an urn containing 10 identical balls numbered 0 through 9, n balls are drawn with replacement.
    1. What does the law of large numbers tell you about the appearance of 0’s in the n drawings?
    2. How many drawings must be made in order that, with probability at least 0.95, the relative frequency of the occurrence of 0’s will be between 0.09 and 0.11?
    3. Use the CLT to find the probability that among the n numbers thus chosen the number 5 will appear between images and images times (inclusive) if (i) images and (ii) images.
  12. Let X1, X2,…,Xn be iid RVs with images and images. Let images, and for any positive real number ε let images. Show that
    images

    [Hint: Use (5.3.61).]

7.7 LARGE SAMPLE THEORY

In many applications of probability one needs the distribution of a statistic or some function of it. The methods of Chapter 6, when applicable, lead to the exact distribution of the statistic under consideration. If not, it may be sufficient to approximate this distribution, provided the sample size is large enough.

Let {Xn} be a sequence of RVs which converges in law to N(μ, σ2). Then images converges in law to N(0, 1), and conversely. We will say, alternatively and equivalently, that {Xn} is asymptotically normal with mean μ and variance σ2. More generally, we say that Xn is asymptotically normal with “mean” μn and “variance” images, and write Xn is AN images, if images and, as images,

(1)images

Here μn is not necessarily the mean of Xn, and images is not necessarily its variance. In this case we can approximate, for sufficiently large n, images by images, where Z is images(0, 1).

The most common method of showing that Xn is images is the central limit theorem of Section 7.6. Thus, according to Theorem 7.6.1, images as images, where images is the sample mean of n iid RVs with mean μ and variance σ2. The same result applies to the kth sample moment, provided images. Thus

images

In many large sample approximations an application of the CLT along with Slutsky’s theorem suffices.

Often we need to approximate the distribution of g(Yn) given that Yn is AN(μ, σ2).
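The standard statement here (our paraphrase, since the theorem's display is not reproduced in this extract) is the delta method: if Yn is AN(μ, σ2/n) and g′(μ) ≠ 0, then g(Yn) is AN(g(μ), [g′(μ)]2σ2/n). A quick simulation check, with the illustrative choices Yn = X̄n for U(0, 1) samples and g(y) = y2:

```python
import random
import statistics

random.seed(6)

n, trials = 400, 5000
mu, var_x = 0.5, 1.0 / 12.0              # mean and variance of U(0, 1)

def g(y):
    return y * y                          # smooth, with g'(mu) = 2*mu = 1

gvals = []
for _ in range(trials):
    ybar = sum(random.random() for _ in range(n)) / n
    gvals.append(g(ybar))

# Delta-method prediction: var g(Y_n) ~ [g'(mu)]^2 * sigma^2 / n
delta_var = (2.0 * mu) ** 2 * var_x / n
sample_var = statistics.variance(gvals)
```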

Remark 1. Suppose g in Theorem 1 is differentiable k times, images, at images and images for images. Then a similar argument using Taylor’s theorem shows that

(4)images

where Z is a images(0, 1) RV. Thus, in Example 2, when images. It follows that

images

Since images.

Remark 2. Theorem 1 can be extended to the multivariate case but we will not pursue the development. We refer the reader to Ferguson [29] or Serfling [102].

Remark 3. In general the asymptotic variance images of g(Yn) will depend on the parameter μ. In problems of inference it will often be desirable to use a transformation g such that the approximate variance var g(Yn) is free of the parameter. Such transformations are called variance-stabilizing transformations. Let us write images. Then finding a g such that var g(Yn) is free of μ is equivalent to finding a g such that

images

for all μ, where c is a constant independent of μ. It follows that g is determined, up to an additive constant, by g(μ) = c ∫ dμ/σ(μ).
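For example, if σ(μ) ∝ √μ, as for the Poisson, the recipe gives g(μ) ∝ √μ: the square root is the classical variance-stabilizing transformation for Poisson data, with Var √X ≈ 1/4 regardless of λ. A simulation check (the sampler and the two λ values are our illustrative choices):

```python
import math
import random
import statistics

random.seed(7)

def poisson(lam):
    """Sample a Poisson(lam) variate (Knuth's product-of-uniforms
    method; adequate for moderate lam)."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

def var_sqrt(lam, trials=20000):
    """Sample variance of sqrt(X) for X ~ P(lam)."""
    vals = [math.sqrt(poisson(lam)) for _ in range(trials)]
    return statistics.variance(vals)

# Var(X) is 20 vs 80, but Var(sqrt(X)) stays near 1/4 in both cases.
v20, v80 = var_sqrt(20.0), var_sqrt(80.0)
```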

Remark 4. In Section 6.3 we computed exact moments of some statistics in terms of population parameters. Approximations for moments of images can also be obtained from series expansions of g. Suppose g is twice differentiable at images. Then

(6)images

and

(7)images

by dropping remainder terms. The case of most interest is to approximate images and images. In this case, under suitable conditions, one can show that

and

where images and images.

In Example 2, when the Xi’s are iid b(1, p) and images, so that

images

and

images

In this case we can compute images and images exactly. We have

images

so that (8) is exact. Also since images, using Theorem 6.3.4 we have

images

Thus the error in approximation (9) is

images

Remark 5. Approximations (6) through (9) do not assert the existence of images or images, or var g(X) or images.

Remark 6. It is possible to extend (6) through (9) to two (or more) variables by using Taylor series expansion in two (or more) variables.

Finally, we state the following result which gives the asymptotic distribution of the rth order statistic, images, in sampling from a population with an absolutely continuous DF F with PDF f. For a proof see Problem 4.

Remark 7. The sample quantile of order p, Zp, is

images

where images is the corresponding population quantile, and f is the PDF of the population distribution function. It also follows that images.
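The standard asymptotic variance here, p(1 − p)/[n f2(zp)] (the display in Remark 7 is not reproduced in this extract), can be checked by simulation. Below we use the sample median of Exp(1) samples, our illustrative choice, for which the population median is ln 2 and f(ln 2) = 1/2, so the asymptotic variance is 1/n.

```python
import math
import random
import statistics

random.seed(8)

n, trials = 201, 4000
m = math.log(2.0)                        # median of Exp(1)
f_m = 0.5                                # Exp(1) density at its median

medians = []
for _ in range(trials):
    sample = sorted(random.expovariate(1.0) for _ in range(n))
    medians.append(sample[n // 2])       # the 101st order statistic

# Asymptotic variance of the sample median: p(1-p)/(n f(m)^2), p = 1/2
asym_var = 0.25 / (n * f_m * f_m)
sample_var = statistics.variance(medians)
```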

PROBLEMS 7.7

  1. In sampling from a distribution with mean μ and variance σ2 find the asymptotic distribution of
    1. images,
    2. images,
    3. images,
    4. images

    both when images and when images.

  2. Let images. Then images. Find a transformation g such that images has an asymptotic images(0, c) distribution for large μ where c is a suitable constant.
  3. Let X1,X2,…,Xn be a sample from an absolutely continuous DF F with PDF f. Show that
    images

    [Hint: Let Y be an RV with mean μ and ϕ be a Borel function such that Eϕ(Y) exists. Expand ϕ(Y) about the point μ in a Taylor series, and use the fact that images.]

  4. Prove Theorem 2. [Hint: For any real μ and images compute the PDF of images and show that the standardized images is asymptotically images(0, 1) under the conditions of the theorem.]
  5. Let images. Then images is AN(0, 1) and X/n is images. Find a transformation g such that the distribution of images is AN(0, c).
  6. Suppose X is G(1, θ). Find g such that images is AN(0, c).
  7. Let X1,X2,…,Xn be iid RVs with images. Let images and images:
    1. Show, using the CLT for iid RVs, that images.
    2. Find a transformation g such that g(S2) has an asymptotic distribution which depends on β2 alone but not on σ2.
