
1 Inference for the mean vector

2 Univariate Inference
Let x₁, x₂, …, xₙ denote a sample of n from the normal distribution with mean μ and variance σ². Suppose we want to test
H₀: μ = μ₀ vs H_A: μ ≠ μ₀
The appropriate test is the t test. The test statistic is
t = (x̄ − μ₀) / (s / √n),
and we reject H₀ if |t| > t_α/2, the two-sided critical value of the t distribution with n − 1 degrees of freedom.
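As a quick numerical sketch of this test (the sample values below are hypothetical placeholders, not from the presentation), using scipy:

```python
import numpy as np
from scipy import stats

# Hypothetical univariate sample (placeholder values, not from the presentation)
x = np.array([62.0, 58.5, 61.2, 63.8, 59.9, 60.4, 64.1, 57.6])
mu0 = 60.0            # hypothesized mean under H0
alpha = 0.05

n = len(x)
t_stat = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))   # t = (xbar - mu0) / (s / sqrt(n))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)               # two-sided critical value

print(f"t = {t_stat:.3f}, reject H0: {abs(t_stat) > t_crit}")
print(stats.ttest_1samp(x, popmean=mu0))                    # cross-check with scipy
```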

3 The Multivariate Test
Let x₁, x₂, …, xₙ denote a sample of n (vector) observations from the p-variate normal distribution with mean vector μ and covariance matrix Σ. Suppose we want to test
H₀: μ = μ₀ vs H_A: μ ≠ μ₀

4 Roy's Union-Intersection Principle
This is a general procedure for developing a multivariate test from the corresponding univariate test.
1. Convert the multivariate problem to a univariate problem by considering an arbitrary linear combination a′x of the observation vector.

5 2. Perform the test for the arbitrary linear combination a′x of the observation vector.
3. Repeat this for all possible choices of a.
4. Reject the multivariate hypothesis if H₀ is rejected for any one of the choices for a.
5. Accept the multivariate hypothesis if H₀ is accepted for all of the choices for a.
6. Set the type I error rate for the individual tests so that the type I error rate for the multivariate test is α.
A numerical sketch of this search over a follows below.
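As a rough illustration of steps 2–5 (a sketch on made-up bivariate data; the names X, mu0, and t_squared are placeholders introduced here), one can scan many directions a, compute the univariate t statistic for each linear combination a′x, and keep the largest squared value; the maximum matches the closed-form expression derived on the following slides:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical bivariate sample and hypothesized mean vector (placeholders)
X = rng.normal(size=(25, 2)) + np.array([0.4, -0.2])
mu0 = np.zeros(2)
n = X.shape[0]

def t_squared(a):
    """Squared univariate t statistic for the linear combination u_i = a'x_i."""
    u = X @ a
    return n * (u.mean() - a @ mu0) ** 2 / u.var(ddof=1)

# Scan many directions a (only the direction of a matters, not its length)
angles = np.linspace(0, np.pi, 2000, endpoint=False)
best = max(t_squared(np.array([np.cos(th), np.sin(th)])) for th in angles)

# Closed-form maximum derived on the next slides: n (xbar - mu0)' S^{-1} (xbar - mu0)
xbar = X.mean(axis=0)
S = np.cov(X, rowvar=False)
T2 = n * (xbar - mu0) @ np.linalg.solve(S, xbar - mu0)
print(best, T2)   # the scanned maximum approaches the closed-form value
```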

6 Application of Roy's principle to the following situation
Let x₁, x₂, …, xₙ denote a sample of n from the p-variate normal distribution with mean vector μ and covariance matrix Σ. Suppose we want to test H₀: μ = μ₀ vs H_A: μ ≠ μ₀.
For an arbitrary vector a, let uᵢ = a′xᵢ. Then u₁, …, uₙ is a sample of n from the normal distribution with mean a′μ and variance a′Σa.

7 To test H₀: a′μ = a′μ₀ vs H_A: a′μ ≠ a′μ₀ we would use the test statistic:
t(a) = (ū − a′μ₀) / (s_u / √n)

8 where ū = (1/n) Σᵢ uᵢ = a′x̄ and s_u² = [1/(n − 1)] Σᵢ (uᵢ − ū)² = a′Sa.

9 Thus t(a) = √n · a′(x̄ − μ₀) / √(a′Sa). We will reject H₀: a′μ = a′μ₀ if |t(a)| > t_α/2.

10 Using Roy's Union-Intersection principle: we reject H₀: μ = μ₀ if H₀: a′μ = a′μ₀ is rejected for at least one choice of a, and we accept H₀: μ = μ₀ if H₀: a′μ = a′μ₀ is accepted for every choice of a.

11 That is, we reject H₀ if the maximum of t²(a) over a exceeds the critical value, and we accept H₀ if it does not.

12 Consider the problem of finding the maximum over a of t²(a), where t²(a) = n [a′(x̄ − μ₀)]² / (a′Sa).

13 The maximum over a is attained at a ∝ S⁻¹(x̄ − μ₀); thus max over a of t²(a) = n (x̄ − μ₀)′ S⁻¹ (x̄ − μ₀).

14 Thus Roy's Union-Intersection principle states: we reject H₀ if T² = n (x̄ − μ₀)′ S⁻¹ (x̄ − μ₀) is larger than a critical value, and we accept H₀ otherwise. The quantity T² = n (x̄ − μ₀)′ S⁻¹ (x̄ − μ₀) is called Hotelling's T² statistic.
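A minimal sketch of computing this statistic for an n × p data array (the names hotelling_T2, X, and mu0 are placeholders introduced here):

```python
import numpy as np

def hotelling_T2(X, mu0):
    """T^2 = n (xbar - mu0)' S^{-1} (xbar - mu0) for an n x p data array X."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)              # sample covariance, divisor n - 1
    d = xbar - np.asarray(mu0, dtype=float)
    return n * d @ np.linalg.solve(S, d)     # solve instead of forming S^{-1}
```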

15 To choose the critical value for Hotelling's T² statistic, we need to find the sampling distribution of T² when H₀ is true. It turns out that if H₀ is true, then
F = [(n − p) / ((n − 1) p)] T²
has an F distribution with ν₁ = p and ν₂ = n − p.

16 Thus Hotelling's T² test: we reject H₀ if
[(n − p) / ((n − 1) p)] T² > F_α(p, n − p),
or equivalently if T² > [(n − 1) p / (n − p)] F_α(p, n − p).
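Continuing the sketch above, the decision rule can be implemented with the F distribution from scipy (the function name and arguments are placeholders):

```python
import numpy as np
from scipy import stats

def hotelling_T2_test(X, mu0, alpha=0.05):
    """One-sample Hotelling T^2 test: statistic, p-value and accept/reject decision."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    d = X.mean(axis=0) - np.asarray(mu0, dtype=float)
    S = np.cov(X, rowvar=False)
    T2 = n * d @ np.linalg.solve(S, d)
    F = (n - p) / ((n - 1) * p) * T2         # F ~ F(p, n - p) when H0 is true
    p_value = stats.f.sf(F, p, n - p)
    return T2, p_value, p_value < alpha      # True in the last slot means "reject H0"
```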

17 Another derivation of Hotelling's T² statistic
Another method of developing statistical tests is the likelihood ratio method. Suppose that the data vector x has joint density f(x; θ). Suppose that the parameter vector θ belongs to the set Ω. Let ω denote a subset of Ω. Finally, we want to test
H₀: θ ∈ ω vs H_A: θ ∉ ω

18 The likelihood ratio test rejects H₀ if
λ = [max over θ ∈ ω of L(θ)] / [max over θ ∈ Ω of L(θ)] < λ_α,
that is, if the likelihood ratio is sufficiently small.

19 The situation: let x₁, x₂, …, xₙ denote a sample of n from the p-variate normal distribution with mean vector μ and covariance matrix Σ. Suppose we want to test H₀: μ = μ₀ vs H_A: μ ≠ μ₀.

20 The likelihood function is:
L(μ, Σ) = ∏ᵢ (2π)^(−p/2) |Σ|^(−1/2) exp[−½ (xᵢ − μ)′ Σ⁻¹ (xᵢ − μ)]
and the log-likelihood function is:
l(μ, Σ) = −(np/2) ln(2π) − (n/2) ln|Σ| − ½ Σᵢ (xᵢ − μ)′ Σ⁻¹ (xᵢ − μ)

21 The maximum likelihood estimators of μ and Σ are
μ̂ = x̄ and Σ̂ = (1/n) Σᵢ (xᵢ − x̄)(xᵢ − x̄)′
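A small sketch of these estimators (note the divisor n for the ML covariance estimate, versus n − 1 in np.cov; the function name is a placeholder):

```python
import numpy as np

def mle_mean_cov(X):
    """Maximum likelihood estimators of mu and Sigma for an n x p normal sample."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    mu_hat = X.mean(axis=0)
    centered = X - mu_hat
    sigma_hat = centered.T @ centered / n    # divisor n, unlike np.cov's n - 1
    return mu_hat, sigma_hat
```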

22 The maximum likelihood estimators of μ and Σ when H₀ is true are:
μ̂₀ = μ₀ and Σ̂₀ = (1/n) Σᵢ (xᵢ − μ₀)(xᵢ − μ₀)′

23–30 Evaluating the likelihood at these estimators:
under the full model, max L = L(x̄, Σ̂) = (2π)^(−np/2) |Σ̂|^(−n/2) e^(−np/2),
and similarly, under H₀, max L = L(μ₀, Σ̂₀) = (2π)^(−np/2) |Σ̂₀|^(−n/2) e^(−np/2).
Thus the likelihood ratio is
λ = (|Σ̂| / |Σ̂₀|)^(n/2).
Note: Σᵢ (xᵢ − μ₀)(xᵢ − μ₀)′ = Σᵢ (xᵢ − x̄)(xᵢ − x̄)′ + n (x̄ − μ₀)(x̄ − μ₀)′, so that
Σ̂₀ = Σ̂ + (x̄ − μ₀)(x̄ − μ₀)′.
Also |Σ̂₀| = |Σ̂| [1 + (x̄ − μ₀)′ Σ̂⁻¹ (x̄ − μ₀)] = |Σ̂| [1 + T²/(n − 1)],
using Σ̂ = [(n − 1)/n] S and T² = n (x̄ − μ₀)′ S⁻¹ (x̄ − μ₀).

31 Then λ^(2/n) = |Σ̂| / |Σ̂₀| = 1 / [1 + T²/(n − 1)]. Thus the likelihood ratio test rejects H₀ if λ < λ_α, which (since λ^(2/n) is a decreasing function of T²) is the same as Hotelling's T² test if the critical value is chosen so that we reject when T² > (n − 1)(λ_α^(−2/n) − 1).
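A quick numerical check of the identity λ^(2/n) = 1/[1 + T²/(n − 1)] (a sketch on simulated data; every name here is a placeholder):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(15, 3)) + 0.3               # simulated sample, n = 15, p = 3
mu0 = np.zeros(3)
n = X.shape[0]

xbar = X.mean(axis=0)
Sigma_hat = (X - xbar).T @ (X - xbar) / n        # MLE of Sigma under the full model
Sigma_hat0 = (X - mu0).T @ (X - mu0) / n         # MLE of Sigma when H0: mu = mu0 is true
S = (X - xbar).T @ (X - xbar) / (n - 1)          # usual sample covariance

lam_2_over_n = np.linalg.det(Sigma_hat) / np.linalg.det(Sigma_hat0)
T2 = n * (xbar - mu0) @ np.linalg.solve(S, xbar - mu0)
print(lam_2_over_n, 1 / (1 + T2 / (n - 1)))      # the two numbers agree
```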

32 Example
For n = 10 students we measure scores on
– Math proficiency test (x₁),
– Science proficiency test (x₂),
– English proficiency test (x₃), and
– French proficiency test (x₄).
The average score for each of the tests in previous years was 60. Has this changed?

33 The data

34 Summary Statistics
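The score table and summary statistics appear only as images in the original slides, so they are not reproduced here. The sketch below shows how the test of slide 32 would be run on such a 10 × 4 score matrix; the numbers in scores are hypothetical placeholders, not the slides' data:

```python
import numpy as np
from scipy import stats

# Placeholder 10 x 4 score matrix (Math, Science, English, French); not the slides' data
scores = np.array([
    [65, 62, 58, 61], [59, 64, 63, 57], [72, 68, 60, 66], [55, 59, 62, 58],
    [63, 61, 59, 64], [68, 70, 65, 62], [60, 58, 61, 59], [64, 66, 57, 63],
    [58, 60, 64, 60], [66, 63, 62, 65],
])
mu0 = np.full(4, 60.0)                 # "the average score for each test was 60"
n, p = scores.shape

d = scores.mean(axis=0) - mu0
S = np.cov(scores, rowvar=False)
T2 = n * d @ np.linalg.solve(S, d)
F = (n - p) / ((n - 1) * p) * T2
print("T2 =", T2, " p-value =", stats.f.sf(F, p, n - p))
```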

35 Simultaneous Inference for Means
Recall (using Roy's Union-Intersection principle):
T² = max over a of t²(a) = n (x̄ − μ₀)′ S⁻¹ (x̄ − μ₀), and we reject H₀ if T² > T²_α = [(n − 1) p / (n − p)] F_α(p, n − p).

36 Now, with probability 1 − α,
n [a′(x̄ − μ)]² / (a′Sa) ≤ T²_α simultaneously for every vector a.

37 Thus |a′x̄ − a′μ| ≤ √(T²_α · a′Sa / n) for every a, and the set of intervals
a′x̄ ± √(T²_α · a′Sa / n)
forms a set of (1 − α)100% simultaneous confidence intervals for a′μ.

38 Recall that choosing a to be the j-th unit vector gives a′x̄ = x̄ⱼ, a′μ = μⱼ, and a′Sa = sⱼⱼ. Thus the set of (1 − α)100% simultaneous confidence intervals for the individual means μⱼ is
x̄ⱼ ± √([(n − 1) p / (n − p)] F_α(p, n − p) · sⱼⱼ / n).
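A sketch of these component-wise simultaneous intervals (simultaneous_ci and X are placeholder names):

```python
import numpy as np
from scipy import stats

def simultaneous_ci(X, alpha=0.05):
    """T^2-based simultaneous (1 - alpha)100% intervals for each component mean."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    c2 = (n - 1) * p / (n - p) * stats.f.ppf(1 - alpha, p, n - p)   # T^2 critical value
    half_width = np.sqrt(c2 * np.diag(S) / n)
    return np.column_stack([xbar - half_width, xbar + half_width])  # p rows of (lo, hi)
```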

39 The two-sample problem

40 Univariate Inference
Let x₁, x₂, …, xₙ denote a sample of n from the normal distribution with mean μ_x and variance σ². Let y₁, y₂, …, yₘ denote a sample of m from the normal distribution with mean μ_y and variance σ². Suppose we want to test
H₀: μ_x = μ_y vs H_A: μ_x ≠ μ_y

41 The appropriate test is the t test. The test statistic is
t = (x̄ − ȳ) / [s_p √(1/n + 1/m)], where s_p² = [(n − 1)s_x² + (m − 1)s_y²] / (n + m − 2).
Reject H₀ if |t| > t_α/2, d.f. = n + m − 2.

42 The Multivariate Test
Let x₁, x₂, …, xₙ denote a sample of n from the p-variate normal distribution with mean vector μ_x and covariance matrix Σ. Let y₁, y₂, …, yₘ denote a sample of m from the p-variate normal distribution with mean vector μ_y and covariance matrix Σ. Suppose we want to test
H₀: μ_x = μ_y vs H_A: μ_x ≠ μ_y

43 Hotelling's T² statistic for the two-sample problem:
T² = [nm / (n + m)] (x̄ − ȳ)′ S_pooled⁻¹ (x̄ − ȳ), where S_pooled = [(n − 1)S_x + (m − 1)S_y] / (n + m − 2).
If H₀ is true, then F = [(n + m − p − 1) / ((n + m − 2) p)] T² has an F distribution with ν₁ = p and ν₂ = n + m − p − 1.

44 Thus Hotelling's T² test: we reject H₀ if
[(n + m − p − 1) / ((n + m − 2) p)] T² > F_α(p, n + m − p − 1).
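A sketch of this two-sample test (two_sample_T2_test, X, and Y are placeholder names; the pooled covariance matches the definition above):

```python
import numpy as np
from scipy import stats

def two_sample_T2_test(X, Y, alpha=0.05):
    """Two-sample Hotelling T^2 test assuming a common covariance matrix."""
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
    n, p = X.shape
    m = Y.shape[0]
    d = X.mean(axis=0) - Y.mean(axis=0)
    S_pooled = ((n - 1) * np.cov(X, rowvar=False) +
                (m - 1) * np.cov(Y, rowvar=False)) / (n + m - 2)
    T2 = (n * m) / (n + m) * d @ np.linalg.solve(S_pooled, d)
    F = (n + m - p - 1) / ((n + m - 2) * p) * T2    # F ~ F(p, n + m - p - 1) under H0
    p_value = stats.f.sf(F, p, n + m - p - 1)
    return T2, p_value, p_value < alpha
```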

45 Simultaneous inference for the two-sample problem
Hotelling's two-sample T² statistic can also be derived using Roy's Union-Intersection principle:
T² = max over a of [a′(x̄ − ȳ)]² / [(1/n + 1/m) a′ S_pooled a].

46–49 Thus, with probability 1 − α, for every vector a,
[a′(x̄ − ȳ) − a′(μ_x − μ_y)]² ≤ T²_α (1/n + 1/m) a′ S_pooled a,
where T²_α = [(n + m − 2) p / (n + m − p − 1)] F_α(p, n + m − p − 1). Hence the intervals
a′(x̄ − ȳ) ± √[T²_α (1/n + 1/m) a′ S_pooled a]
form a set of (1 − α)100% simultaneous confidence intervals for a′(μ_x − μ_y); taking a to be the j-th unit vector gives intervals for the individual differences μ_x,j − μ_y,j.
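A sketch of the component-wise version of these intervals, for each difference μ_x,j − μ_y,j (function and array names are placeholders):

```python
import numpy as np
from scipy import stats

def two_sample_simultaneous_ci(X, Y, alpha=0.05):
    """T^2-based simultaneous intervals for each component of mu_x - mu_y."""
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
    n, p = X.shape
    m = Y.shape[0]
    diff = X.mean(axis=0) - Y.mean(axis=0)
    S_pooled = ((n - 1) * np.cov(X, rowvar=False) +
                (m - 1) * np.cov(Y, rowvar=False)) / (n + m - 2)
    c2 = (n + m - 2) * p / (n + m - p - 1) * stats.f.ppf(1 - alpha, p, n + m - p - 1)
    half_width = np.sqrt(c2 * (1 / n + 1 / m) * np.diag(S_pooled))
    return np.column_stack([diff - half_width, diff + half_width])
```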

50 Example
Annual financial data are collected for firms approximately 2 years prior to bankruptcy and for financially sound firms at about the same point in time. The data on the four variables
x₁ = CF/TD = (cash flow)/(total debt),
x₂ = NI/TA = (net income)/(total assets),
x₃ = CA/CL = (current assets)/(current liabilities), and
x₄ = CA/NS = (current assets)/(net sales)
are given in the following table.

51 The data are given in the following table:

52 Hotelling's T² test: a graphical explanation

53 Hotelling's T² statistic for the two-sample problem,
T² = [nm / (n + m)] (x̄ − ȳ)′ S_pooled⁻¹ (x̄ − ȳ),

54 is the test statistic for testing H₀: μ_x = μ_y vs H_A: μ_x ≠ μ_y.

55 Hotelling's T² test (figure: samples from Population A and Population B in the (X₁, X₂) plane)

56 Univariate test for X₁ (figure: Populations A and B in the (X₁, X₂) plane)

57 Univariate test for X₂ (figure: Populations A and B in the (X₁, X₂) plane)

58 Univariate test for a₁X₁ + a₂X₂ (figure: Populations A and B in the (X₁, X₂) plane)

59 Mahalanobis distance: a graphical explanation

60 Euclidean distance: d(x, y) = √[(x − y)′(x − y)]

61 Mahalanobis distance: d_Σ(x, y) = √[(x − y)′ Σ⁻¹ (x − y)], where Σ is a covariance matrix.

62 Hotelling's T² statistic for the two-sample problem can be written in terms of this distance:
T² = [nm / (n + m)] (x̄ − ȳ)′ S_pooled⁻¹ (x̄ − ȳ) = [nm / (n + m)] d²_S_pooled(x̄, ȳ).

63 Case I (figure: Populations A and B in the (X₁, X₂) plane)

64 Case II (figure: Populations A and B in the (X₁, X₂) plane)

65 In Case I the Mahalanobis distance between the mean vectors is larger than in Case II, even though the Euclidean distance is smaller. In Case I there is more separation between the two bivariate normal distributions.
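A small numerical illustration of this point (the mean vectors and covariance matrices below are made up to mimic Case I, where the mean difference lies along a low-variance direction, and Case II, where the covariance is spherical):

```python
import numpy as np

def euclidean(x, y):
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ d))

def mahalanobis(x, y, Sigma):
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ np.linalg.solve(Sigma, d)))

# Case I: mean difference lies along a low-variance direction of a correlated covariance
mean_A_I, mean_B_I = np.array([0.0, 0.0]), np.array([0.7, -0.7])
Sigma_I = np.array([[1.0, 0.9], [0.9, 1.0]])

# Case II: larger Euclidean separation, but a spherical covariance
mean_A_II, mean_B_II = np.array([0.0, 0.0]), np.array([1.5, 0.0])
Sigma_II = np.eye(2)

print(euclidean(mean_A_I, mean_B_I), mahalanobis(mean_A_I, mean_B_I, Sigma_I))      # ~0.99, ~3.13
print(euclidean(mean_A_II, mean_B_II), mahalanobis(mean_A_II, mean_B_II, Sigma_II)) # 1.5, 1.5
```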

