Exam 18: The Theory of Multiple Regression

The Gauss-Markov theorem for multiple regression proves that

(Multiple Choice)
4.8/5
(30)

The presence of correlated error terms creates problems for inference based on OLS. These can be overcome by

(Multiple Choice)
4.8/5
(38)

Write the following four restrictions in the form Rβ = r, where the hypotheses are to be tested simultaneously: β3 = 2β5, β1 + β2 = 1, β4 = 0, β2 = −β6. Can you write the restriction β2 = −β3/β1 in the same format? Why not?

(Essay)
4.9/5
(31)
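One way to stack these restrictions (a sketch, assuming the coefficient vector is β = (β1, β2, β3, β4, β5, β6)′ and taking the restrictions in the order given):

$$\underbrace{\begin{pmatrix} 0 & 0 & 1 & 0 & -2 & 0 \\ 1 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \end{pmatrix}}_{R} \begin{pmatrix} \beta_1 \\ \beta_2 \\ \beta_3 \\ \beta_4 \\ \beta_5 \\ \beta_6 \end{pmatrix} = \underbrace{\begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix}}_{r}$$

The remaining restriction, β2 = −β3/β1, is equivalent to β1β2 + β3 = 0, which is nonlinear in the coefficients, so it cannot be written as a row of Rβ = r.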

The GLS assumptions include all of the following, with the exception of

(Multiple Choice)
4.8/5
(33)

You have obtained data on test scores and student-teacher ratios in region A and region B of your state. Region B, on average, has lower student-teacher ratios than region A. You decide to run the following regression: Yi = β0 + β1X1i + β2X2i + β3X3i + ui, where X1 is the class size in region A, X2 is the difference in class size between regions A and B, and X3 is the class size in region B. Your regression package shows a message indicating that it cannot estimate the above equation. What is the problem here and how can it be fixed? Explain the problem in terms of the rank of the X matrix.

(Essay)
4.8/5
(39)
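A minimal numerical sketch of the issue (simulated data; all names and values are illustrative). Because X1i = X2i + X3i by construction, the columns of X are linearly dependent, so X has rank 3 rather than 4 and X′X cannot be inverted:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x3 = rng.normal(20.0, 2.0, n)   # class size in region B
x2 = rng.normal(3.0, 1.0, n)    # difference in class size, A minus B
x1 = x2 + x3                    # class size in region A: an exact linear combination

# Design matrix with a constant column: [1, X1, X2, X3]
X = np.column_stack([np.ones(n), x1, x2, x3])

print(np.linalg.matrix_rank(X))   # 3, not 4: X is rank deficient,
                                  # so X'X is singular and OLS is not defined
```

Since x1 − x2 − x3 = 0 exactly, dropping any one of the three class-size variables restores full column rank.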

The homoskedasticity-only F-statistic is

(Multiple Choice)
4.7/5
(32)

Consider the following population regression function: Y = Xβ + U, where
$$Y = \begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}, \quad X = \begin{pmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix}, \quad \beta = \begin{pmatrix} \beta_0 \\ \beta_1 \end{pmatrix}, \quad U = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}.$$
Given the following information on population growth rates (Y) and education (X) for 86 countries: $\sum_{i=1}^{n} Y_i = 1.594$, $\sum_{i=1}^{n} X_i = 449.6$, $\sum_{i=1}^{n} Y_i^2 = 0.03982$, $\sum_{i=1}^{n} X_i^2 = 3{,}022.76$, $\sum_{i=1}^{n} X_i Y_i = 6.4697$.
a) Find X′X, X′Y, (X′X)⁻¹, and finally (X′X)⁻¹X′Y.
b) Interpret the slope and, if necessary, the intercept.

(Essay)
4.8/5
(26)
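As a numerical check, the pieces of $(X'X)^{-1}X'Y$ can be assembled directly from the sums quoted in the question (a sketch; only NumPy is assumed):

```python
import numpy as np

# Sums given in the question (n = 86 countries)
n, sum_y, sum_x = 86, 1.594, 449.6
sum_x2, sum_xy = 3022.76, 6.4697

XtX = np.array([[n,     sum_x],
                [sum_x, sum_x2]])    # X'X for a regression with an intercept
XtY = np.array([sum_y, sum_xy])      # X'Y

beta_hat = np.linalg.solve(XtX, XtY) # (X'X)^{-1} X'Y
print(beta_hat)                      # approx. [0.0330, -0.0028]
```

This gives an intercept of roughly 0.033 and a slope of roughly −0.0028.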

Define the GLS estimator and discuss its properties when Ω is known. Why is this estimator sometimes called infeasible GLS? What happens when Ω is unknown? What would the Ω matrix look like for the case of independent sampling with heteroskedastic errors, where $\mathrm{var}(u_i \mid X_i) = c\,h(X_i) = \sigma^2 x_{1i}^2$? Since the inverse of the error variance-covariance matrix is needed to compute the GLS estimator, find Ω⁻¹. The textbook shows that the original model Y = Xβ + U will be transformed into $\widetilde{Y} = \widetilde{X}\beta + \widetilde{U}$, where $\widetilde{Y} = FY$, $\widetilde{X} = FX$, $\widetilde{U} = FU$, and $F'F = \Omega^{-1}$. Find F in the above case, and describe what effect the transformation has on the original data.

(Essay)
4.9/5
(38)
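For the heteroskedastic case in the question, independent sampling makes all off-diagonal entries of Ω zero, and one valid choice of F (a sketch, assuming each x1i > 0) is:

$$\Omega = \sigma^2 \operatorname{diag}\left(x_{11}^2, \dots, x_{1n}^2\right), \qquad \Omega^{-1} = \frac{1}{\sigma^2} \operatorname{diag}\left(x_{11}^{-2}, \dots, x_{1n}^{-2}\right), \qquad F = \frac{1}{\sigma} \operatorname{diag}\left(x_{11}^{-1}, \dots, x_{1n}^{-1}\right),$$

so that F′F = Ω⁻¹ and premultiplying by F divides each observation on Y and X by x1i, which is the weighted least squares transformation.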

Let $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$. Then $M_X M_X$ =

(Multiple Choice)
4.8/5
(31)
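A quick numerical illustration of the idempotence the question points to (simulated X; a check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 20, 3
X = rng.normal(size=(n, k))

P = X @ np.linalg.inv(X.T @ X) @ X.T   # P_X = X (X'X)^{-1} X'
M = np.eye(n) - P                      # M_X = I_n - P_X

print(np.allclose(M @ M, M))           # True: M_X M_X = M_X (idempotent)
```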

The GLS estimator is defined as

(Multiple Choice)
4.9/5
(32)
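For reference, the standard textbook definition the question refers to, with Ω the error variance-covariance matrix:

$$\hat{\beta}^{\,GLS} = \left(X'\Omega^{-1}X\right)^{-1} X'\Omega^{-1}Y.$$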