Exam 18: The Theory of Multiple Regression


The extended least squares assumptions in the multiple regression model include the four assumptions from Chapter 6 ($u_i$ has conditional mean zero; $(\boldsymbol{X}_i, Y_i), i = 1, \ldots, n$ are i.i.d. draws from their joint distribution; $\boldsymbol{X}_i$ and $u_i$ have nonzero finite fourth moments; there is no perfect multicollinearity). In addition, there are two further assumptions, one of which is

(Multiple Choice)
4.8/5
(47)

The assumption that $\boldsymbol{X}$ has full column rank implies that

(Multiple Choice)
4.8/5
(29)
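
As an illustrative sketch of this rank condition (hypothetical data, not part of the question): a design matrix with a constant and two independent regressors has full column rank, while appending a perfectly collinear column leaves the rank unchanged and makes $\boldsymbol{X}'\boldsymbol{X}$ singular.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, rng.normal(size=n)])  # constant + 2 regressors
print(np.linalg.matrix_rank(X))       # 3: full column rank, X'X is invertible

X_bad = np.column_stack([X, 2 * x1])  # append a perfectly collinear column
print(np.linalg.matrix_rank(X_bad))   # still 3 < 4: X'X would be singular
```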

Let there be $q$ joint hypotheses to be tested. Then the dimension of $\boldsymbol{r}$ in the expression $\boldsymbol{R}\boldsymbol{\beta} = \boldsymbol{r}$ is

(Multiple Choice)
4.9/5
(31)
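
A minimal sketch of what $\boldsymbol{R}$ and $\boldsymbol{r}$ look like for a concrete joint null (the model and restrictions below are hypothetical): with $k = 3$ regressors plus a constant and $q = 2$ restrictions, $\boldsymbol{R}$ is $q \times (k+1)$ and $\boldsymbol{r}$ is $q \times 1$.

```python
import numpy as np

# Hypothetical model Y_i = b0 + b1*X1i + b2*X2i + b3*X3i + u_i, so beta is 4 x 1.
# Joint null with q = 2 restrictions: b1 = 0 and b2 = b3.
R = np.array([[0., 1., 0.,  0.],   # row 1 encodes b1 = 0
              [0., 0., 1., -1.]])  # row 2 encodes b2 - b3 = 0
r = np.array([0., 0.])             # r stacks the q restriction values

print(R.shape, r.shape)            # (2, 4) and (2,): r has dimension q
```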

Minimization of $\sum_{i=1}^{n} \left( Y_i - b_0 - b_1 X_{1i} - \cdots - b_k X_{ki} \right)^2$ results in

(Multiple Choice)
4.9/5
(41)
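
The minimizers of this sum of squared residuals solve the normal equations $\boldsymbol{X}'(\boldsymbol{Y} - \boldsymbol{X}\boldsymbol{b}) = \boldsymbol{0}$. A small numerical check on simulated (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
Y = X @ beta_true + rng.normal(size=n)

# The first-order conditions of the minimization are the normal equations
# X'(Y - X b) = 0, whose solution is the OLS estimator b = (X'X)^(-1) X'Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)
print(np.allclose(beta_hat, np.linalg.lstsq(X, Y, rcond=None)[0]))  # True
```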

Give several economic examples of how to test various joint linear hypotheses using matrix notation. Include specifications of $\boldsymbol{R}\boldsymbol{\beta} = \boldsymbol{r}$ where you test for (i) all coefficients other than the constant being zero, (ii) a subset of coefficients being zero, and (iii) equality of coefficients. Discuss the possible distributions involved in finding critical values for your hypotheses.

(Essay)
4.9/5
(32)
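
For instance, a hedged sketch of the homoskedasticity-only $F$-test of $\boldsymbol{R}\boldsymbol{\beta} = \boldsymbol{r}$ on simulated data (all names and numbers below are hypothetical); the critical value comes from the $F_{q,\,n-k-1}$ distribution, or asymptotically from the $\chi^2_q/q$ distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
Y = X @ np.array([1.0, 0.0, 0.5, 0.5]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ beta_hat
s2 = resid @ resid / (n - X.shape[1])   # homoskedasticity-only error variance
V = s2 * np.linalg.inv(X.T @ X)         # estimated var-cov matrix of beta_hat

# H0: b1 = 0 and b2 = b3  (q = 2 restrictions)
R = np.array([[0., 1., 0., 0.], [0., 0., 1., -1.]])
r = np.zeros(2)
q = R.shape[0]
diff = R @ beta_hat - r
F = diff @ np.linalg.solve(R @ V @ R.T, diff) / q
print(F, stats.f.ppf(0.95, q, n - X.shape[1]))  # reject H0 if F > critical value
```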

For the OLS estimator $\hat{\boldsymbol{\beta}} = \left( \boldsymbol{X}'\boldsymbol{X} \right)^{-1} \boldsymbol{X}'\boldsymbol{Y}$ to exist, $\boldsymbol{X}'\boldsymbol{X}$ must be invertible. This is the case when $\boldsymbol{X}$ has full rank. What is the rank of a matrix? What is the rank of the product of two matrices? Is it possible that $\boldsymbol{X}$ could have rank $n$? What would be the rank of $\boldsymbol{X}'\boldsymbol{X}$ in the case $n < (k+1)$? Explain intuitively why the OLS estimator does not exist in that situation.

(Essay)
4.9/5
(41)
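
A quick numerical illustration of the $n < k+1$ case (hypothetical data): since $\operatorname{rank}(\boldsymbol{X}'\boldsymbol{X}) = \operatorname{rank}(\boldsymbol{X}) \le n$, the $(k+1) \times (k+1)$ matrix $\boldsymbol{X}'\boldsymbol{X}$ cannot be inverted.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 3, 4                        # n = 3 observations but k + 1 = 5 coefficients
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])

XtX = X.T @ X                      # (k+1) x (k+1) = 5 x 5
# rank(X'X) = rank(X) <= min(n, k+1) = n = 3 < 5, so X'X is singular
print(np.linalg.matrix_rank(XtX))  # 3
print(np.linalg.cond(XtX))         # astronomically large: not invertible
```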

The heteroskedasticity-robust estimator of $\Sigma_{\sqrt{n}(\hat{\boldsymbol{\beta}} - \boldsymbol{\beta})}$, the covariance matrix of $\sqrt{n}(\hat{\boldsymbol{\beta}} - \boldsymbol{\beta})$, is obtained

(Multiple Choice)
4.8/5
(27)
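
One common way to form such an estimator is the sandwich ("HC0") formula $\hat{\Sigma} = \hat{Q}_{XX}^{-1} \big( n^{-1} \sum_i \boldsymbol{X}_i \boldsymbol{X}_i' \hat{u}_i^2 \big) \hat{Q}_{XX}^{-1}$ with $\hat{Q}_{XX} = \boldsymbol{X}'\boldsymbol{X}/n$. A minimal sketch on simulated heteroskedastic data (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = rng.normal(size=n) * np.abs(X[:, 1])   # heteroskedastic errors
Y = X @ np.array([1.0, 2.0]) + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
uhat = Y - X @ beta_hat

# Sandwich estimator of Sigma_{sqrt(n)(beta_hat - beta)} (HC0 form):
Qxx_inv = np.linalg.inv(X.T @ X / n)
meat = (X * uhat[:, None] ** 2).T @ X / n  # (1/n) * sum of X_i X_i' uhat_i^2
Sigma_hat = Qxx_inv @ meat @ Qxx_inv
print(np.sqrt(np.diag(Sigma_hat / n)))     # robust standard errors of beta_hat
```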

The multiple regression model in matrix form $\boldsymbol{Y} = \boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{U}$ can also be written as

(Multiple Choice)
4.8/5
(38)
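
The stacked matrix form is just the $n$ equations $Y_i = \boldsymbol{X}_i'\boldsymbol{\beta} + u_i$, $i = 1, \ldots, n$, written at once. A tiny check on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 4, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 0.5, -0.3])
U = rng.normal(size=n)

Y = X @ beta + U                                         # stacked matrix form
Y_rows = np.array([X[i] @ beta + U[i] for i in range(n)])  # Y_i = X_i' beta + u_i
print(np.allclose(Y, Y_rows))                            # True: the forms coincide
```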

The GLS estimator

(Multiple Choice)
4.8/5
(28)

The leading example of sampling schemes in econometrics that do not result in independent observations is

(Multiple Choice)
4.8/5
(37)

The Gauss-Markov theorem for multiple regression proves that
  • a. $\boldsymbol{M}_{\boldsymbol{X}}$ is an idempotent matrix.
  • b. the OLS estimator is BLUE.
  • c. the OLS residuals and predicted values are orthogonal.
  • d. the variance-covariance matrix of the OLS estimator is $\sigma_u^2 (\boldsymbol{X}'\boldsymbol{X})^{-1}$.

(Short Answer)
4.8/5
(46)
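
A numerical sanity check of two of these statements on simulated (hypothetical) data: the annihilator matrix $\boldsymbol{M}_{\boldsymbol{X}} = \boldsymbol{I}_n - \boldsymbol{X}(\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'$ is idempotent, and the residuals $\boldsymbol{M}_{\boldsymbol{X}}\boldsymbol{Y}$ are orthogonal to the fitted values.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(size=n)

P = X @ np.linalg.inv(X.T @ X) @ X.T    # projection ("hat") matrix
M = np.eye(n) - P                       # annihilator M_X

print(np.allclose(M @ M, M))            # True: M_X is idempotent
resid, fitted = M @ Y, P @ Y
print(np.isclose(resid @ fitted, 0.0))  # True: residuals orthogonal to fitted values
```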

Define the GLS estimator and discuss its properties when $\boldsymbol{\Omega}$ is known. Why is this estimator sometimes called infeasible GLS? What happens when $\boldsymbol{\Omega}$ is unknown? What would the $\boldsymbol{\Omega}$ matrix look like for the case of independent sampling with heteroskedastic errors, where $\operatorname{var}(u_i \mid X_i) = c\,h(X_i) = \sigma^2 X_{1i}^2$? Since the inverse of the error variance-covariance matrix is needed to compute the GLS estimator, find $\boldsymbol{\Omega}^{-1}$. The textbook shows that the original model $\boldsymbol{Y} = \boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{U}$ will be transformed into $\tilde{\boldsymbol{Y}} = \tilde{\boldsymbol{X}}\boldsymbol{\beta} + \tilde{\boldsymbol{U}}$, where $\tilde{\boldsymbol{Y}} = \boldsymbol{F}\boldsymbol{Y}$, $\tilde{\boldsymbol{X}} = \boldsymbol{F}\boldsymbol{X}$, $\tilde{\boldsymbol{U}} = \boldsymbol{F}\boldsymbol{U}$, and $\boldsymbol{F}'\boldsymbol{F} = \boldsymbol{\Omega}^{-1}$. Find $\boldsymbol{F}$ in the above case, and describe what effect the transformation has on the original data.

(Essay)
4.9/5
(35)
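
A minimal sketch of this particular GLS case on simulated data (the variance scale $\sigma$ and the positive support of the regressor are assumptions made for the illustration): with $\boldsymbol{\Omega} = \sigma^2 \operatorname{diag}(X_{11}^2, \ldots, X_{1n}^2)$, the matrix $\boldsymbol{F} = \sigma^{-1} \operatorname{diag}(1/X_{11}, \ldots, 1/X_{1n})$ satisfies $\boldsymbol{F}'\boldsymbol{F} = \boldsymbol{\Omega}^{-1}$, and OLS on the transformed data reproduces the GLS estimator.

```python
import numpy as np

rng = np.random.default_rng(7)
n, sigma = 400, 1.5
X1 = rng.uniform(1.0, 3.0, size=n)             # kept positive so 1/X1 is defined
X = np.column_stack([np.ones(n), X1])
u = rng.normal(size=n) * sigma * X1            # var(u_i | X_i) = sigma^2 * X1_i^2
Y = X @ np.array([1.0, 2.0]) + u

Omega_inv = np.diag(1.0 / (sigma**2 * X1**2))  # inverse error var-cov matrix
F = np.diag(1.0 / (sigma * X1))                # transformation with F'F = Omega^{-1}
print(np.allclose(F.T @ F, Omega_inv))         # True

# GLS = OLS on the transformed ("weighted") data Y~ = F Y, X~ = F X:
Yt, Xt = F @ Y, F @ X
beta_gls = np.linalg.solve(Xt.T @ Xt, Xt.T @ Yt)
beta_direct = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ Y)
print(np.allclose(beta_gls, beta_direct))      # True: the same estimator
```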

$\hat{\boldsymbol{\beta}} - \boldsymbol{\beta}$

(Multiple Choice)
4.9/5
(45)

The linear multiple regression model can be represented in matrix notation as $\boldsymbol{Y} = \boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{U}$, where $\boldsymbol{X}$ is of order $n \times (k+1)$. Here $k$ represents the number of

(Multiple Choice)
4.9/5
(38)

Prove that under the extended least squares assumptions the OLS estimator $\hat{\boldsymbol{\beta}}$ is unbiased and that its variance-covariance matrix is $\sigma_u^2 \left( \boldsymbol{X}'\boldsymbol{X} \right)^{-1}$.

(Essay)
4.9/5
(42)
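
For reference, a sketch of the standard argument (conditioning on $\boldsymbol{X}$ and using $E(\boldsymbol{U} \mid \boldsymbol{X}) = \boldsymbol{0}_n$ and homoskedastic, serially uncorrelated errors $E(\boldsymbol{U}\boldsymbol{U}' \mid \boldsymbol{X}) = \sigma_u^2 \boldsymbol{I}_n$):

```latex
% Unbiasedness: substitute Y = X beta + U and take conditional expectations.
\hat{\boldsymbol{\beta}}
  = (\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'(\boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{U})
  = \boldsymbol{\beta} + (\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'\boldsymbol{U}
\quad\Rightarrow\quad
E(\hat{\boldsymbol{\beta}} \mid \boldsymbol{X})
  = \boldsymbol{\beta} + (\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'\,
    E(\boldsymbol{U} \mid \boldsymbol{X})
  = \boldsymbol{\beta}.

% Variance: same substitution, then apply E(UU' | X) = sigma_u^2 I_n.
\operatorname{var}(\hat{\boldsymbol{\beta}} \mid \boldsymbol{X})
  = (\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'\,
    E(\boldsymbol{U}\boldsymbol{U}' \mid \boldsymbol{X})\,
    \boldsymbol{X}(\boldsymbol{X}'\boldsymbol{X})^{-1}
  = \sigma_u^2 (\boldsymbol{X}'\boldsymbol{X})^{-1}.
```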

The GLS assumptions include all of the following, with the exception of
  • a. the $\boldsymbol{X}_i$ are fixed in repeated samples.
  • b. $\boldsymbol{X}_i$ and $u_i$ have nonzero finite fourth moments.
  • c. $E(\boldsymbol{U}\boldsymbol{U}' \mid \boldsymbol{X}) = \boldsymbol{\Omega}(\boldsymbol{X})$, where $\boldsymbol{\Omega}(\boldsymbol{X})$ is an $n \times n$ matrix-valued function that can depend on $\boldsymbol{X}$.
  • d. $E(\boldsymbol{U} \mid \boldsymbol{X}) = \boldsymbol{0}_n$.

(Short Answer)
4.8/5
(42)

The formulation $\boldsymbol{R}\boldsymbol{\beta} = \boldsymbol{r}$ to test a hypothesis

(Multiple Choice)
4.9/5
(43)

Your textbook derives the OLS estimator as $\hat{\boldsymbol{\beta}} = \left( \boldsymbol{X}'\boldsymbol{X} \right)^{-1} \boldsymbol{X}'\boldsymbol{Y}$. Show that the estimator does not exist if there are fewer observations than the number of explanatory variables, including the constant. What is the rank of $\boldsymbol{X}'\boldsymbol{X}$ in this case?

(Essay)
4.8/5
(36)
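
A sketch of the rank argument this question is after (standard linear algebra facts):

```latex
\operatorname{rank}(\boldsymbol{X}'\boldsymbol{X})
  = \operatorname{rank}(\boldsymbol{X})
  \le \min(n,\, k+1)
  = n
  < k+1,
```

so the $(k+1) \times (k+1)$ matrix $\boldsymbol{X}'\boldsymbol{X}$ is rank deficient, $(\boldsymbol{X}'\boldsymbol{X})^{-1}$ does not exist, and the OLS formula cannot be evaluated.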