Exam 6: Linear Regression With Multiple Regressors


The intercept in the multiple regression model

(Multiple Choice)

Assume that you have collected cross-sectional data on average hourly earnings (ahe), the number of years of education (educ), and the gender of the individuals (coded as "1" if the individual is female and "0" if male; the resulting variable is named DFemme). Having faced recent tuition hikes at your university, you are interested in the return to education, that is, how much more you will earn for each additional year spent at your institution. To investigate this question, you run the following regression:
$\widehat{\text{ahe}} = -4.58 + 1.71 \times educ$, $N = 14{,}925$, $R^2 = 0.18$, $SER = 9.30$
a. Interpret the regression output.
b. Being female, you wonder how these results are affected if you add a binary variable (DFemme), which takes on the value "1" if the individual is female and "0" for males. The result is as follows:
$\widehat{\text{ahe}} = -3.44 - 4.09 \times DFemme + 1.76 \times educ$, $N = 14{,}925$, $R^2 = 0.22$, $SER = 9.08$
Does it make sense that the standard error of the regression decreased while the regression $R^2$ increased?
c. Do you think that the regression you estimated first suffered from omitted variable bias?

(Essay)
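The statistics reported in the question above (coefficients, $R^2$, SER) can be reproduced from such a data set with ordinary least squares. The following is a minimal numpy sketch, assuming the data have already been loaded into arrays named ahe, educ, and dfemme; those names, and the helper function itself, are illustrative assumptions rather than part of the exercise.

```python
import numpy as np

def ols_summary(y, X):
    """OLS of y on X (X should NOT yet contain a constant column).

    Returns the coefficient vector (intercept first), R^2, and the SER,
    the statistics reported in the question.
    """
    n = len(y)
    Z = np.column_stack([np.ones(n), X])          # add the intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)  # OLS coefficients
    resid = y - Z @ beta
    ssr = resid @ resid                           # sum of squared residuals
    tss = ((y - y.mean()) ** 2).sum()             # total sum of squares
    r2 = 1.0 - ssr / tss
    ser = np.sqrt(ssr / (n - Z.shape[1]))         # SER with n - k - 1 degrees of freedom
    return beta, r2, ser

# With arrays ahe, educ, dfemme loaded from the CPS-style data set:
#   ols_summary(ahe, educ)                            -> part (a) regression
#   ols_summary(ahe, np.column_stack([dfemme, educ])) -> part (b) regression
```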

Under imperfect multicollinearity

(Multiple Choice)

In the case of perfect multicollinearity, OLS is unable to calculate the coefficients for the explanatory variables, because it is impossible to change one variable while holding all other variables constant. To see why this is the case, consider the coefficient for the first explanatory variable in a multiple regression model with two explanatory variables:
$\hat{\beta}_1 = \dfrac{\sum_{i=1}^{n} y_i x_{1i} \sum_{i=1}^{n} x_{2i}^2 - \sum_{i=1}^{n} y_i x_{2i} \sum_{i=1}^{n} x_{1i} x_{2i}}{\sum_{i=1}^{n} x_{1i}^2 \sum_{i=1}^{n} x_{2i}^2 - \left( \sum_{i=1}^{n} x_{1i} x_{2i} \right)^2}$
(small letters refer to deviations from means, as in $z_i = Z_i - \bar{Z}$). Divide each of the four terms by $\sum_{i=1}^{n} x_{1i}^2 \sum_{i=1}^{n} x_{2i}^2$ to derive an expression in terms of regression coefficients from the simple (one explanatory variable) regression model. In the case of perfect multicollinearity, what would be the $R^2$ from the regression of $X_{1i}$ on $X_{2i}$? As a result, what would be the value of the denominator in the above expression for $\hat{\beta}_1$?

(Essay)
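As a numerical illustration of the denominator in the question above (not part of the original exercise): when $X_2$ is an exact linear function of $X_1$, the denominator of $\hat{\beta}_1$ collapses to zero, whereas under imperfect multicollinearity it stays strictly positive. The data in the sketch below are synthetic and purely illustrative.

```python
import numpy as np

def beta1_denominator(x1, x2):
    """Denominator of the OLS formula for beta_1_hat with two regressors,
    written in deviations-from-means form."""
    d1 = x1 - x1.mean()
    d2 = x2 - x2.mean()
    return (d1 @ d1) * (d2 @ d2) - (d1 @ d2) ** 2

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)

# Imperfect multicollinearity: x2 is correlated with x1 but not an exact
# linear function of it, so the denominator remains strictly positive.
x2_imperfect = 0.8 * x1 + rng.normal(scale=0.5, size=100)

# Perfect multicollinearity: x2 is an exact linear function of x1, the R^2
# from regressing X1 on X2 equals one, and the denominator collapses to zero
# (up to floating-point rounding), so beta_1_hat cannot be computed.
x2_perfect = 2.0 * x1 + 3.0

print(beta1_denominator(x1, x2_imperfect))   # positive
print(beta1_denominator(x1, x2_perfect))     # ~0
```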

In multiple regression, the $R^2$ increases whenever a regressor is

(Multiple Choice)
