
Introductory Econometrics 4th Edition by Jeffrey Wooldridge

Edition 4, ISBN: 978-0324660609
Exercise 20
Consider the simple regression model $y = \beta_0 + \beta_1 x + u$ under the first four Gauss-Markov assumptions. For some function $g(x)$, for example $g(x) = x^2$ or $g(x) = \log(1 + x^2)$, define $z_i = g(x_i)$. Define a slope estimator as

$$\tilde\beta_1 = \frac{\sum_{i=1}^n (z_i - \bar z)\,y_i}{\sum_{i=1}^n (z_i - \bar z)\,x_i}.$$

(i) Show that $\tilde\beta_1$ is linear and unbiased. Remember, because $E(u|x) = 0$, you can treat both $x_i$ and $z_i$ as nonrandom in your derivation.

(ii) Add the homoskedasticity assumption, MLR.5. Show that

$$\mathrm{Var}(\tilde\beta_1) = \sigma^2 \left[\sum_{i=1}^n (z_i - \bar z)^2\right] \Big/ \left[\sum_{i=1}^n (z_i - \bar z)\,x_i\right]^2.$$

(iii) Show directly that, under the Gauss-Markov assumptions, $\mathrm{Var}(\hat\beta_1) \le \mathrm{Var}(\tilde\beta_1)$, where $\hat\beta_1$ is the OLS estimator. [Hint: The Cauchy-Schwartz inequality in Appendix B implies that

$$\left[n^{-1}\sum_{i=1}^n (z_i - \bar z)(x_i - \bar x)\right]^2 \le \left[n^{-1}\sum_{i=1}^n (z_i - \bar z)^2\right]\left[n^{-1}\sum_{i=1}^n (x_i - \bar x)^2\right];$$

notice that we can drop $\bar x$ from the sample covariance.]
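Before working through the algebra, the three claims can be checked numerically. The sketch below is not from the textbook; the choice of $g$, the parameter values, and the distribution of $x$ are illustrative assumptions. It simulates the estimator $\tilde\beta_1$ with $x_i$ held fixed across replications, which mirrors treating $x_i$ and $z_i$ as nonrandom:

```python
import numpy as np

# Illustrative setup (not from the textbook): fixed regressors, homoskedastic errors.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 1.0, 2.0, 1.0
n, reps = 200, 5000

x = rng.uniform(1.0, 3.0, size=n)      # x_i held fixed across replications
g = lambda v: np.log(1.0 + v**2)       # one g(x) suggested in the problem
z = g(x)
w = z - z.mean()                       # (z_i - zbar); note sum(w) = 0
xd = x - x.mean()

def tilde_beta1(y):
    """beta~1 = sum (z_i - zbar) y_i / sum (z_i - zbar) x_i."""
    return (w @ y) / (w @ x)

def hat_beta1(y):
    """OLS slope from the simple regression of y on x."""
    return (xd @ y) / (xd @ xd)

tilde = np.empty(reps)
hat = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)  # E(u|x) = 0, Var(u|x) = sigma^2
    y = beta0 + beta1 * x + u
    tilde[r] = tilde_beta1(y)
    hat[r] = hat_beta1(y)

# (i) Unbiasedness: both Monte Carlo means should be close to beta1 = 2.
print(tilde.mean(), hat.mean())

# (ii)/(iii) Exact variances given fixed x, and the Gauss-Markov ordering:
var_tilde = sigma**2 * (w @ w) / (w @ x)**2
var_hat = sigma**2 / (xd @ xd)
print(var_hat, var_tilde)              # var_hat <= var_tilde, by Cauchy-Schwartz
```

With $g(x) = \log(1 + x^2)$ on this range, $z$ is almost linear in $x$, so the two variances come out close; a $g$ less correlated with $x$ widens the gap, consistent with part (iii).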
Explanation

(i)
For simplicity, define [the defining expression is blurred in the source]; this is no…
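The rest of the site's explanation is blurred. A sketch of the standard argument, reconstructed from the problem statement (not the site's own wording):

```latex
% (i) Linearity and unbiasedness. Substituting y_i = beta_0 + beta_1 x_i + u_i
% and using \sum_i (z_i - \bar z) = 0:
\tilde\beta_1
  = \frac{\sum_i (z_i-\bar z)\,y_i}{\sum_i (z_i-\bar z)\,x_i}
  = \beta_1 + \frac{\sum_i (z_i-\bar z)\,u_i}{\sum_i (z_i-\bar z)\,x_i},
% so \tilde\beta_1 is a linear function of the y_i; treating x_i and z_i as
% nonrandom and using E(u_i) = 0 gives E(\tilde\beta_1) = \beta_1.

% (ii) Under MLR.5, Var(u_i) = \sigma^2 and the u_i are uncorrelated, so
\mathrm{Var}(\tilde\beta_1)
  = \frac{\sigma^2 \sum_i (z_i-\bar z)^2}{\left[\sum_i (z_i-\bar z)\,x_i\right]^2}.

% (iii) Since \sum_i (z_i-\bar z)\,x_i = \sum_i (z_i-\bar z)(x_i-\bar x),
% the Cauchy-Schwartz inequality
\left[\sum_i (z_i-\bar z)(x_i-\bar x)\right]^2
  \le \sum_i (z_i-\bar z)^2 \sum_i (x_i-\bar x)^2
% gives
\mathrm{Var}(\tilde\beta_1)
  \ge \frac{\sigma^2}{\sum_i (x_i-\bar x)^2}
  = \mathrm{Var}(\hat\beta_1).
```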
