Essay
(Requires Appendix material and Calculus) Equation (5.36) in your textbook derives the conditional variance for any conditionally unbiased linear estimator $\tilde{\beta}_1 = \sum_{i=1}^{n} a_i Y_i$ to be

$\mathrm{var}(\tilde{\beta}_1 \mid X_1, \dots, X_n) = \sigma_u^2 \sum_{i=1}^{n} a_i^2,$

where the conditions for conditional unbiasedness are $\sum_{i=1}^{n} a_i = 0$ and $\sum_{i=1}^{n} a_i X_i = 1$. As an alternative to the BLUE proof presented in your textbook, you recall from one of your calculus courses that you can minimize the variance subject to the two constraints, thereby making the variance as small as possible while the constraints hold. Show that in doing so you obtain the OLS weights

$\hat{a}_i = \dfrac{X_i - \bar{X}}{\sum_{j=1}^{n} (X_j - \bar{X})^2}.$

(You may assume that $X_1, \dots, X_n$ are nonrandom, i.e., fixed over repeated samples.)
Correct Answer:

Verified
The Lagrangian is

$L(a_1, \dots, a_n; \lambda_1, \lambda_2) = \sigma_u^2 \sum_{i=1}^{n} a_i^2 + \lambda_1 \sum_{i=1}^{n} a_i + \lambda_2 \Big(1 - \sum_{i=1}^{n} a_i X_i\Big),$

where the $\lambda_i$ are the two Lagrange multipliers. The first-order conditions with respect to each $a_i$ are

$\frac{\partial L}{\partial a_i} = 2\sigma_u^2 a_i + \lambda_1 - \lambda_2 X_i = 0 \quad\Longrightarrow\quad a_i = \frac{\lambda_2 X_i - \lambda_1}{2\sigma_u^2}.$

Substituting into the first constraint, $\sum_{i=1}^{n} a_i = 0$, gives $\lambda_2 n \bar{X} = n \lambda_1$, so $\lambda_1 = \lambda_2 \bar{X}$ and hence $a_i = \lambda_2 (X_i - \bar{X}) / (2\sigma_u^2)$. Substituting this into the second constraint, $\sum_{i=1}^{n} a_i X_i = 1$, and using $\sum_{i=1}^{n} (X_i - \bar{X}) X_i = \sum_{i=1}^{n} (X_i - \bar{X})^2$, yields $\lambda_2 = 2\sigma_u^2 / \sum_{j=1}^{n} (X_j - \bar{X})^2$. Therefore

$a_i = \frac{X_i - \bar{X}}{\sum_{j=1}^{n} (X_j - \bar{X})^2} = \hat{a}_i,$

which are the OLS weights. Since the objective $\sigma_u^2 \sum a_i^2$ is convex and the constraints are linear, this stationary point is the constrained minimum.
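As a numerical sanity check (a sketch, not part of the textbook answer; the sample values below are made up for illustration), the snippet verifies that the OLS weights $\hat{a}_i = (X_i - \bar{X})/\sum_j (X_j - \bar{X})^2$ satisfy both unbiasedness constraints and that any other constraint-satisfying weight vector has a larger sum of squared weights, hence a larger conditional variance $\sigma_u^2 \sum_i a_i^2$:

```python
import numpy as np

# Illustrative fixed regressors (nonrandom over repeated samples).
rng = np.random.default_rng(0)
X = rng.normal(10.0, 2.0, size=50)

# OLS weights: a_i = (X_i - Xbar) / sum_j (X_j - Xbar)^2.
Xc = X - X.mean()
a_ols = Xc / (Xc @ Xc)

# Unbiasedness constraints: sum a_i = 0 and sum a_i X_i = 1.
assert abs(a_ols.sum()) < 1e-12
assert abs(a_ols @ X - 1.0) < 1e-12

# Build alternative weights that still satisfy both constraints by
# adding a perturbation d with sum d_i = 0 and sum d_i X_i = 0.
d = rng.normal(size=50)
d -= d.mean()                    # enforce sum d_i = 0
d -= (d @ Xc) / (Xc @ Xc) * Xc   # enforce sum d_i X_i = 0 as well
a_alt = a_ols + d

assert abs(a_alt.sum()) < 1e-10
assert abs(a_alt @ X - 1.0) < 1e-10

# The OLS weights minimize sum a_i^2 among constraint-satisfying weights.
assert np.sum(a_alt ** 2) >= np.sum(a_ols ** 2)
print("sum a_ols^2 =", np.sum(a_ols ** 2), "<= sum a_alt^2 =", np.sum(a_alt ** 2))
```

Because the perturbation `d` is orthogonal to the centered regressors (and `a_ols` is proportional to them), $\sum a_{\text{alt},i}^2 = \sum \hat{a}_i^2 + \sum d_i^2 \ge \sum \hat{a}_i^2$, mirroring the Lagrangian result above.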