
# Simple Linear Regression Model and Matrix Notation



See the attached file.

Consider the simple linear regression model Yi = B0 + B1Xi + ei (i = 1, 2, ..., n). The model may be written in matrix notation as Y = XB + e.

a) Explain the terms Y, X, B and e.
b) State the second-order distributional assumptions in matrix notation, and then state the normal-theory assumptions in matrix notation.
c) Write the elements of X'X and X'Y.
d) Show that the error sum of squares may be written as (Y-XB)'(Y-XB).
e) The least squares estimate B(hat) of B minimizes (Y-XB)'(Y-XB). State the normal equations (least squares equations) in matrix form.
f) Assuming that X (and hence X'X) is of full rank, obtain B(hat) using (e).
g) Show that B(hat) is an unbiased estimator of B.
h) Derive Cov(B(hat)), the variance-covariance matrix of B(hat).
i) What is the Maximum Likelihood Estimate of B when normality assumptions are made?
j) Show that the vector of fitted values Y(hat) may be written as Y(hat) = HY, where H = X(X'X)^(-1)X'.
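As a numerical sanity check on parts (e), (f), and (j), here is a short Python sketch using NumPy. The data values are made up purely for illustration; the sketch solves the normal equations (X'X)B = X'Y for B(hat), forms the hat matrix H = X(X'X)^(-1)X', and confirms that HY reproduces the fitted values XB(hat).

```python
import numpy as np

# Made-up example data for a simple linear regression (n = 5).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix X: a column of ones (for the intercept B0) and the predictor.
X = np.column_stack([np.ones_like(x), x])

# Normal equations: (X'X) B = X'Y, so B_hat = (X'X)^(-1) X'Y.
XtX = X.T @ X
XtY = X.T @ y
B_hat = np.linalg.solve(XtX, XtY)

# Hat matrix H = X (X'X)^(-1) X'; fitted values Y_hat = H Y.
H = X @ np.linalg.inv(XtX) @ X.T
y_hat = H @ y

# Fitted values from H Y agree with X B_hat, as part (j) claims.
assert np.allclose(y_hat, X @ B_hat)

# H is symmetric and idempotent (HH = H), standard properties of the hat matrix.
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)

print(B_hat)  # [intercept, slope]
```

Using `np.linalg.solve` on the normal equations (rather than explicitly inverting X'X) mirrors part (f): it requires X'X to be of full rank, and is the numerically preferred way to compute B(hat).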