
# Unweighted and Weighted Linear Regression


See the attached file.

a) Consider the following set of regression equations:

Y_1 = βx_1 + e_1
Y_2 = βx_2 + e_2
...
Y_n = βx_n + e_n

Suppose also that w1, w2, ..., wn are a set of positive weights (known constants). Consider the function

f(β) = ∑ w_i e_i² = ∑ w_i (Y_i − βx_i)²
Find the value of β that minimizes f(β). (This value of β is called the weighted least squares estimate of β.)

Parts b) and c) use the following random sample of n = 6 pairs of values of x and Y.

| x | Y  |
|---|----|
| 1 | 3  |
| 6 | 7  |
| 4 | 12 |
| 2 | 5  |
| 1 | 4  |
| 3 | 5  |

b) For the regression model in part a), find the (unweighted) least squares estimate of β using this data. Explain the meaning of this value.

c) Suppose that the weights described in part a) are 1,1,1,2,2,3, respectively. For the regression model in part a), find the weighted least squares estimate of β. Does it differ from your answer in b)? Can you explain why?


#### Solution Preview

a) We have

f(beta) = sum[w_i(Y_i - beta x_i)^2].

To find the value of beta which minimizes f(beta), we calculate f'(beta) and set it equal to zero. Thus we have

0 = f'(beta) = sum[-2w_i x_i(Y_i - beta x_i)]
= -2 [sum(w_i x_i Y_i) - beta sum(w_i x_i^2)],

whence

beta = sum(w_i x_i Y_i) / sum(w_i x_i^2).
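The closed-form estimate from setting f'(beta) = 0 can be checked numerically against the data in parts b) and c). Below is a short Python sketch; the helper name `wls_slope` is ours, not part of the original solution. Setting all weights to 1 recovers the ordinary (unweighted) least squares estimate of part b).

```python
def wls_slope(x, y, w=None):
    """Weighted least squares slope for the no-intercept model Y_i = beta*x_i + e_i,
    i.e. beta = sum(w_i * x_i * Y_i) / sum(w_i * x_i**2)."""
    if w is None:
        w = [1] * len(x)  # unit weights give the unweighted estimate
    num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    den = sum(wi * xi * xi for wi, xi in zip(w, x))
    return num / den

# Data from parts b) and c)
x = [1, 6, 4, 2, 1, 3]
Y = [3, 7, 12, 5, 4, 5]

beta_b = wls_slope(x, Y)                         # 122/67 ≈ 1.8209 (part b)
beta_c = wls_slope(x, Y, [1, 1, 1, 2, 2, 3])     # 166/90 ≈ 1.8444 (part c)
print(beta_b, beta_c)
```

The two estimates differ because the weights 2, 2, 3 pull the fit toward the last three data points, which is exactly the effect part c) asks about.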

#### Solution Summary

The solution computes the slope of the regression line (least squares fit) for a given set of data points. We then recompute the slope by using various weighting factors for each point.
