Least Squares Regression Line

Least Squares Regression:

In general, when we use ŷi = b0 + b1xi to predict the actual response yi, we make a prediction error of size:

ei = yi – ŷi
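For instance, here is a minimal Python sketch of how a single prediction error is computed; the intercept b0, slope b1, and the data point (x_i, y_i) are made-up values used only for illustration:

    # Hypothetical fitted line: y-hat = b0 + b1 * x  (values assumed for illustration)
    b0, b1 = 2.0, 0.5          # assumed intercept and slope
    x_i, y_i = 10.0, 8.0       # one observed data point (made up)

    y_hat_i = b0 + b1 * x_i    # predicted response: 2.0 + 0.5 * 10 = 7.0
    e_i = y_i - y_hat_i        # prediction error: 8.0 - 7.0 = 1.0
    print(e_i)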

A line that fits the data 'best' will be one for which the prediction errors, one for each observed data point, are as small as possible in some overall sense. One way to achieve this goal is to invoke the least squares criterion, which says to minimize the sum of the squared prediction errors. That is:

i. The equation of the best fitting line is ŷi = b0 + b1xi.

ii. We just need to find the values b0 and b1 that make the sum of the squared prediction errors as small as it can be.

iii. That is, we need to find the values b0 and b1 that minimize

Q = Σ (yi – ŷi)² = Σ (yi – (b0 + b1xi))²

where the sum runs over all n observed data points. The values of b0 and b1 that minimize Q are called the least squares estimates, and the resulting line is the least squares regression line.
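As a concrete illustration, the following is a minimal Python sketch (with made-up data) that computes b0 and b1 from the standard closed-form least squares solution and then evaluates the minimized sum of squared errors Q:

    import numpy as np

    # Made-up sample data, for illustration only
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 2.9, 3.6, 4.4, 5.2])

    x_bar, y_bar = x.mean(), y.mean()

    # Closed-form least squares estimates:
    #   b1 = sum((x - x_bar) * (y - y_bar)) / sum((x - x_bar)^2)
    #   b0 = y_bar - b1 * x_bar
    b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    b0 = y_bar - b1 * x_bar

    # Sum of squared prediction errors Q at the minimizing values
    y_hat = b0 + b1 * x
    Q = np.sum((y - y_hat) ** 2)

    print(b0, b1, Q)

Any other choice of b0 and b1 would give a larger value of Q than the pair computed above, which is exactly what the least squares criterion requires.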