Friday 27 April 2012

Line of best fit - least squares regression

In this post we discuss the line of best fit, also known as the least squares regression line: the line that produces the smallest possible sum of the squares of the residuals. A residual is the vertical distance from a point on a scatter diagram to the line of best fit. Because the least squares regression line minimises these squared residuals, it is regarded as the best line of best fit.
Defining regression: regression produces the least squares regression line, which can be expressed as y' = a + bx, where 'a' and 'b' are constants and y' is the predicted value of the dependent variable for a given value of the independent variable 'x'.
For example, if the constants are a = 10 and b = 7 and the value of x is 5, then the formula gives the predicted value y' = 10 + 7 × 5 = 45. For any two variables 'x' and 'y' there is exactly one equation of this form that produces the "best fit" linking 'x' to 'y', that is, the most accurate prediction of 'y' for a given 'x'. This equation is called the least squares regression equation, and the line it describes is the line of best fit.
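The prediction step above can be sketched in a few lines of Python. The function name `predict` and the particular constants are just for illustration; they match the worked example with a = 10 and b = 7.

```python
# Illustrative constants from the example above: a = 10, b = 7.
a, b = 10, 7

def predict(x):
    """Predicted value y' = a + b*x from the regression line."""
    return a + b * x

print(predict(5))  # 10 + 7 * 5 = 45
```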
The equation for the least squares regression line of 'y' on 'x' is:

y − ȳ = b(x − x̄)

where x̄ and ȳ are the means of 'x' and 'y'. Here the slope 'b' is defined as:

b = Sxy / Sxx = ∑(x − x̄)(y − ȳ) / ∑(x − x̄)² = (∑xy / n − x̄ȳ) / (∑x² / n − x̄²)
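The slope formula can be checked numerically. This is a minimal sketch in plain Python using small made-up data (the values are illustrative only); it computes 'b' both from the deviation form and from the computational form, which should agree, and then recovers the intercept using the fact that the line passes through (x̄, ȳ).

```python
# Illustrative data points (made up for this example).
xs = [1, 2, 3, 4, 5]
ys = [2, 5, 7, 8, 12]
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Deviation form: b = sum((x - x̄)(y - ȳ)) / sum((x - x̄)²)
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
sxx = sum((x - x_bar) ** 2 for x in xs)
b = sxy / sxx

# Computational form: b = (∑xy/n − x̄ȳ) / (∑x²/n − x̄²)
b_alt = (sum(x * y for x, y in zip(xs, ys)) / n - x_bar * y_bar) / (
    sum(x * x for x in xs) / n - x_bar ** 2
)

# Intercept: the least squares line always passes through (x̄, ȳ).
a = y_bar - b * x_bar

print(b, b_alt, a)  # the two slope formulas give the same value
```

For this data both formulas give b = 2.3, confirming that the deviation form and the computational form are algebraically equivalent.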
That covers the basics of least squares regression.
In upcoming posts we will discuss standard distributions and the fundamental theorem of algebra.
