Linear Regression

Let

   {(Yi, Xi1, Xi2, ..., Xip); i = 1, ..., n}

represent a set of n measurements of p independent variables and one dependent variable. We define the general linear regression model in terms of the p independent variables X:

   Yi = β0 + β1Xi1 + β2Xi2 + ... + βpXip + εi

where

   β0, β1, ..., βp are parameters
   Xi1, Xi2, ..., Xip are the i-th measurements of the p independent variables
   εi are independent N(0, σ²) errors
   i = 1, ..., n

Linear Regression estimates β0, β1, ..., βp by least squares. When Intercept? is not checked, β0 is set to zero and the remaining parameters are estimated by least squares.
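The least squares fit described above can be sketched in a few lines of NumPy. This is an illustrative example, not the program's own implementation: the data are simulated, and the true coefficients (1.0, 2.0, -0.5) are arbitrary choices. The "Intercept? not checked" case corresponds to fitting on X without a column of ones.

```python
import numpy as np

# Illustrative sketch of least squares estimation (simulated data; the
# true coefficients below are arbitrary, not from the document).
rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.normal(size=(n, p))                       # n measurements of p variables
Y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

# With intercept: prepend a column of ones so that b0 is estimated too.
X1 = np.column_stack([np.ones(n), X])
b, *_ = np.linalg.lstsq(X1, Y, rcond=None)        # b = (b0, b1, ..., bp)

# Without intercept (β0 fixed at zero): fit on X alone.
b_no_int, *_ = np.linalg.lstsq(X, Y, rcond=None)  # b = (b1, ..., bp)
print(b)
```

With the small noise level used here, b should recover the simulated coefficients closely; dropping the intercept column forces the fitted plane through the origin.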

Let b0, b1, ..., bp be the least squares estimates of β0, β1, ..., βp. The fitted values are

   Ŷi = b0 + b1Xi1 + b2Xi2 + ... + bpXip, where i = 1, ..., n.

The residuals are

   ei = Yi - Ŷi, where i = 1, ..., n.