Simple Linear Regression

Submitted By sefa

Chapter 11 – Simple linear regression

Types of Regression Models (Sec. 11-1)

Linear Regression:

Yi = β0 + β1Xi + εi, where:

• Yi — outcome of the dependent variable (response) for the ith experimental/sampling unit
• Xi — level of the independent (predictor) variable for the ith experimental/sampling unit
• β0 + β1Xi — linear (systematic) relation between Yi and Xi (aka the conditional mean)
• β0 — mean of Y when X = 0 (Y-intercept)
• β1 — change in the mean of Y when X increases by 1 (slope)
• εi — random error term

Note that β0 and β1 are unknown parameters. We estimate them by the least squares method.
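As a concrete illustration of the model components above, here is a short sketch that generates data from Yi = β0 + β1Xi + εi. The parameter values β0 = 2 and β1 = 0.5 are hypothetical, chosen only for the example; in practice they are unknown.

```python
import random

# Hypothetical "true" parameters (unknown in practice).
beta0, beta1 = 2.0, 0.5

random.seed(1)                                    # reproducible errors
x = [float(i) for i in range(1, 11)]              # predictor levels X1..X10
eps = [random.gauss(0.0, 1.0) for _ in x]         # random error terms eps_i
y = [beta0 + beta1 * xi + e for xi, e in zip(x, eps)]  # responses Yi
```

Each Yi is the systematic part β0 + β1Xi plus its own random error, which is exactly the decomposition listed above.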

Polynomial (Nonlinear) Regression:

This model allows for a curvilinear (as opposed to straight-line) relation, e.g. the quadratic model Yi = β0 + β1Xi + β2Xi² + εi. Both linear and polynomial regression are susceptible to problems when predictions of Y are made outside the range of the X values used to fit the model; this is referred to as extrapolation.
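A minimal sketch of the extrapolation problem, using made-up data from a quadratic relation: a straight line fitted by least squares tracks the data tolerably inside the observed X range but is far off outside it.

```python
# Hypothetical data from a curvilinear relation Y = X^2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [x ** 2 for x in xs]

# Fit a straight line by least squares.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar

def predict(x):
    return b0 + b1 * x

err_inside = abs(predict(5.0) - 5.0 ** 2)      # error at the edge of the data
err_outside = abs(predict(20.0) - 20.0 ** 2)   # error far beyond the data
```

On these numbers the prediction error at X = 20 is roughly two orders of magnitude larger than at X = 5, even though both use the same fitted line.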

Least Squares Estimation (Sec. 11-2)

1. Obtain a sample of n pairs (X1,Y1)…(Xn,Yn).
2. Plot the Y values on the vertical (up/down) axis versus their corresponding X values on the horizontal (left/right) axis.
3. Choose the line that minimizes the sum of squared vertical distances from the observed values (Yi) to their fitted values (Ŷi = b0 + b1Xi).

Note:
• b0 = Ȳ − b1X̄ is the Y-intercept of the estimated regression equation
• b1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)² is the slope of the estimated regression equation
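The steps above can be sketched in Python on a small made-up sample (the numbers are illustrative only):

```python
# Hypothetical sample of n = 5 (X, Y) pairs.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n

# Closed-form least-squares estimates of slope and intercept.
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar

def sse(a, b):
    """Sum of squared vertical distances from the Yi to the line a + b*X."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# The least-squares line beats nearby alternative lines on this criterion.
assert sse(b0, b1) <= sse(b0 + 0.1, b1)
assert sse(b0, b1) <= sse(b0, b1 + 0.1)
```

The assertions spot-check the defining property: perturbing either coefficient can only increase the sum of squared vertical distances.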

Measures of Variation (Sec. 11-3)

Sums of Squares

• Total sum of squares = Regression sum of squares + Error sum of squares: SST = SSR + SSE
• Total variation = Explained variation + Unexplained variation
• Total sum of squares (Total Variation): SST = Σ(Yi − Ȳ)²
• Regression sum of squares (Explained Variation): SSR = Σ(Ŷi − Ȳ)²
• Error sum of squares (Unexplained Variation): SSE = Σ(Yi − Ŷi)²
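The decomposition SST = SSR + SSE can be verified numerically on a small hypothetical sample:

```python
# Hypothetical sample; fit the least-squares line, then compute the sums of squares.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * x for x in xs]                        # fitted values

sst = sum((y - ybar) ** 2 for y in ys)                  # total variation
ssr = sum((yh - ybar) ** 2 for yh in yhat)              # explained variation
sse = sum((y - yh) ** 2 for y, yh in zip(ys, yhat))     # unexplained variation
```

For these numbers SST ≈ 38.508 splits into SSR ≈ 38.416 explained and SSE ≈ 0.092 unexplained.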

Coefficients of Determination and Correlation

Coefficient of Determination

• Proportion of variation in Y “explained” by the regression on X: r² = SSR / SST

Coefficient of Correlation

• Measure of the direction and strength of the linear association between Y and X: r = ±√r², taking the sign of the slope b1
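Continuing with the same hypothetical sample, both coefficients follow directly from the sums of squares:

```python
# Fit the least-squares line on the hypothetical sample, then compute r^2 and r.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * x for x in xs]

sst = sum((y - ybar) ** 2 for y in ys)
ssr = sum((yh - ybar) ** 2 for yh in yhat)

r2 = ssr / sst                              # proportion of variation explained
r = (1 if b1 >= 0 else -1) * r2 ** 0.5      # correlation carries the slope's sign
```

Here r² ≈ 0.998, so nearly all of the variation in Y is explained by the regression, and r is positive because the slope is positive.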

Standard Error of the Estimate (Residual Standard Deviation)

• Estimated standard deviation of the random error term: s = √(SSE / (n − 2))
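On the same hypothetical sample, the standard error of the estimate is computed from SSE with n − 2 degrees of freedom (two are used up estimating b0 and b1):

```python
# Fit the least-squares line on the hypothetical sample, then compute s.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
s = (sse / (n - 2)) ** 0.5      # residual standard deviation, n - 2 df
```

A small s relative to the spread of the Y values means the points sit close to the fitted line.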