Inferential Methods in Regression and Correlation
Chapter 11
Back to Ch.3 (Linear Regression):
• Recall Simple Linear Regression:
  • Fit a line to the data when you see a linear trend
  • Minimize the errors using the Least Squares (LS) method
  • Get estimates of the slope and intercept accordingly
  • Residuals are random
• In this chapter, we introduce regression as a sample result that we want to draw inference from
Concept
• Remember when we used X̄ to estimate μ? What did we do?
  • Used confidence intervals to give a guess at where μ falls
  • Used hypothesis testing to check specific hypotheses about μ
• Treat the regression line similarly
• Need to understand the sampling distribution again!
Regression line
• We learned the regression line as ŷ = a + bx
• That is the sample regression line
• The true regression line we write as a model:
  yi = α + βxi + ei
• In this model:
  • ei is the "error" term for the ith observation
  • the part without the error term, α + βxi, is called the population regression line
  • this means that, without the error term, every point would fall exactly on the line
  • ei is assumed to follow a normal distribution with mean 0 and standard deviation σ
  • Additionally, all ei's are assumed independent of each other
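A minimal sketch of what this model means, simulating data from yi = α + βxi + ei with invented parameter values (α = 2, β = 0.5, σ = 1 are assumptions chosen purely for illustration):

```python
import numpy as np

# Simulate data from the model y_i = alpha + beta*x_i + e_i, e_i ~ N(0, sigma).
# The parameter values here are made up for illustration only.
rng = np.random.default_rng(seed=1)

alpha, beta, sigma = 2.0, 0.5, 1.0   # hypothetical "true" intercept, slope, error SD
x = np.linspace(0, 10, 30)           # fixed x values
e = rng.normal(loc=0.0, scale=sigma, size=x.size)  # independent normal errors
y = alpha + beta * x + e             # observed responses scatter around the line

# Without the error term e, every point would fall exactly on the
# population regression line alpha + beta*x.
print(y[:5])
```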
Let's visualize using an example
• Suppose we use Age to predict Blood Pressure
• Which is X? Which is Y?
• Draw a picture...
• For any fixed x, the dependent y has a normal distribution
• The mean of y falls on the "population regression line"
• Another way to say the same thing is just:
  ei ~ N(0, σ)
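A quick numerical sketch of this picture: fix one age and simulate many blood pressures at that age. The model values below (intercept 90, slope 0.9, σ = 8) are invented for illustration, not real blood-pressure data:

```python
import numpy as np

# "For any fixed x, y has a normal distribution whose mean falls on the
# population regression line." Parameter values are hypothetical.
rng = np.random.default_rng(seed=2)

alpha, beta, sigma = 90.0, 0.9, 8.0   # assumed model: mean BP = 90 + 0.9*Age
age = 40                              # fix one value of x
bp = alpha + beta * age + rng.normal(0.0, sigma, size=10_000)  # many y's at this x

print("mean of y at age 40:", bp.mean())       # close to alpha + beta*40 = 126
print("SD of y at age 40:  ", bp.std(ddof=1))  # close to sigma = 8
```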
Estimating the slope and intercept
• Still apply the same formulas from Chapter 3 for the Least Squares estimates
• The Least Squares method gives:
  b = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²
  a = ȳ − b·x̄
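A short sketch of these Chapter 3 formulas in code, applied to a small made-up Age/Blood Pressure sample (the numbers are assumptions for illustration only):

```python
import numpy as np

# Least squares slope and intercept from the Chapter 3 formulas.
x = np.array([25, 30, 35, 40, 45, 50, 55, 60], dtype=float)          # e.g. Age
y = np.array([112, 118, 117, 127, 129, 135, 140, 142], dtype=float)  # e.g. BP

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)  # slope
a = y.mean() - b * x.mean()                                                # intercept

print(f"b (slope estimate)     = {b:.3f}")
print(f"a (intercept estimate) = {a:.3f}")
```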
Only estimates
• Of course, these are only sample estimates
• If we took a different sample, we would get different estimates
• Need to use these estimates a and b to draw inference about the "real" slope and intercept, β and α
• Need to know the sampling distribution...
• No problem. We already know it's normal; the important statement is this again:
  ei ~ N(0, σ)
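A simulation sketch of this idea: draw many samples from the same (assumed) model and watch the slope estimate b change from sample to sample. The parameter values are invented for illustration:

```python
import numpy as np

# Repeat the sampling many times to see the sampling distribution of b.
rng = np.random.default_rng(seed=3)
alpha, beta, sigma = 2.0, 0.5, 1.0    # hypothetical true parameters
x = np.linspace(0, 10, 30)

b_estimates = []
for _ in range(5_000):
    y = alpha + beta * x + rng.normal(0.0, sigma, size=x.size)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b_estimates.append(b)

b_estimates = np.array(b_estimates)
print("mean of b over samples:", b_estimates.mean())  # close to beta = 0.5
print("SD of b over samples:  ", b_estimates.std())   # the standard error of b
```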
Estimating the error variance
• From the model, we know that ei ~ N(0, σ)
• To estimate σ, we use the residuals
• After the slope and intercept are estimated, the residuals are calculated as: êi = yi − ŷi
• SSE is calculated as: SSE = Σ êi² = Σ(yi − ŷi)²
• It is used to estimate σ by:
  se = √(SSE / (n − 2)) = √MSE
• Why n − 2? We are calculating two estimates, a and b, hence we lose 2 df
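A minimal sketch of this calculation, continuing the made-up Age/Blood Pressure sample used above (the data are assumptions for illustration only):

```python
import numpy as np

# Residuals, SSE, and s_e = sqrt(SSE / (n - 2)) for the fitted line.
x = np.array([25, 30, 35, 40, 45, 50, 55, 60], dtype=float)
y = np.array([112, 118, 117, 127, 129, 135, 140, 142], dtype=float)

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

y_hat = a + b * x                 # fitted values
resid = y - y_hat                 # residuals e_hat_i
sse = np.sum(resid ** 2)          # SSE = sum of squared residuals
n = x.size
se = np.sqrt(sse / (n - 2))       # estimate of sigma; n - 2 because 2 df are lost

print(f"SSE = {sse:.3f},  s_e = {se:.3f}")
```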