After we use the method of least squares to calculate the regression coefficients (b0 and b1) and validate the LINE assumptions, we turn to evaluating the regression, specifically the slope b1, and ask two questions:
1. Is it statistically significant?
2. What is the confidence interval for b1?
The first question (which we actually covered after the second one in class), whether b1 is statistically significant, amounts to asking: is the fitted line any better than a flat horizontal line through the data?
We answer it by hypothesizing that the true population slope, β1, is 0 and using our hypothesis-testing skills to decide whether we should reject that hypothesis.
H0: β1 = 0
H1: β1 ≠ 0
The t statistic that we use to test the hypothesis is:
t = (b1-β1)/Sb1
where Sb1 is the standard error of the slope.
In our case, β1 is 0 according to our hypothesis, so t reduces to:
t = b1/Sb1
The standard error of the slope, Sb1, is defined as:
Sb1 = SXY/SQRT(SSX)
where SXY is the standard error of the estimate and SSX is the sum of squared deviations of X about its mean.
The standard error of the estimate, SXY, is defined as:
SXY = SQRT(SSE/(n-2))
where SSE is the sum of squared errors (residuals) and n is the number of observations.
So, if we have our calculations of SSX and SSE, we can do the math and find Sb1 and the t-score for b1.
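Here is a minimal sketch of these calculations in Python with NumPy, using a small made-up dataset (the x and y values are purely illustrative, not from the lecture):

```python
import numpy as np

# Small made-up dataset, purely for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
n = len(x)

# Least-squares coefficients b0 and b1.
x_bar, y_bar = x.mean(), y.mean()
SSX = np.sum((x - x_bar) ** 2)               # sum of squared deviations of X
SSXY = np.sum((x - x_bar) * (y - y_bar))     # cross-product sum
b1 = SSXY / SSX
b0 = y_bar - b1 * x_bar

# Sum of squared errors and the standard error of the estimate, SXY.
SSE = np.sum((y - (b0 + b1 * x)) ** 2)
SXY = np.sqrt(SSE / (n - 2))

# Standard error of the slope and the t statistic for H0: beta1 = 0.
Sb1 = SXY / np.sqrt(SSX)
t_stat = b1 / Sb1
print(f"b1 = {b1:.4f}, Sb1 = {Sb1:.4f}, t = {t_stat:.2f}")
```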
We finish our hypothesis testing by comparing the t-score for b1 to the critical value tα/2, n-2, where α is our level of significance and n − 2 is the degrees of freedom.
If t falls beyond tα/2, n-2 (on either the positive or negative end), we conclude that H0 must be rejected.
We could also base the conclusion on the p-value of the t-score: if the two-tailed p-value is less than α, then we reject H0.
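Continuing the sketch above (it reuses n and t_stat from that code), the critical value and the two-tailed p-value can be obtained from scipy.stats; the α = 0.05 level is just an example:

```python
from scipy import stats

alpha = 0.05                                 # example significance level
df = n - 2                                   # degrees of freedom

t_crit = stats.t.ppf(1 - alpha / 2, df)      # critical value t_{alpha/2, n-2}
p_value = 2 * stats.t.sf(abs(t_stat), df)    # two-tailed p-value

reject_H0 = abs(t_stat) > t_crit             # equivalently: p_value < alpha
print(f"t = {t_stat:.2f}, critical value = {t_crit:.2f}, "
      f"p-value = {p_value:.4g}, reject H0: {reject_H0}")
```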
**Confidence interval for b1 will be covered in the next blog post.**
Tuesday, March 4, 2008
Lecture 8 - Inferences About the Regression Slope