Inferential Statistics for b and r
- State the assumptions that inferential statistics in regression are based upon
- Identify heteroscedasticity in a scatter plot
- Compute the standard error of a slope
- Test a slope for significance
- Construct a confidence interval on a slope
- Test a correlation for significance
This section shows how to conduct significance
tests and compute confidence intervals for the regression slope
and Pearson's correlation. As you will see, if the regression
slope is significantly different from zero, then the correlation
coefficient is also significantly different from zero.
Although no assumptions were needed to determine the best-fitting straight line, assumptions are made in the calculation of inferential statistics. Naturally, these assumptions refer to the population, not the sample.
- Linearity: The relationship between the two variables is linear.
- Homoscedasticity: The variance around the regression line is the same for all values of X.
A clear violation of this assumption is shown in Figure 1. Notice that the predictions
for students with high high-school GPAs are very good, whereas
the predictions for students with low high-school GPAs are not
very good. In other words, the points for students with high
high-school GPAs are close to the regression line, whereas the
points for low high-school GPA students are not.
Figure 1. University GPA as a function of High School GPA.
- Normality: The errors of prediction are distributed normally. This means that the deviations from the regression line are normally distributed. It does not mean that X or Y is normally distributed.
Significance Test for the Slope (b)
Recall the general formula for a t test:

t = (statistic - hypothesized value) / (estimated standard error of the statistic)

As applied here, the statistic is the sample value of the slope (b) and the hypothesized value is 0. The number of degrees of freedom for this test is:

df = N - 2

where N is the number of pairs of scores.
The estimated standard error of b is computed using the following formula:

s_b = s_est / sqrt(SSX)

where s_b is the estimated standard error of b, s_est is the standard error of the estimate, and SSX is the sum of squared deviations of X from the mean of X. SSX is calculated as

SSX = Σ(X - M_X)²

where M_X is the mean of X. As shown previously, the standard error of the estimate can be calculated as

s_est = sqrt( Σ(Y - Y')² / (N - 2) )
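As a concrete check, these quantities can be computed directly in a short Python sketch (assuming the example data from the introductory section: X = 1, 2, 3, 4, 5 and Y = 1.00, 2.00, 1.30, 3.75, 2.25):

```python
import math

# Example data from the introductory section (assumed here)
X = [1, 2, 3, 4, 5]
Y = [1.00, 2.00, 1.30, 3.75, 2.25]
N = len(X)

mx = sum(X) / N
my = sum(Y) / N

# Least-squares slope and intercept
ssx = sum((x - mx) ** 2 for x in X)
sp = sum((x - mx) * (y - my) for x, y in zip(X, Y))
b = sp / ssx                       # slope, 0.425
a = my - b * mx                    # intercept

# Standard error of the estimate: sqrt(sum of squared residuals / (N - 2))
sse = sum((y - (a + b * x)) ** 2 for x, y in zip(X, Y))
s_est = math.sqrt(sse / (N - 2))   # about 0.964

# Estimated standard error of the slope
s_b = s_est / math.sqrt(ssx)       # about 0.305
```

The same three numbers (b, s_est, s_b) reappear in the worked example below.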
These formulas are illustrated with the data shown in Table 1. These data are reproduced from the introductory section. The X column has the values of the predictor variable and the Y column has the values of the criterion variable. The third column, x, contains the differences between the values of column X and the mean of X. The fourth column, x², is the square of the x column. The fifth column, y, contains the differences between the values of column Y and the mean of Y. The last column, y², is simply the square of the y column.
Table 1. Example data.

X       Y       x       x²      y       y²
1.00    1.00    -2.00   4.00    -1.06   1.1236
2.00    2.00    -1.00   1.00    -0.06   0.0036
3.00    1.30     0.00   0.00    -0.76   0.5776
4.00    3.75     1.00   1.00     1.69   2.8561
5.00    2.25     2.00   4.00     0.19   0.0361
Sum     15.00   10.30   0.00    10.00   0.00    4.5970
The computation of the standard error of the estimate (s_est) for these data is shown in the section on the standard error of the estimate. It is equal to 0.964.
SSX is the sum of squared deviations from the mean of X. It is therefore equal to the sum of the x² column, which is 10.
We now have all the information needed to compute the standard error of b:

s_b = 0.964 / sqrt(10) = 0.305

As shown previously, the slope (b) is 0.425. Therefore,

t = 0.425 / 0.305 = 1.39

df = N - 2 = 5 - 2 = 3
The p value for a two-tailed test is 0.26. Therefore, the slope is not significantly different from 0.
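The test above can be sketched in Python (a minimal sketch, assuming SciPy is available for the t distribution; b and s_b are the values computed in the text):

```python
from scipy import stats  # assumed available for t-distribution probabilities

b, s_b, N = 0.425, 0.305, 5   # slope, its standard error, and sample size from the text
t = b / s_b                   # about 1.39
df = N - 2                    # 3

# Two-tailed p value: twice the upper-tail probability of |t|
p = 2 * stats.t.sf(abs(t), df)   # about 0.26, not significant at the 0.05 level
```

Since p is well above 0.05, the null hypothesis that the population slope is 0 is not rejected, as the text states.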
Confidence Interval for the Slope
The method for computing a confidence interval for the population slope is very similar to methods for computing other confidence intervals. For the 95% confidence interval, the formula is:

lower limit = b - (t_0.95)(s_b)
upper limit = b + (t_0.95)(s_b)

where t_0.95 is the value of t to use for the 95% confidence interval.
The values of t to be used in a confidence interval can be looked up in a table of the t distribution. A small version of such a table is shown in Table 2. The first column, df, stands for degrees of freedom.
Table 2. Abbreviated t table.

df      0.95    0.99
2       4.303   9.925
3       3.182   5.841
4       2.776   4.604
5       2.571   4.032
8       2.306   3.355
You can also use the "inverse distribution" calculator to find the t values to use in a confidence interval.
Applying these formulas to the example data,

lower limit = 0.425 - (3.182)(0.305) = -0.55
upper limit = 0.425 + (3.182)(0.305) = 1.40
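The same interval can be computed without a printed t table (a sketch, assuming SciPy is available; b = 0.425 and s_b = 0.305 come from the text):

```python
from scipy import stats  # assumed available for t-distribution quantiles

b, s_b, df = 0.425, 0.305, 3

# For a two-tailed 95% interval, take the 0.975 quantile of t with df = 3
t_95 = stats.t.ppf(0.975, df)    # about 3.182, matching Table 2

lower = b - t_95 * s_b           # about -0.55
upper = b + t_95 * s_b           # about 1.40
```

Because the interval spans 0, this agrees with the significance test: the slope is not significantly different from 0.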
Significance Test for the Correlation
The formula for a significance test of Pearson's correlation is shown below:

t = r sqrt(N - 2) / sqrt(1 - r²)

where N is the number of pairs of scores. For the example data, r = 0.627 and N = 5. Therefore,

t = 0.627 sqrt(5 - 2) / sqrt(1 - 0.627²) = 1.39
Notice that this is the same t value obtained in the t test of b. As in that test, the degrees of freedom is N - 2 = 5 - 2 = 3.
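The equivalence of the two tests can be verified in plain Python (a sketch, assuming the example data from the introductory section; here r is computed from the data rather than taken as given):

```python
import math

# Example data from the introductory section (assumed here)
X = [1, 2, 3, 4, 5]
Y = [1.00, 2.00, 1.30, 3.75, 2.25]
N = len(X)
mx, my = sum(X) / N, sum(Y) / N

# Pearson's correlation r
num = sum((x - mx) * (y - my) for x, y in zip(X, Y))
den = math.sqrt(sum((x - mx) ** 2 for x in X) *
                sum((y - my) ** 2 for y in Y))
r = num / den                                       # about 0.627

# Significance test for r: same t as the slope test
t = r * math.sqrt(N - 2) / math.sqrt(1 - r ** 2)    # about 1.39
```

Both routes give t = 1.39 with df = 3, illustrating that testing b = 0 and testing r = 0 are the same test.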
Source: David M. Lane, https://onlinestatbook.com/2/regression/inferential.html
This work is in the Public Domain.