Time Series Basics
Sample ACF and Properties of AR(1) Model
This lesson defines the sample autocorrelation function (ACF) in general and derives the pattern of the ACF for an AR(1) model. Recall from Lesson 1.1 for this week that an AR(1) model is a linear model that predicts the present value of a time series using the immediately prior value in time.
Stationary Series
As a preliminary, we define an important concept, that of a stationary series. For an ACF to make sense, the series must be a weakly stationary series. This means that the autocorrelation for any particular lag is the same regardless of where we are in time.

(Weakly) Stationary Series

A series $x_t$ is said to be weakly stationary if it satisfies the following properties:

- The mean $E(x_t)$ is the same for all $t$.
- The variance of $x_t$ is the same for all $t$.
- The covariance (and also correlation) between $x_t$ and $x_{t-h}$ is the same for all $t$ at each lag $h$ = 1, 2, 3, etc.
Autocorrelation Function (ACF)
Let $x_t$ denote the value of a time series at time $t$. The ACF of the series gives correlations between $x_t$ and $x_{t-h}$ for $h$ = 1, 2, 3, etc. Theoretically, the autocorrelation between $x_t$ and $x_{t-h}$ equals

$$\frac{\text{Covariance}(x_t, x_{t-h})}{\text{Std.Dev.}(x_t)\,\text{Std.Dev.}(x_{t-h})} = \frac{\text{Covariance}(x_t, x_{t-h})}{\text{Variance}(x_t)}$$

The denominator in the second formula occurs because the standard deviation of a stationary series is the same at all times.
The last property of a weakly stationary series says that the theoretical value of the autocorrelation at a particular lag is the same across the whole series. An interesting property of a stationary series is that, theoretically, it has the same structure forwards as it does backward.
Many stationary series have recognizable ACF patterns. Most series that we encounter in practice, however, are not stationary. A continual upward trend, for example, is a violation of the requirement that the mean is the same for all $t$. Distinct seasonal patterns also violate that requirement. The strategies for dealing with nonstationary series will unfold during the first three weeks of the semester.
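In practice, the theoretical ACF is estimated with the sample ACF. As a minimal sketch of how that estimate is computed in R (R code for this week's examples is given in Lesson 1.3), assuming the observed series is stored in a numeric vector `x`; the simulated data below are only there to make the sketch self-contained:

```r
# Sample ACF sketch: assumes the observed series is in a vector x.
# A simulated stationary series stands in for real data here.
set.seed(1)
x <- arima.sim(model = list(ar = 0.6), n = 200)

# acf() estimates Correlation(x_t, x_{t-h}) for h = 0, 1, 2, ...
# and plots the estimates against the lag h.
acf(x, lag.max = 12, main = "Sample ACF")

# The numerical estimates themselves:
acf(x, lag.max = 12, plot = FALSE)
```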
The First-order Autoregression Model
We'll now look at the theoretical properties of the AR(1) model. Recall from Lesson 1.1 that the 1st order autoregression model is denoted as AR(1). In this model, the value of $x$ at time $t$ is a linear function of the value of $x$ at time $t-1$. The algebraic expression of the model is as follows:

$$x_t = \delta + \phi_1 x_{t-1} + w_t$$
Assumptions
- $w_t \overset{iid}{\sim} N(0, \sigma_w^2)$, meaning that the errors are independently distributed with a normal distribution that has mean 0 and constant variance.
- The errors $w_t$ are independent of $x$.
- The series $x_1, x_2, \ldots$ is (weakly) stationary. A requirement for a stationary AR(1) is that $|\phi_1| < 1$. We'll see why below.
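To make the model equation concrete, here is a minimal simulation sketch in R. The parameter values ($\delta = 5$, $\phi_1 = 0.6$, $\sigma_w = 1$) are illustrative assumptions, not estimates from any of the lesson's data:

```r
# Direct simulation of the AR(1) recursion x_t = delta + phi1 * x_{t-1} + w_t.
# delta, phi1, and sigmaw are illustrative values only.
set.seed(123)
n      <- 500
delta  <- 5
phi1   <- 0.6
sigmaw <- 1

w <- rnorm(n, mean = 0, sd = sigmaw)   # errors w_t ~ iid N(0, sigmaw^2)
x <- numeric(n)
x[1] <- delta / (1 - phi1)             # start the series at its process mean
for (t in 2:n) {
  x[t] <- delta + phi1 * x[t - 1] + w[t]
}

plot(x, type = "l", main = "Simulated AR(1) series")
```

Because $|\phi_1| < 1$, the simulated series wanders around a fixed level rather than drifting off, which is the stationarity requirement in action.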
Properties of the AR(1)
Formulas for the mean, variance, and ACF for a time series process with an AR(1) model follow.

- The (theoretical) mean of $x_t$ is

  $E(x_t) = \mu = \dfrac{\delta}{1-\phi_1}$

- The variance of $x_t$ is

  $\text{Var}(x_t) = \dfrac{\sigma_w^2}{1-\phi_1^2}$

- The correlation between observations that are $h$ time periods apart is

  $\rho_h = \phi_1^h$

This defines the theoretical ACF for a time series variable with an AR(1) model.
Note!
Details of the derivations of these properties are in the Appendix to this lesson for interested students.
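Plugging in numbers makes the formulas easy to check. The values of $\phi_1$, $\delta$, and $\sigma_w$ below are assumptions chosen for illustration:

```r
# Evaluate the theoretical AR(1) mean, variance, and ACF for assumed parameters.
phi1   <- 0.6
delta  <- 5
sigmaw <- 1

delta / (1 - phi1)        # theoretical mean mu = 12.5
sigmaw^2 / (1 - phi1^2)   # theoretical variance = 1.5625
phi1^(1:5)                # theoretical ACF rho_h = phi1^h for h = 1, ..., 5

# The same theoretical ACF from R's ARMAacf() (the first entry is lag 0 = 1):
ARMAacf(ar = phi1, lag.max = 5)
```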
Pattern of ACF for AR(1) Model
The ACF property defines a distinct pattern for the autocorrelations. For a positive value of $\phi_1$, the ACF exponentially decreases to 0 as the lag $h$ increases (the tapering pattern). For a negative $\phi_1$, the ACF also decays exponentially to 0 in absolute value as the lag increases, but the algebraic signs of the autocorrelations alternate between positive and negative (the alternating and tapering pattern).

[Figure: theoretical ACF of an AR(1) with a positive $\phi_1$, showing the tapering pattern.]

[Figure: theoretical ACF of an AR(1) with a negative $\phi_1$, showing the alternating and tapering pattern.]
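Both patterns follow directly from $\rho_h = \phi_1^h$ and can be reproduced with a short R sketch; the two values of $\phi_1$ used here are illustrative:

```r
# Theoretical AR(1) ACF patterns for a positive and a negative phi1.
par(mfrow = c(1, 2))

rho_pos <- ARMAacf(ar = 0.6, lag.max = 12)[-1]    # drop the lag 0 entry
plot(1:12, rho_pos, type = "h", ylim = c(-1, 1), xlab = "Lag", ylab = "ACF",
     main = "phi1 = 0.6: tapering")
abline(h = 0)

rho_neg <- ARMAacf(ar = -0.7, lag.max = 12)[-1]
plot(1:12, rho_neg, type = "h", ylim = c(-1, 1), xlab = "Lag", ylab = "ACF",
     main = "phi1 = -0.7: alternating and tapering")
abline(h = 0)
```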

Example 1-3
In Example 1 of Lesson 1.1, we used an AR(1) model for annual earthquakes in the world with a seismic magnitude greater than 7. Here's the sample ACF of the series:
| Lag | ACF |
|---|---|
| 1 | 0.541733 |
| 2 | 0.418884 |
| 3 | 0.397955 |
| 4 | 0.324047 |
| 5 | 0.237164 |
| 6 | 0.171794 |
| 7 | 0.190228 |
| 8 | 0.061202 |
| 9 | -0.048505 |
| 10 | -0.106730 |
| 11 | -0.043271 |
| 12 | -0.072305 |
The sample autocorrelations taper, although not as fast as they should for an AR(1). For instance, theoretically, the lag 2 autocorrelation for an AR(1) equals the squared value of the lag 1 autocorrelation. Here, the observed lag 2 autocorrelation is 0.418884. That's somewhat greater than the squared value of the lag 1 autocorrelation ($0.541733^2 = 0.293$). But we managed to do okay (in Lesson 1.1) with an AR(1) model for the data. For instance, the residuals looked okay. This brings up an important point: the sample ACF will rarely fit a perfect theoretical pattern. A lot of the time, you just have to try a few models to see what fits.
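To see how far the sample ACF departs from a pure $\phi_1^h$ pattern, we can compare the tabled values to powers of the lag 1 autocorrelation. This is only a rough sketch: it treats the lag 1 sample autocorrelation as if it were $\phi_1$:

```r
# Observed sample ACF values for lags 1-5, copied from the table above.
observed <- c(0.541733, 0.418884, 0.397955, 0.324047, 0.237164)

# Theoretical AR(1) pattern rho_h = phi1^h, using the lag 1 value as a stand-in for phi1.
theoretical <- observed[1]^(1:5)

round(cbind(lag = 1:5, observed, theoretical), 3)
# The observed values at lags 2-5 sit above the phi1^h curve,
# i.e., the sample ACF tapers more slowly than a pure AR(1) would suggest.
```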
We'll study the ACF patterns of other ARIMA models during the next three weeks. Each model has a different pattern for its ACF, but in practice, the interpretation of a sample ACF is not always so clear-cut.
A reminder: residuals are usually assumed, theoretically, to have an ACF with correlation = 0 at all lags.
Example 1-4
Here's a time series of the daily cardiovascular mortality rate in Los Angeles County, 1970-1979:
There is a slight downward trend, so the series may not be stationary. To create a (possibly) stationary series, we'll examine the first differences $y_t = x_t - x_{t-1}$.
The time series plot of the first differences is the following:

The following plot is the sample estimate of the autocorrelation function of 1st differences:

| Lag | ACF |
|---|---|
| 1 | -0.506029 |
| 2 | 0.205100 |
| 3 | -0.126110 |
| 4 | 0.062476 |
| 5 | -0.015190 |
This looks like the pattern of an AR(1) with a negative lag 1 autocorrelation.
The lag 2 correlation is roughly equal to the squared value of the lag 1 correlation. The lag 3 correlation is nearly exactly equal to the cubed value of the lag 1 correlation, and the lag 4 correlation nearly equals the fourth power of the lag 1 correlation. Thus an AR(1) model may be a suitable model for the first differences.

Let $y_t = x_t - x_{t-1}$ denote the first differences. The AR(1) model for the first differences is then $y_t = \delta + \phi_1 y_{t-1} + w_t$.
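A minimal R sketch of this differencing-and-ACF workflow follows; the full code comes in Lesson 1.3. It assumes the mortality series is available as `cmort` from the `astsa` package, which is an assumption about where the data live rather than part of the lesson:

```r
# First differences of the mortality series and their sample ACF.
# Assumes the series is astsa::cmort; substitute your own series if it lives elsewhere.
library(astsa)

y <- diff(cmort)              # first differences y_t = x_t - x_{t-1}
plot(y, type = "l", main = "First differences of the mortality series")

acf(y, lag.max = 12)          # sample ACF of the first differences
acf(y, lag.max = 12, plot = FALSE)
```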
Some R code for this example will be given in Lesson 1.3 for this week.
Appendix: Derivations of Properties of AR(1)
Generally, you won't be responsible for reproducing theoretical derivations, but interested students may want to see the derivations for the theoretical properties of an AR(1). The algebraic expression of the model is as follows:

$$x_t = \delta + \phi_1 x_{t-1} + w_t$$
Assumptions
- $w_t \overset{iid}{\sim} N(0, \sigma_w^2)$, meaning that the errors are independently distributed with a normal distribution that has mean 0 and constant variance.
- The errors $w_t$ are independent of $x$.
- The series $x_1, x_2, \ldots$ is (weakly) stationary. A requirement for a stationary AR(1) is that $|\phi_1| < 1$. We'll see why below.
Mean

$$E(x_t) = E(\delta + \phi_1 x_{t-1} + w_t) = \delta + \phi_1 E(x_{t-1}) + 0$$

With the stationary assumption, $E(x_t) = E(x_{t-1})$. Call this common mean $\mu$. Then $\mu = \delta + \phi_1 \mu$, and solving for $\mu$ gives

$$\mu = \frac{\delta}{1-\phi_1}$$
Variance

$$\text{Var}(x_t) = \text{Var}(\delta) + \text{Var}(\phi_1 x_{t-1}) + \text{Var}(w_t) = \phi_1^2 \text{Var}(x_{t-1}) + \sigma_w^2$$

By the stationary assumption, $\text{Var}(x_t) = \text{Var}(x_{t-1})$. Substitute $\text{Var}(x_t)$ for $\text{Var}(x_{t-1})$ and then solve for $\text{Var}(x_t)$:

$$\text{Var}(x_t) = \frac{\sigma_w^2}{1-\phi_1^2}$$

Because $\text{Var}(x_t) > 0$, it follows that $1 - \phi_1^2 > 0$ and therefore $|\phi_1| < 1$.
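As a quick sanity check on these two formulas, here is a small simulation sketch; the parameter values are illustrative and not part of the original derivation:

```r
# Monte Carlo check of E(x_t) = delta/(1 - phi1) and Var(x_t) = sigmaw^2/(1 - phi1^2).
set.seed(42)
phi1   <- 0.7
delta  <- 2
sigmaw <- 1.5

# arima.sim() produces a zero-mean AR(1); add the theoretical mean to include delta.
x <- arima.sim(model = list(ar = phi1), n = 1e5, sd = sigmaw) + delta / (1 - phi1)

c(sample_mean = mean(x), theoretical_mean = delta / (1 - phi1))
c(sample_var  = var(x),  theoretical_var  = sigmaw^2 / (1 - phi1^2))
```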
Autocorrelation Function (ACF)
To start, assume the data have mean 0, which happens when $\delta = 0$, so that $x_t = \phi_1 x_{t-1} + w_t$. (Variances, covariances, and correlations are not affected by the specific value of the mean.) Let $\gamma_h$ denote the covariance, and $\rho_h$ the correlation, between observations that are $h$ time periods apart.
Covariance and correlation between observations one time period apart

$$\gamma_1 = E(x_t x_{t+1}) = E\big(x_t(\phi_1 x_t + w_{t+1})\big) = E\big(\phi_1 x_t^2 + x_t w_{t+1}\big) = \phi_1 \text{Var}(x_t)$$

$$\rho_1 = \frac{\gamma_1}{\text{Var}(x_t)} = \phi_1$$

Covariance and correlation between observations $h$ time periods apart

If we start at $\gamma_1 = \phi_1 \text{Var}(x_t)$ and move recursively forward, we get $\gamma_h = \phi_1^h \text{Var}(x_t)$. By definition, $\rho_h = \dfrac{\gamma_h}{\text{Var}(x_t)}$, so this is $\phi_1^h$. The correlation between observations that are $h$ time periods apart is therefore $\rho_h = \phi_1^h$, the theoretical ACF stated in the properties above.
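A final empirical check: the sample ACF of a long simulated AR(1) should track $\rho_h = \phi_1^h$ closely. The value of $\phi_1$ here is again just an illustrative assumption:

```r
# Compare the sample ACF of a long simulated AR(1) to the theoretical phi1^h.
set.seed(7)
phi1 <- 0.6
x    <- arima.sim(model = list(ar = phi1), n = 1e5)

sample_acf <- acf(x, lag.max = 6, plot = FALSE)$acf[-1]   # drop the lag 0 entry
round(cbind(lag = 1:6, sample_acf, theoretical = phi1^(1:6)), 3)
```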