4.1 Time series analysis in SPSS (17 and higher)

4.1.1 Nonlinear Growth curves in SPSS

This is data from an iteration of the logistic growth differential equation you are familiar with by now, but let’s pretend it’s data from one subject measured on 100 occasions.

  1. Plot Y(t) against Time. Do you recognize the shape?
  2. To get the growth parameters we'll try to fit the analytic solution of the logistic flow with SPSS nonlinear regression.
    • Select Nonlinear… from the Analyze >> Regression menu.
    • Here we can build the solution equation. We need three parameters: a. Yzero, the initial condition. b. K, the carrying capacity. c. r, the growth rate.
    • Fill these in where it says Parameters; give all parameters a starting value of \(0.01\).
  3. Take a good look at the analytic solution of the (stylized) logistic flow:

\[ Y(t) = \frac{K * Y_0}{Y_0 + \left(K-Y_{0}\right) * e^{(-K*r*t)} } \]

Try to build this equation. The function for \(e\) is called EXP in SPSS (Function Group >> Arithmetic). Group terms by using parentheses as shown in the equation.
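
When you paste the syntax from the dialog it should resemble the sketch below. Treat it as a sketch only: the names Yt (the measured series) and Time are assumptions and may differ in your data file, and the dialog will add a few extra subcommands. The /SAVE PRED line corresponds to the Save >> Predicted Values option mentioned in the next step.

      * Sketch of the pasted nonlinear regression syntax; the variable names Yt and Time are assumptions.
      MODEL PROGRAM Yzero=0.01 K=0.01 r=0.01.
      COMPUTE PRED_LOGIS = (K * Yzero) / (Yzero + (K - Yzero) * EXP(-1 * K * r * Time)).
      NLR Yt
        /PRED=PRED_LOGIS
        /SAVE PRED.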

  1. If you think you have built the model correctly, click on Save and choose Predicted Values. Then paste your syntax and run it!
    • Check the estimated parameter values.
    • Check \(R^2\)!!!
  2. Plot a line graph of both the original data and the predicted values. (Smile)

  3. A polynomial fishing expedition:
    • Create time-varying covariates of \(Y(t)\):

      COMPUTE T1=Yt * Time.
      COMPUTE T2=Yt * (Time ** 2). 
      COMPUTE T3=Yt * (Time ** 3). 
      COMPUTE T4=Yt * (Time ** 4). 
      EXECUTE.
    • Use these variables as predictors of \(Y(t)\) in a regular linear regression analysis. This is called a polynomial regression: fitting combinations of curves of different shapes to the data.
    • Before you run the analysis: click Save and choose Predicted Values: Unstandardized.
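
If you paste the analysis as syntax it should look roughly like the sketch below (again assuming the dependent variable is named Yt). The /SAVE PRED subcommand stores the unstandardized predicted values, by default as PRE_1.

      * Sketch of the polynomial regression syntax; the dependent variable name Yt is an assumption.
      REGRESSION
        /STATISTICS COEFF R ANOVA
        /DEPENDENT Yt
        /METHOD=ENTER T1 T2 T3 T4
        /SAVE PRED.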

  4. Look at \(R^2\). This is also almost 1. Which model is better? Think about this: Based on the results of the linear regression, what can you tell about the growth rate, the carrying capacity or the initial condition?

  5. Create a line graph in which you plot \(Y(t)\), the predicted values of the nonlinear regression and the unstandardized predicted values of the linear polynomial regression against time in one figure.
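
One way to get all three curves in one figure is a multiple line chart (Graphs >> Legacy Dialogs >> Line, Summaries of separate variables). The sketch below assumes the nonlinear predictions ended up in a variable called PRED_NLR (a hypothetical name; check the data file for the name SPSS actually used) and the polynomial predictions in PRE_1.

      * Multiple line chart of the data and both sets of predicted values; PRED_NLR is a hypothetical name.
      GRAPH
        /LINE(MULTIPLE)=MEAN(Yt) MEAN(PRED_NLR) MEAN(PRE_1) BY Time.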

  6. Now you can see that the shape is approximated by the polynomials, but it is not quite the same. Is this really a model of a growth process as we could encounter it in nature?

| jump to solution |

4.1.2 Correlation functions and ARMA models

  1. Download the file series.sav from Blackboard. It contains three time series: TS_1, TS_2 and TS_3. As a first step, look at the mean and the standard deviation (Analyze >> Descriptives). Suppose these were time series from three subjects in an experiment; what would you conclude based on the means and SDs?
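
The same descriptives can also be obtained from syntax, for example:

      DESCRIPTIVES VARIABLES=TS_1 TS_2 TS_3
        /STATISTICS=MEAN STDDEV.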

  2. Let’s visualize these data. Go to Analyze >> Forecasting >> Sequence Charts. Check the box One chart per variable and move all the variables to Variables. Are they really the same?

  3. Let’s look at the ACF and PACF.
    • Go to Analyze >> Forecasting >> Autocorrelations.
    • Enter all the variables and make sure both Autocorrelations (ACF) and Partial autocorrelations (PACF) boxes are checked. Click Options, and change the Maximum Number of Lags to 30.
    • Use the table to characterize the time series (a syntax sketch for this step follows the table):
| Shape | Indicated model |
|---|---|
| Exponential, decaying to zero | Autoregressive model. Use the partial autocorrelation plot to identify the order of the autoregressive model. |
| Alternating positive and negative, decaying to zero | Autoregressive model. Use the partial autocorrelation plot to help identify the order. |
| One or more spikes, rest are essentially zero | Moving average model, order identified by where plot becomes zero. |
| Decay, starting after a few lags | Mixed autoregressive and moving average model. |
| All zero or close to zero | Data is essentially random. |
| High values at fixed intervals | Include seasonal autoregressive term. |
| No decay to zero | Series is not stationary. |
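
The pasted syntax for the autocorrelation step above should resemble this sketch:

      ACF VARIABLES=TS_1 TS_2 TS_3
        /MXAUTO 30
        /PACF.
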
  1. You should have identified just one time series with autocorrelations: TS_2. Try to fit an ARIMA(p,0,q) model to this time series.
    • Go to Analyze >> Forecasting >> Create Model, and at Method (Expert modeler) choose ARIMA.
    • Look back at the PACF to identify which order (p) you need (the last lag at which the partial autocorrelation is still significant). This lag value should go in the Autoregressive (p) box.
    • Start with a Moving Average q of one. The time series variable TS_2 is the Dependent.
    • You can check the statistical significance of the parameters in the output by checking the box Parameter Estimates under Statistics.
    • This value for p is probably too high, because not all AR parameters are significant.
    • Run ARIMA again and decrease the number of AR parameters by leaving out the non-significant ones.
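
You can also refit the model from syntax. The sketch below uses the legacy ARIMA command (available with the Forecasting add-on) rather than the Create Model dialog; the order (2,0,1) is only a placeholder, so fill in the orders you settled on. Like the dialog, it should add fit values and confidence limits (FIT, LCL and UCL series) to the data file.

      * Sketch using the legacy ARIMA command; the order (2,0,1) is a placeholder.
      ARIMA TS_2
        /MODEL=(2,0,1) CONSTANT.
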
  2. By default SPSS saves the predicted values and 95% confidence limits (check the data file). We can now check how good the prediction is: Go to Graphs >> Legacy Dialogs >> Line. Select Multiple and Summaries of Separate Variables. Now enter TS_2, Fit_X, LCL_X and UCL_X in Lines Represent. X should be the number of the last (best) model you fitted, probably 2. Enter TIME as the Category Axis.
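
The pasted syntax for this chart should look roughly like the sketch below, assuming the saved series are named Fit_2, LCL_2 and UCL_2 and the time index variable is called TIME:

      * Multiple line chart of the observed series, the model fit and the 95% confidence limits.
      GRAPH
        /LINE(MULTIPLE)=MEAN(TS_2) MEAN(Fit_2) MEAN(LCL_2) MEAN(UCL_2) BY TIME.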

  3. In the simulation part of this course we have learned a very simple way to explore the dynamics of a system: The return plot. The time series is plotted against itself shifted by 1 step in time.
    • Create return plots (use a Scatterplot) for the three time series. Tip: You can easily create a shifted (lag 1) version of a time series by using the LAG function in a COMPUTE statement. For instance:

      COMPUTE TS_1_lag1 = LAG(TS_1).
      EXECUTE.
    • Are your conclusions about the time series the same as in step 3, after interpreting these return plots?
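
As a sketch, a return plot for TS_1 can then be produced from syntax with the lagged variable computed above; the same pattern works for TS_2 and TS_3.

      * Return plot: scatter the series against its own lag-1 version.
      GRAPH
        /SCATTERPLOT(BIVAR)=TS_1_lag1 WITH TS_1.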

| jump to solution |