I am confused about what is occurring here.

We can remove the trend component in two easy steps. First, identify the overall trend by using the linear model function, lm. The model should use the time series index for the x variable and the time series data for the y variable:

m <- lm(coredata(ts) ~ index(ts))

What is the index? Is the first time point 1, the second time point 2, etc.?
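To see what index() actually returns, here is a minimal sketch, assuming the zoo package is installed (the toy series ts0 and its values are made up; I call it ts0 because ts is also a base R function):

```r
library(zoo)

# Toy series: five made-up values observed at integer times 1..5.
ts0 <- zoo(c(2.1, 4.0, 6.2, 7.9, 10.1), order.by = 1:5)

index(ts0)     # the time index of the series: 1 2 3 4 5
coredata(ts0)  # the raw values, stripped of the index
```

With order.by = 1:n the index really is 1, 2, 3, ..., but a zoo series built on dates would have those dates as its index instead, so it is not always a plain 1, 2, 3 sequence.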

Second, remove the linear trend from the original data by subtracting the straight line found by lm. This is easy because we have access to the linear model's residuals, which are defined by the difference between the original data and the fitted line:

r_i = y_i − (β1 x_i + β0)

where r_i is the ith residual and where β1 and β0 are the model's slope and intercept, respectively.

We can extract the residuals from the linear model by using the resid function and then embed the residuals inside a zoo object:

> detr <- zoo(resid(m), index(ts))
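To convince myself, here is the whole two-step recipe on a made-up series (a sketch assuming the zoo package; ts0 and its numbers are invented):

```r
library(zoo)

# Made-up series with a clear upward trend plus small deviations.
ts0 <- zoo(2 * (1:10) + c(0.3, -0.2, 0.1, 0.4, -0.5, 0.2, -0.1, 0.3, -0.4, 0.1),
           order.by = 1:10)

# Step 1: fit the straight-line trend.
m <- lm(coredata(ts0) ~ index(ts0))

# Step 2: the residuals ARE the detrended series.
detr <- zoo(resid(m), index(ts0))

# Sanity check: original value = fitted trend line + detrended value.
fitted_line <- coef(m)[1] + coef(m)[2] * index(ts0)
all.equal(as.numeric(coredata(ts0)),
          as.numeric(fitted_line + coredata(detr)))  # TRUE
```

The check holds by construction: resid(m) is exactly the original data minus the fitted line, so adding the line back recovers the series.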

I don't understand what is occurring in the second step.

https://www.academia.edu/34707463/Detrending_a_Time_Series_Data

Elsewhere it is suggested that you create a regression line by regressing the time series on x, and then subtract that line from the time series. I think this would give you the residuals. Do you then use the residuals as the detrended time series? If you want to use this to predict future values, I assume you have to convert the residuals back (add the trend back in) after you use them to predict.
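If my reading is right, the round trip would look roughly like this sketch (zoo package assumed; the data, the mean-only stand-in for a residual model, and all variable names are my own invention):

```r
library(zoo)

set.seed(42)
# Made-up series: straight-line trend plus noise.
y <- zoo(2 * (1:10) + rnorm(10, sd = 0.3), order.by = 1:10)

m    <- lm(coredata(y) ~ index(y))   # step 1: fit the trend
detr <- zoo(resid(m), index(y))      # step 2: residuals = detrended series

# Model the residuals however you like; as a crude stand-in, forecast their mean.
future_t       <- 11:13
resid_forecast <- rep(mean(coredata(detr)), length(future_t))

# "Convert back": evaluate the trend line at the future times and add it on.
trend_forecast <- coef(m)[1] + coef(m)[2] * future_t
final_forecast <- trend_forecast + resid_forecast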
