For the benefit of those doing .NET coding over TSF, this post explains how easy it is to programmatically get properties from a trained state space model.

When you are dealing with TSF objects at the .NET level, do not treat a State Space Model (SSM) as a proverbial Big Black Box of Goo.

In fact, an SSM object is very transparent and easy to understand.

In most scenarios you need two or three things from an SSM object:

- Given a current state vector, compute the next predicted value;
- Given also the prediction error, compute the next state;
- (Occasionally) find out how many parameters the model has.

All of these are defined at the interface level. The method for the first is AdvanceValue, the method for the second is AdvanceState, and the property for the last is called ParameterCount.
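As a mental model only, the solver interface can be pictured like the sketch below. The member names match the ones discussed above, but the exact parameter lists are an assumption on my part; consult the actual TSF assemblies for the real signatures.

```csharp
// Hypothetical sketch of the solver interface, inferred from the
// calls used in the exercises below; not the authoritative definition.
public interface IScalarModelSolver
{
    // Number of free parameters in the model
    int ParameterCount { get; }

    // Predict the next value from the current state
    double AdvanceValue(IList<double> parameters, int time, SparseVector state);

    // Produce the next state, given the prediction error at this step
    SparseVector AdvanceState(IList<double> parameters, int time,
                              SparseVector state, double predictionError);
}
```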

## Exercise 1. Smoothing out a time series using a trained model

The following code gets the sequence of one-step model forecasts (also known as “smoothing given the model”):

```csharp
static List<double> QuickSmoothing(IList<double> timeSeriesData,
                                   IList<double> modelParameters,
                                   SparseVector initialState,
                                   IScalarModelSolver model)
{
    List<double> smoothing = new List<double>(timeSeriesData.Count);
    SparseVector X = initialState; // Initialize the time series state

    // Starting with the initial state, iterate through the data
    for (int i = 0; i < timeSeriesData.Count; i++)
    {
        // Generate the next prediction given the model and state
        double yhat = model.AdvanceValue(modelParameters, i, X);
        smoothing.Add(yhat); // Record the one-step forecast

        // Correct the state based on the current misprediction
        double misPrediction = timeSeriesData[i] - yhat;
        X = model.AdvanceState(modelParameters, i, X, misPrediction);
    }
    return smoothing;
}
```

## Exercise 2. Getting the error distribution of a trained model

You need the error distribution in order to evaluate the quality of the model and to estimate forecast uncertainties. The code is strikingly similar to the previous exercise; again, AdvanceValue and AdvanceState are all you need.

```csharp
static IScalarDistribution QuickErrorDistribution(
    IList<double> timeSeriesData,
    IList<double> modelParameters,
    SparseVector initialState,
    IScalarModelSolver model)
{
    List<double> errors = new List<double>(timeSeriesData.Count);
    SparseVector X = initialState; // Initialize the time series state

    // Starting with the initial state, iterate through the data
    for (int i = 0; i < timeSeriesData.Count; i++)
    {
        // Generate the next prediction given the model and state
        double yhat = model.AdvanceValue(modelParameters, i, X);
        double misPrediction = timeSeriesData[i] - yhat;
        errors.Add(misPrediction); // Record the one-step prediction error

        // Correct the state based on the current misprediction
        X = model.AdvanceState(modelParameters, i, X, misPrediction);
    }
    return new NonparametricScalarDistribution(errors);
}
```

## Exercise 3. Computing the training score of a trained model

The following code computes an equivalent of the BIC score of a model:
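For reference, assuming the standard definition, the Bayesian Information Criterion penalizes log-likelihood by the parameter count:

$$\mathrm{BIC} = k \ln n - 2 \ln \hat{L}$$

where $k$ is the number of parameters, $n$ the number of observations, and $\hat{L}$ the likelihood of the data under the model. The function below returns a rescaled score with the opposite orientation (larger is better): the log-likelihood minus a per-observation penalty of $k \ln(n) / n$.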

```csharp
static double QuickBIC(IList<double> timeSeriesData,
                       IList<double> modelParameters,
                       SparseVector initialState,
                       IScalarModelSolver model)
{
    List<double> errors = new List<double>(timeSeriesData.Count);
    SparseVector X = initialState; // Initialize the time series state

    // Starting with the initial state, iterate through the data
    for (int i = 0; i < timeSeriesData.Count; i++)
    {
        // Generate the next prediction given the model and state
        double yhat = model.AdvanceValue(modelParameters, i, X);
        double misPrediction = timeSeriesData[i] - yhat;
        errors.Add(misPrediction); // Record the one-step prediction error

        // Correct the state based on the current misprediction
        X = model.AdvanceState(modelParameters, i, X, misPrediction);
    }

    IScalarDistribution errorDist =
        new NonparametricScalarDistribution(errors);

    // Now compute the log-likelihood of the data given the model
    double LL = 0.0;
    foreach (double error in errors)
    {
        LL += errorDist.LogPDF(error);
    }

    // Penalize for the number of parameters and return
    double n = (double) timeSeriesData.Count;
    return (LL - model.ParameterCount * Math.Log(n) / n);
}
```