Time Series based Local Black Box Explainers

Time Series Individual Conditional Expectation (TSICE) Explainer

class aix360.algorithms.tsice.tsice.TSICEExplainer(forecaster: Callable, input_length: int, forecast_lookahead: int, n_variables: int = 1, n_exogs: int = 0, n_perturbations: int = 25, features_to_analyze: List[str] = None, perturbers: List[Union[aix360.algorithms.tsutils.tsperturbers.tsperturber.TSPerturber, dict]] = None, explanation_window_start: int = None, explanation_window_length: int = 10)

TSICEExplainer extends the Individual Conditional Expectation (ICE) technique to correlated, higher-dimensional time series data. It uses TSFeatures to derive structural features of the time series, and a data perturber (TSPerturber) to generate simulated data. TSICEExplainer explains the trend in the model forecast as a function of the derived time series features.

References

[1] Goldstein et al., ‘Peeking Inside the Black Box: Visualizing Statistical Learning with Plots of Individual Conditional Expectation’

Initializer for TSICEExplainer

Parameters:
  • forecaster (Callable) – Callable object that produces a forecast as a numpy array for a given numpy array input.
  • input_length (int) – Input length for the forecaster.
  • forecast_lookahead (int) – Lookahead length of the forecaster prediction.
  • n_variables (int) – Number of variables in the forecaster input. Defaults to 1.
  • n_exogs (int) – Number of exogenous variables required by the forecaster. Defaults to 0.
  • n_perturbations (int) – Number of perturbed instances for TSExplanation. Defaults to 25.
  • features_to_analyze (List[str]) – List of derived features used to analyze the perturbed time series during TSICE explanation. Because the perturbed time series themselves are hard to inspect directly, these features summarize the perturbations for closer observation. Allowed values are “median”, “mean”, “min”, “max”, “std”, “range”, “intercept”, “trend”, “rsquared”, “max_variation”. If None, “mean” is used by default. Defaults to None.
  • perturbers (List[TSPerturber, dict]) – Data perturbation algorithms specified as TSPerturber instances or dicts. Allowed values for the “type” key in the dictionary are block-bootstrap, frequency, moving-average, and shift. Block-bootstrap splits the time series into contiguous chunks called blocks; for each block, noise is estimated, then exchanged and added to the signal between randomly selected blocks. Moving-average perturbation maintains the moving mean of the time series data with the specified window length, but adds perturbed noise with a distribution similar to the data. The frequency perturber performs an FFT on the noise and removes random high-frequency components from the noise estimates; the number of frequencies to remove is specified by the truncate_frequencies argument. The shift perturber adds random upward or downward shifts to the data value over time-contiguous blocks. If not provided, the default perturber is a combination of block-bootstrap, moving-average, and frequency. Defaults to None.
  • explanation_window_start (int) – The explanation window is selected from the input time series starting at explanation_window_start. This window selects the part of the time series used for TSICE analysis; perturbations are computed over this explanation window. If explanation_window_start is None, the explanation window is taken from the most recent explanation_window_length observations. Defaults to None.
  • explanation_window_length (int) – An explanation window of length explanation_window_length is selected from the input time series. This window selects the part of the time series used for TSICE analysis; perturbations are computed over this explanation window. Defaults to 10.
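The block-bootstrap perturbation described above can be sketched in a few lines of numpy. This is a simplified illustration, not the library's TSPerturber implementation; the per-block noise estimate (residual around each block's mean) is an assumption made for the sketch:

```python
import numpy as np

def block_bootstrap_perturb(x, block_length=10, rng=None):
    """Simplified block-bootstrap perturbation sketch.

    Splits the series into contiguous blocks, estimates per-block noise
    as the residual around each block's mean, and exchanges that noise
    between randomly selected blocks.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // block_length
    blocks = x[: n_blocks * block_length].reshape(n_blocks, block_length)
    level = blocks.mean(axis=1, keepdims=True)   # smooth part of each block
    noise = blocks - level                       # estimated per-block noise
    perm = rng.permutation(n_blocks)             # exchange noise between blocks
    out = x.copy()
    out[: n_blocks * block_length] = (level + noise[perm]).ravel()
    return out

series = np.sin(np.linspace(0, 8 * np.pi, 100)) \
    + 0.1 * np.random.default_rng(0).normal(size=100)
perturbed = block_bootstrap_perturb(series, block_length=10, rng=1)
print(perturbed.shape)  # same length as the input series
```

Because only noise is exchanged between blocks, the perturbed series keeps the block-level structure of the signal while varying its fine-grained behavior.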
explain_instance(ts: aix360.algorithms.tsutils.tsframe.tsFrame, ts_related: aix360.algorithms.tsutils.tsframe.tsFrame = None, **explain_params)

Explain the forecast made by the forecaster at a certain point in time (local explanation).

Parameters:
  • ts (tsFrame) – The time series tsFrame to use for forecasting, extending forward from the end of the training data for the requested number of periods. This can be generated using aix360.algorithms.tsutils.tsframe.tsFrame. A tsFrame is a pandas DataFrame indexed by Timestamp objects (that is, a DatetimeIndex). Each column corresponds to a target to forecast.
  • ts_related (tsFrame, optional) – The related time series tsFrame containing the external regressors. A tsFrame is a pandas DataFrame indexed by Timestamp objects (that is, a DatetimeIndex). Each column corresponds to a related external regressor. Defaults to None.
  • explain_params – Arbitrary explainer parameters.
Returns:

explanation object

Dictionary with data_x, feature_names, feature_values, signed_impact, total_impact, current_forecast, current_feature_values, perturbations and forecasts_on_perturbations.

Return type:

dict
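The forecaster contract and tsFrame input expected above can be illustrated with a toy seasonal-mean forecaster. The function below is a stand-in, and the commented-out explainer calls merely restate the constructor and explain_instance signatures documented above; they are not executed here:

```python
import numpy as np
import pandas as pd

# A toy forecaster: predicts the mean of the last 8 observations for
# each lookahead step. TSICEExplainer expects a callable mapping a
# numpy input window to a numpy forecast.
def forecaster(x):
    x = np.asarray(x)
    lookahead = 4
    return np.full(lookahead, x[-8:].mean())

# A tsFrame is a pandas DataFrame with a DatetimeIndex.
index = pd.date_range("2023-01-01", periods=48, freq="D")
ts = pd.DataFrame({"y": np.sin(np.arange(48) / 4.0)}, index=index)

window = ts["y"].to_numpy()[-24:]   # input_length = 24
forecast = forecaster(window)
print(forecast.shape)               # (4,) == forecast_lookahead

# With AIX360 installed, the explainer would be used roughly as:
# from aix360.algorithms.tsice.tsice import TSICEExplainer
# explainer = TSICEExplainer(forecaster=forecaster, input_length=24,
#                            forecast_lookahead=4)
# explanation = explainer.explain_instance(ts)  # dict with data_x, ...
```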

set_params(*argv, **kwargs)

Set parameters for the explainer.

Time Series Saliency (TSSaliency) Explainer

class aix360.algorithms.tssaliency.tssaliency.TSSaliencyExplainer(model: Callable, input_length: int, feature_names: List[str], base_value: List[float] = None, n_samples: int = 50, gradient_samples: int = 25, gradient_function: Callable = None, random_seed: int = 22)

Time Series Saliency (TSSaliency) Explainer is a model-agnostic saliency explainer for time series tasks. TSSaliency supports univariate and multivariate use cases. It explains the temporal importance of different variates for the model prediction. TSSaliency incorporates an integrated-gradient method for saliency estimation. The saliency measure involves the notion of a base value: for example, the base value can be a constant signal with the average value. The saliency measure is computed by integrating the model sensitivity over a trajectory from the base value to the time series signal. The TSSaliency explainer provides variate-wise contributions to the model prediction at a temporal resolution.
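The path integration described above can be sketched with numpy. This is a simplified Riemann-sum version using finite-difference gradients, not the explainer's Monte Carlo gradient estimator; the sanity check uses the completeness property of integrated gradients on a linear model:

```python
import numpy as np

def integrated_gradient(model, x, base, n_steps=50, eps=1e-4):
    """Approximate integrated gradients of a scalar-valued model.

    Integrates finite-difference gradients along the straight path
    from the base value to the input signal x.
    """
    x, base = np.asarray(x, float), np.asarray(base, float)
    total = np.zeros_like(x)
    for alpha in np.linspace(0.0, 1.0, n_steps):
        point = base + alpha * (x - base)
        grad = np.zeros_like(x)
        for i in range(x.size):  # central finite-difference gradient
            bump = np.zeros_like(x)
            bump.flat[i] = eps
            grad.flat[i] = (model(point + bump) - model(point - bump)) / (2 * eps)
        total += grad
    return (x - base) * total / n_steps  # saliency per time point

# For a linear model f(x) = w . x, integrated gradients recover
# w * (x - base) exactly.
w = np.array([0.5, -1.0, 2.0, 0.0])
model = lambda x: float(np.dot(w, x))
x = np.array([1.0, 2.0, -1.0, 3.0])
base = np.zeros_like(x)
saliency = integrated_gradient(model, x, base)
print(np.allclose(saliency, w * x, atol=1e-3))  # True
```

The saliency values sum (approximately) to the difference between the model prediction at the input and at the base value, which is what makes the base value choice meaningful.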

References

[2] Mukund Sundararajan et al., “Axiomatic Attribution for Deep Networks”

Initializer for TSSaliencyExplainer

Parameters:
  • model (Callable) – Model prediction (predict/predict_proba) function that returns a real value such as a probability or a regressed value. This function must accept a numpy array of shape (input_length x len(feature_names)) as input and return a numpy array of shape (1, -1).
  • input_length (int) – length of history window used in model training.
  • feature_names (List[str]) – list of feature names in the input data.
  • base_value (List[float]) – Base value to be used in the saliency computation. The computed gradients are with respect to this base value. If None, the mean value is used. Defaults to None.
  • n_samples (int) – number of path samples to be created for each input instance while computing saliency metric. Defaults to 50.
  • gradient_samples (int) – number of time series samples to be generated while computing the integrated gradient on the input data. Defaults to 25.
  • gradient_function (Callable) – gradient function to be used in saliency (integrated gradient) computation. If None, mc_gradient_compute is used. Defaults to None.
  • random_seed (int) – random seed to get consistent results. Refer to numpy random state. Defaults to 22.
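The model callable contract above (input of shape (input_length, len(feature_names)), output of shape (1, -1)) can be illustrated with a placeholder function; the feature names and the mean-based score are purely illustrative:

```python
import numpy as np

feature_names = ["temp", "load"]   # illustrative variate names
input_length = 16

def model(x):
    # x: numpy array of shape (input_length, len(feature_names))
    x = np.asarray(x)
    assert x.shape == (input_length, len(feature_names))
    score = x.mean()               # toy real-valued prediction
    return np.array([[score]])     # shape (1, -1) as required

x = np.ones((input_length, len(feature_names)))
print(model(x).shape)  # (1, 1)
```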
explain_instance(ts: aix360.algorithms.tsutils.tsframe.tsFrame, **explain_params)

Explain the prediction made by the time series model at a certain point in time (local explanation).

Parameters:
  • ts (tsFrame) – Input time series signal in tsFrame format. This can be generated using aix360.algorithms.tsutils.tsframe.tsFrame. A tsFrame is a pandas DataFrame indexed by Timestamp objects (that is, a DatetimeIndex). Each column corresponds to an input feature.
  • explain_params – Arbitrary explainer parameters.
Returns:

explanation object

Dictionary with input_data, saliency, feature_names, timestamps, base_value, instance_prediction, base_value_prediction.

Return type:

dict

get_params(*argv, **kwargs) → dict

Get parameters for the explainer.

set_params(*argv, **kwargs)

Set parameters for the explainer.

Time Series Local Interpretable Model-agnostic Explainer (TSLime)

class aix360.algorithms.tslime.tslime.TSLimeExplainer(model: Callable, input_length: int, n_perturbations: int = 2000, relevant_history: int = None, perturbers: List[Union[aix360.algorithms.tsutils.tsperturbers.tsperturber.TSPerturber, dict]] = None, local_interpretable_model: aix360.algorithms.tslime.surrogate.LinearSurrogateModel = None, random_seed: int = None)

Time Series Local Interpretable Model-agnostic Explainer (TSLime) is a model-agnostic local time series explainer. LIME (Local Interpretable Model-agnostic Explanations) is a popular algorithm for local explanation. LIME explains the model behavior by approximating the model response with linear models. The LIME algorithm specifically assumes a tabular data format, where each row is a data point and columns are features. A generalization of the LIME algorithm for image data uses superpixel-based perturbation. TSLime generalizes the LIME algorithm to the time series context.

TSLime uses time series perturbation methods to produce local input perturbations, and a linear surrogate model that best approximates the model response in that neighborhood. TSLime produces an interpretable explanation: the explanation weights it computes indicate the model's local sensitivity.
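The perturb-then-fit procedure can be sketched with numpy. This is a bare-bones illustration under the assumption of Gaussian perturbations and an ordinary least-squares surrogate; the library's perturbers and surrogate classes are richer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Black-box model over a window of 8 time points: a fixed linear
# response, so the surrogate weights we recover should match it.
true_w = np.array([0.0, 0.0, 0.5, -0.2, 0.0, 1.0, 0.3, 0.0])
black_box = lambda window: window @ true_w

x = rng.normal(size=8)                    # the instance to explain

# 1. Perturb the input locally.
X = x + 0.1 * rng.normal(size=(2000, 8))  # n_perturbations = 2000
# 2. Query the black-box model on the perturbations.
y = np.array([black_box(row) for row in X])
# 3. Fit a linear surrogate; its coefficients are the per-time-point
#    local sensitivity weights.
Xb = np.column_stack([X, np.ones(len(X))])   # add intercept column
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
history_weights = coef[:-1]

print(np.allclose(history_weights, true_w, atol=1e-6))  # True
```

For a genuinely nonlinear black box the recovered weights describe only the model's behavior in the neighborhood of x, which is exactly the "local" in local explanation.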

References

[3] Ribeiro et al., ‘“Why Should I Trust You?”: Explaining the Predictions of Any Classifier’

Initializer for TSLimeExplainer

Parameters:
  • model (Callable) – Callable object that produces a prediction as a numpy array for a given numpy array input.
  • input_length (int) – Input (history) length used for input model.
  • n_perturbations (int) – Number of perturbed instances for TSExplanation. Defaults to 2000.
  • relevant_history (int) – Window size of interest for the explanation. The explanation is computed for the latest window of length relevant_history. For example, if input_length=20 and relevant_history=10, the explanation is computed for the last 10 time points. If None, relevant_history is set to input_length. Defaults to None.
  • perturbers (List[TSPerturber, dict]) – Data perturbation algorithms specified as TSPerturber instances or dicts. Allowed values for the “type” key in the dictionary are block-bootstrap, frequency, moving-average, and shift. Block-bootstrap splits the time series into contiguous chunks called blocks; for each block, noise is estimated, then exchanged and added to the signal between randomly selected blocks. Moving-average perturbation maintains the moving mean of the time series data with the specified window length, but adds perturbed noise with a distribution similar to the data. The frequency perturber performs an FFT on the noise and removes random high-frequency components from the noise estimates; the number of frequencies to remove is specified by the truncate_frequencies argument. The shift perturber adds random upward or downward shifts to the data value over time-contiguous blocks. If not provided, the default perturber is block-bootstrap. Defaults to None.
  • local_interpretable_model (LinearSurrogateModel) – Local interpretable model, a surrogate to be trained on the neighborhood of the given input time series. This model provides local weights for each time point in the selected time series. If None, a scikit-learn linear regression surrogate, aix360.algorithms.tslime.surrogate.LinearRegressionSurrogate, is used. Defaults to None.
  • random_seed (int) – random seed to get consistent results. Refer to numpy random state. Defaults to None.
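Dict-style perturber specifications might look as follows. Only the “type” key and the truncate_frequencies argument are documented above; any other option names would depend on the individual perturber implementations and are not shown here:

```python
# Hypothetical perturber specifications for TSLimeExplainer.
# "type" selects the perturbation algorithm; truncate_frequencies is
# the documented argument of the frequency perturber.
perturbers = [
    {"type": "block-bootstrap"},
    {"type": "moving-average"},
    {"type": "frequency", "truncate_frequencies": 4},
    {"type": "shift"},
]
print([p["type"] for p in perturbers])
```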
explain_instance(ts: aix360.algorithms.tsutils.tsframe.tsFrame, **explain_params)

Explain the prediction made by the time series model at a certain point in time (local explanation).

Parameters:
  • ts (tsFrame) – Input time series signal in tsFrame format. This can be generated using aix360.algorithms.tsutils.tsframe.tsFrame. A tsFrame is a pandas DataFrame indexed by Timestamp objects (that is, a DatetimeIndex). Each column corresponds to an input feature.
  • explain_params – Arbitrary explainer parameters.
Returns:

explanation object

Dictionary with keys: input_data, history_weights, model_prediction, surrogate_prediction, x_perturbations, y_perturbations.

Return type:

dict

set_params(*argv, **kwargs)

Set parameters for the explainer.