omnixai.explainers.timeseries.agnostic package

omnixai.explainers.timeseries.agnostic.shap module

The SHAP explainer for time series tasks.

class omnixai.explainers.timeseries.agnostic.shap.ShapTimeseries(training_data, predict_function, mode='anomaly_detection', **kwargs)

Bases: ExplainerBase

The SHAP explainer for time series forecasting and anomaly detection. If using this explainer, please cite the original work: https://github.com/slundberg/shap.

Parameters
  • training_data (Timeseries) – The data used to initialize the explainer.

  • predict_function (Callable) – The prediction function corresponding to the model to explain. The input of predict_function is a Timeseries instance. The output is the anomaly score (higher scores imply more anomalous) for anomaly detection, or the predicted value for forecasting.

  • mode (str) – The task type, either anomaly_detection or forecasting; see the construction sketch below.

explanation_type = 'local'
alias = ['shap']
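
The following is a minimal construction sketch, not taken from the library's documentation: the toy data, the column name "metric", and the stand-in scoring function are hypothetical, and only Timeseries.from_pd and the ShapTimeseries constructor described above are assumed from the API.

    import numpy as np
    import pandas as pd
    from omnixai.data.timeseries import Timeseries
    from omnixai.explainers.timeseries.agnostic.shap import ShapTimeseries

    # Toy training series; the column name "metric" is arbitrary.
    df = pd.DataFrame(
        {"metric": np.random.randn(200)},
        index=pd.date_range("2022-01-01", periods=200, freq="H"),
    )
    training_data = Timeseries.from_pd(df)

    # Stand-in scoring function. A real application would wrap a trained
    # detector here; the contract is a Timeseries in, one anomaly score per
    # instance out, with higher scores meaning more anomalous.
    def predict_function(ts):
        x = ts.to_pd().values
        return np.array([np.abs(x).mean()])

    explainer = ShapTimeseries(
        training_data=training_data,
        predict_function=predict_function,
        mode="anomaly_detection",
    )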
explain(X, **kwargs)

Generates the feature-importance explanations for the input instances.

Parameters
  • X (Timeseries) – An instance of Timeseries representing one input instance or a batch of input instances.

  • kwargs – Additional parameters for shap.KernelExplainer.shap_values, e.g., "nsamples" for the number of times to re-evaluate the model when explaining each prediction; see the example after the Returns section.

Return type

FeatureImportance

Returns

The feature-importance explanations for all the input instances.
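
Continuing the construction sketch above, a hedged example of calling explain; the 24-point test window and the nsamples value are arbitrary illustrative choices.

    # Explain the most recent window; "nsamples" is forwarded to
    # shap.KernelExplainer.shap_values and controls how many model
    # re-evaluations are used per explained prediction.
    test_x = Timeseries.from_pd(df.iloc[-24:])
    explanations = explainer.explain(test_x, nsamples=100)

    # The returned FeatureImportance object can be plotted, e.g. in a
    # notebook via explanations.ipython_plot().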