facet.explanation.parallel.ExplainerJob#

class facet.explanation.parallel.ExplainerJob(explainer, X, y=None, *, interactions, **kwargs)[source]#

A call to an explanation function with given X and y values.

Bases

Job[Union[ndarray[Any, dtype[Any]], List[ndarray[Any, dtype[Any]]]]]

Metaclasses

ABCMeta

Parameters
  • explainer (BaseExplainer) – the SHAP explainer to use

  • X (Union[ndarray[Any, dtype[Any]], DataFrame, Pool]) – the feature values of the observations to be explained

  • y (Union[ndarray[Any, dtype[Any]], Series, None]) – the target values of the observations to be explained

  • interactions (bool) – if False, calculate SHAP values; if True, calculate SHAP interaction values

  • kwargs (Any) – additional arguments specific to the explanation method
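
The class follows the generic Job pattern from pytools.parallelization: the constructor captures the arguments of an explainer call, and run() executes that call later (e.g. on a parallel worker). As a minimal self-contained sketch of this pattern, not using the actual facet, pytools, or shap APIs (SimpleJob and toy_explain are hypothetical stand-ins):

```python
from typing import Any, Callable

import numpy as np


class SimpleJob:
    """Hypothetical stand-in for a Job: captures a call, defers execution."""

    def __init__(self, function: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        self.function = function
        self.args = args
        self.kwargs = kwargs

    def run(self) -> Any:
        # execute the captured call and return its result
        return self.function(*self.args, **self.kwargs)


def toy_explain(X: np.ndarray, interactions: bool = False) -> np.ndarray:
    # toy "explanation": one value per feature value (not real SHAP values)
    return X * 0.5


X = np.arange(6.0).reshape(3, 2)
job = SimpleJob(toy_explain, X, interactions=False)  # nothing computed yet
result = job.run()  # a (3, 2) array, like per-observation attributions
```

In the real class, the captured call is the explainer's SHAP (or SHAP interaction) computation over X and y, and the return type matches run() below: a single array, or a list of arrays for multi-output models.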

Method summary

delayed

See pytools.parallelization.Job.delayed()

run

Run this job.

Attribute summary

explainer

the SHAP explainer to use

interactions

if False, calculate SHAP values; if True, calculate SHAP interaction values

X

the feature values of the observations to be explained

y

the target values of the observations to be explained

kwargs

additional arguments specific to the explanation method

Definitions

classmethod delayed(function)#

See pytools.parallelization.Job.delayed()
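
The delayed() pattern referenced here wraps a plain function so that calling the wrapper produces a job object instead of running the function immediately. A hedged sketch of that idea, with an illustrative SimpleJob class that is not the actual pytools implementation:

```python
from typing import Any, Callable


class SimpleJob:
    """Illustrative Job stand-in; not the actual pytools class."""

    def __init__(self, function: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        self.function = function
        self.args = args
        self.kwargs = kwargs

    def run(self) -> Any:
        # execute the captured call
        return self.function(*self.args, **self.kwargs)

    @classmethod
    def delayed(cls, function: Callable[..., Any]) -> Callable[..., "SimpleJob"]:
        # calling the returned wrapper creates a job rather than executing
        def make_job(*args: Any, **kwargs: Any) -> "SimpleJob":
            return cls(function, *args, **kwargs)

        return make_job


add_job = SimpleJob.delayed(lambda a, b: a + b)
job = add_job(2, 3)  # nothing computed yet
result = job.run()   # now the call executes
```

This mirrors the joblib-style delayed idiom: deferral lets a parallel runner collect many jobs first and schedule their execution across workers.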

run()[source]#

Run this job.

Return type

Union[ndarray[Any, dtype[Any]], List[ndarray[Any, dtype[Any]]]]

Returns

the result produced by the job

X: Union[numpy.ndarray[Any, numpy.dtype[Any]], pandas.DataFrame, facet.explanation.Pool]#

the feature values of the observations to be explained

explainer: facet.explanation.base.BaseExplainer#

the SHAP explainer to use

interactions: bool#

if False, calculate SHAP values; if True, calculate SHAP interaction values

kwargs: Dict[str, Any]#

additional arguments specific to the explanation method

y: Optional[Union[numpy.ndarray[Any, numpy.dtype[Any]], pandas.Series]]#

the target values of the observations to be explained