Interface to different implementations of the LIME method. Information on how the LIME method works is available at https://ema.drwhy.ai/LIME.html.

predict_surrogate(explainer, new_observation, ..., type = "localModel")

predict_surrogate_local_model(
  explainer,
  new_observation,
  size = 1000,
  seed = 1313,
  ...
)

predict_model.dalex_explainer(x, newdata, ...)

model_type.dalex_explainer(x, ...)

predict_surrogate_lime(
  explainer,
  new_observation,
  n_features = 4,
  n_permutations = 1000,
  labels = unique(explainer$y)[1],
  ...
)

# S3 method for predict_surrogate_lime
plot(x, ...)

predict_surrogate_iml(explainer, new_observation, k = 4, ...)

Arguments

explainer

a model to be explained, preprocessed by the 'explain' function

new_observation

a new observation for which predictions need to be explained

...

other parameters that will be passed to the selected implementation

type

which implementation of the LIME method should be used. Either localModel (default), lime or iml.

size

will be passed to the localModel implementation, by default 1000

seed

seed for random number generator, by default 1313

x

an object to be plotted

newdata

alias for new_observation

n_features

will be passed to the lime implementation, by default 4

n_permutations

will be passed to the lime implementation, by default 1000

labels

will be passed to the lime implementation, by default the first unique value of the y vector

k

will be passed to the iml implementation, by default 4

Value

Depending on the type argument, the resulting object is of a different class.

References

Explanatory Model Analysis. Explore, Explain and Examine Predictive Models. https://ema.drwhy.ai/
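
Examples

A minimal sketch of how these functions might be called, assuming an explainer built with DALEX::explain() on the titanic_imputed data shipped with DALEX; the glm model and the column index of survived are illustrative assumptions, not part of this page.

library("DALEX")
library("DALEXtra")

# fit a simple classification model (illustrative choice)
model <- glm(survived ~ ., data = titanic_imputed, family = "binomial")

# wrap the model with explain() so predict_surrogate() can use it
explainer <- explain(model,
                     data = titanic_imputed[, -8],
                     y = titanic_imputed$survived)

# localModel implementation (the default type)
loc <- predict_surrogate(explainer,
                         new_observation = titanic_imputed[1, -8],
                         type = "localModel")
plot(loc)

# lime implementation, selected with the type argument
lim <- predict_surrogate(explainer,
                         new_observation = titanic_imputed[1, -8],
                         n_features = 4,
                         type = "lime")
plot(lim)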