Calculate Loss Functions

loss_cross_entropy(observed, predicted, p_min = 1e-04, na.rm = TRUE)

loss_sum_of_squares(observed, predicted, na.rm = TRUE)

loss_root_mean_square(observed, predicted, na.rm = TRUE)

loss_accuracy(observed, predicted, na.rm = TRUE)

loss_one_minus_accuracy(observed, predicted, cutoff = 0.5, na.rm = TRUE)

get_loss_one_minus_accuracy(cutoff = 0.5, na.rm = TRUE)

loss_one_minus_auc(observed, predicted)

get_loss_default(x)

loss_default(x)
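
These losses follow the standard definitions. As a rough sketch of what they compute (not the package implementation, which additionally handles matrix input, factors and na.rm), assuming observed is numeric and, for cross entropy, an integer class index into the columns of a probability matrix:

# sketch only, not the package code
sum_of_squares     <- function(observed, predicted) sum((observed - predicted)^2)
root_mean_square   <- function(observed, predicted) sqrt(mean((observed - predicted)^2))
one_minus_accuracy <- function(observed, predicted, cutoff = 0.5)
  1 - mean(observed == as.numeric(predicted > cutoff))
cross_entropy <- function(observed, predicted, p_min = 1e-4) {
  p <- pmax(predicted, p_min)                          # clamp probabilities away from zero
  -sum(log(p[cbind(seq_along(observed), observed)]))   # log-probability of the true class
}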

Arguments

observed

observed scores or labels; these are supplied as the explainer-specific y

predicted

predicted scores, either a vector or a matrix; these are returned by the model-specific predict_function()

p_min

for cross entropy, the minimal allowed probability; predicted probabilities are clipped from below so that the logarithm does not explode

na.rm

logical, should missing values be removed?

cutoff

classification threshold for the accuracy loss functions

x

either an explainer or the type of the model, one of "regression", "classification", "multiclass".
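
get_loss_default() and loss_default() pick a loss based on x; a sketch of the intended call, where the returned object is expected to be one of the loss functions listed above (the exact type-to-loss mapping is determined by the package):

# sketch only: select the default loss for a regression model
loss_default("regression")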

Value

numeric - value of the loss function

Examples

 # \donttest{
library("DALEX")   # provides titanic_imputed, HR and yhat()
library("ranger")
titanic_ranger_model <- ranger(survived~., data = titanic_imputed, num.trees = 50,
                               probability = TRUE)
loss_one_minus_auc(titanic_imputed$survived, yhat(titanic_ranger_model, titanic_imputed))
#> [1] 0.1036886

HR_ranger_model_multi <- ranger(status~., data = HR, num.trees = 50, probability = TRUE)
loss_cross_entropy(as.numeric(HR$status), yhat(HR_ranger_model_multi, HR))
#> [1] 2979.306

 # }
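
The cutoff-based losses can also be evaluated at a non-default threshold, reusing the model fitted above (a sketch, output not shown):

# one-minus-accuracy at a custom classification threshold (sketch, output not shown)
loss_one_minus_accuracy(titanic_imputed$survived,
                        yhat(titanic_ranger_model, titanic_imputed), cutoff = 0.3)
# get_loss_one_minus_accuracy(cutoff = 0.3) is expected to return the same loss
# with the threshold fixed, ready to be passed where a loss function is required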