
Feature Request: Out-Of-Bag Error should not require storing the model. #1158

Open
sebffischer (Member) opened this issue Sep 3, 2024 · 0 comments
It is currently necessary to store the $model of a learner in order to access its oob_error.
An example of this is the random forest:

library(mlr3verse)
rr = resample(tsk("iris"), lrn("classif.ranger"), rsmp("holdout"), store_models = TRUE)
rr$aggregate(msr("oob_error"))

The code above will error if we don't specify store_models = TRUE, because the out-of-bag error active binding extracts this information from learner$model, which fails if the model is not stored.
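For illustration, the failing variant looks like this (the exact error message depends on the mlr3 version):

# store_models defaults to FALSE, so the fitted ranger models are discarded
# after resampling and learner$model is NULL.
rr = resample(tsk("iris"), lrn("classif.ranger"), rsmp("holdout"))

# This errors, because the oob_error active binding tries to read the missing model.
rr$aggregate(msr("oob_error"))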

Instead of having the out-of-bag error active binding access the learner's $model, we could support a private method $.extract_oob_error() that adds the out-of-bag error to the learner's $state, which is accessible even when store_models = FALSE.
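For the ranger-based learners, such a method could be a thin wrapper around what ranger already computes (a hypothetical sketch; the method name mirrors the existing extractors, and prediction.error is the OOB error that ranger stores on the fitted model):

# Hypothetical private method for e.g. LearnerClassifRanger:
.extract_oob_error = function() {
  # ranger keeps the OOB prediction error on the fitted model object
  self$model$prediction.error
}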
This is already implemented for internal validation scores and internal tuned values:

mlr3/R/worker.R, lines 94 to 100 in 5ffcfee:

if (!is.null(validate)) {
  learner$state$internal_valid_scores = get_private(learner)$.extract_internal_valid_scores()
  learner$state$internal_valid_task_hash = task$internal_valid_task$hash
}
if (exists(".extract_internal_tuned_values", get_private(learner))) {
  learner$state$internal_tuned_values = get_private(learner)$.extract_internal_tuned_values()
}
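An analogous hook for the OOB error could follow the same pattern (a hypothetical sketch; the field name learner$state$oob_error is an assumption):

# Hypothetical addition next to the existing extractors in mlr3/R/worker.R:
if (exists(".extract_oob_error", get_private(learner))) {
  learner$state$oob_error = get_private(learner)$.extract_oob_error()
}

The oob_error measure could then read learner$state$oob_error instead of going through learner$model.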
