ids defaults to finished jobs (as reported by batchtools::findDone()). If a job threw an error, is expired, or is still running, it will be ignored with this default. Simply leaving these jobs out of an analysis is not statistically sound. Instead, try to robustify your jobs by using a fallback learner (c.f. mlr3::Learner).
reduceResultsBatchmark(
  ids = NULL,
  store_backends = TRUE,
  reg = batchtools::getDefaultRegistry()
)
A data.frame (or data.table) with a column named "job.id". Alternatively, you may also pass a vector of integerish job ids. If not set, defaults to the return value of batchtools::findDone(). Invalid ids are ignored.
Keep the DataBackend of the Task in the ResampleResult? Set to TRUE if your performance measures require a Task or if you want to analyse results more conveniently; set to FALSE to reduce the file size and memory footprint. The current default is TRUE, but this will eventually be changed in a future release.
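As a sketch of typical usage, assuming a batchtools registry populated by an earlier mlr3batchmark::batchmark() experiment (the registry path "registry_dir" below is a placeholder, not part of this page):

```r
library(batchtools)
library(mlr3batchmark)

# Load the registry written by a previous batchmark() run
# ("registry_dir" is illustrative; use your own registry directory).
reg = loadRegistry("registry_dir", writeable = FALSE)

# Restrict the reduction to jobs that finished successfully,
# mirroring the default behaviour described above.
done = findDone(reg = reg)

# Reduce the finished jobs into a mlr3 BenchmarkResult, dropping the
# task backends to shrink the file size and memory footprint.
bmr = reduceResultsBatchmark(ids = done, store_backends = FALSE, reg = reg)
bmr$aggregate()
```

Note that with store_backends = FALSE, measures that need access to the Task (and convenience methods that re-materialise predictions against the data) will no longer work on the returned BenchmarkResult.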