The TunerSuccessiveHalving class implements the successive halving algorithm (SHA). SHA randomly samples n candidate hyperparameter configurations and allocates a minimum budget (r_min) to each of them. The candidates are raced down in stages to a single best candidate by repeatedly increasing the budget by a factor of eta and promoting only the best 1 / eta fraction to the next stage. Promising hyperparameter configurations are thus allocated a larger overall budget, while poorly performing ones are discarded early on.

The budget hyperparameter must be tagged with "budget" in the search space. The minimum budget (r_min), which is allocated in the base stage, is set by the lower bound of the budget parameter. The upper bound defines the maximum budget (r_max), which is allocated to the candidates in the last stage. The number of stages is computed so that each candidate in the base stage is allocated the minimum budget and the candidates in the last stage are not evaluated on more than the maximum budget. The following table shows the stage layout for eta = 2, r_min = 1 and r_max = 8.

 i  n_i  r_i
 0    8    1
 1    4    2
 2    2    4
 3    1    8

Here, i is the stage number, n_i is the number of candidate configurations in stage i and r_i is the budget allocated to a single candidate.
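
The stage layout follows directly from these quantities. A short sketch of the computation in plain R (the variable names are illustrative and not part of the package API):

eta = 2; r_min = 1; r_max = 8; n = 8

# number of stages k such that the budget of the last stage
# does not exceed r_max; the small epsilon guards against
# floating point error in the logarithm
k = floor(log(r_max / r_min, base = eta) + 1e-8)

data.frame(
  i = 0:k,                     # stage index
  n_i = floor(n / eta^(0:k)),  # candidates evaluated in stage i
  r_i = r_min * eta^(0:k)      # budget per candidate in stage i
)
#>   i n_i r_i
#> 1 0   8   1
#> 2 1   4   2
#> 3 2   2   4
#> 4 3   1   8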

Source

Jamieson K, Talwalkar A (2016). “Non-stochastic Best Arm Identification and Hyperparameter Optimization.” In Gretton A, Robert CC (eds.), Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, volume 51 of Proceedings of Machine Learning Research, 240-248. http://proceedings.mlr.press/v51/jamieson16.html.

Subsample Budget

If the learner lacks a natural budget parameter, mlr3pipelines::PipeOpSubsample can be applied to use the subsampling fraction as the budget parameter. The resulting mlr3pipelines::GraphLearner is fitted on small proportions of the mlr3::Task in the first stage, and on the complete task in the last stage.
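
A minimal sketch of this setup (the rpart learner and its cp parameter are only illustrative choices):

library(mlr3)
library(mlr3pipelines)
library(paradox)

# wrap a learner without a natural budget parameter
# in a subsampling pipeline
graph_learner = as_learner(po("subsample") %>>% lrn("classif.rpart"))

# tag the subsampling fraction as the budget parameter
search_space = ps(
  subsample.frac = p_dbl(lower = 0.1, upper = 1, tags = "budget"),
  classif.rpart.cp = p_dbl(lower = 0.001, upper = 0.1)
)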

Parameters

n

integer(1)
Number of candidates in the base stage.

eta

numeric(1)
With every stage, the budget is increased by a factor of eta and only the best 1 / eta fraction of the candidates is promoted to the next stage. Non-integer values are supported, but eta must be greater than 1.

sampler

paradox::Sampler
Object defining how the samples of the parameter space should be drawn. The default is uniform sampling.

repeats

logical(1)
If FALSE (default), SHA terminates once all stages are evaluated. Otherwise, SHA starts over again once the last stage is evaluated.

adjust_minimum_budget

logical(1)
If TRUE, the minimum budget is increased so that the last stage uses the maximum budget defined in the search space.

Archive

The mlr3tuning::ArchiveTuning holds the following additional columns that are specific to the successive halving algorithm:

  • stage (integer(1))
    Stage index. Starts counting at 0.

  • repetition (integer(1))
    Repetition index. Starts counting at 1.
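
These columns can be inspected after tuning, for example (a sketch, assuming a tuning instance as created in the examples section):

# stage and repetition of each evaluated configuration
data.table::as.data.table(instance$archive)[, c("stage", "repetition", "classif.ce")]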

Custom Sampler

TunerSuccessiveHalving supports a custom paradox::Sampler object for drawing the initial candidate configurations. A custom sampler may look like this (the full example is given in the examples section):

library(paradox)

# sampler for the second and third search space parameter:
# - beta distribution with alpha = 2 and beta = 5
# - categorical distribution with custom probabilities
# `params` is the list of parameters of the search space
sampler = SamplerJointIndep$new(list(
  Sampler1DRfun$new(params[[2]], function(n) rbeta(n, 2, 5)),
  Sampler1DCateg$new(params[[3]], prob = c(0.2, 0.3, 0.5))
))

Progress Bars

$optimize() supports progress bars via the progressr package combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the progress package as the backend; enable it with progressr::handlers("progress").
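
For example (a sketch, assuming a tuner and a tuning instance have already been constructed):

library(progressr)

# enable progress bars with the progress package as backend
handlers("progress")
with_progress(tuner$optimize(instance))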

Parallelization

The hyperparameter configurations of one stage are evaluated in parallel with the future package. To select a parallel backend, use future::plan().
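
For example, to evaluate the configurations of a stage in parallel on the local machine:

# run evaluations in separate background R sessions
future::plan("multisession")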

Logging

TunerSuccessiveHalving uses a logger (as implemented in lgr) from the bbotk package. Use lgr::get_logger("bbotk") to access and control the logger.
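
For example, to reduce the verbosity of the log:

# suppress logging of single evaluations
lgr::get_logger("bbotk")$set_threshold("warn")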

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerSuccessiveHalving

Methods

Method new()

Creates a new instance of this R6 class.
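
Usage

TunerSuccessiveHalving$new()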


Method clone()

The objects of this class are cloneable with this method.

Usage

TunerSuccessiveHalving$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

if(requireNamespace("xgboost")) {
  library(mlr3learners)

  # define hyperparameter and budget parameter
  search_space = ps(
    nrounds = p_int(lower = 1, upper = 16, tags = "budget"),
    eta = p_dbl(lower = 0, upper = 1),
    booster = p_fct(levels = c("gbtree", "gblinear", "dart"))
  )

  # hyperparameter tuning on the pima indians diabetes data set
  instance = tune(
    method = "successive_halving",
    task = tsk("pima"),
    learner = lrn("classif.xgboost", eval_metric = "logloss"),
    resampling = rsmp("cv", folds = 3),
    measures = msr("classif.ce"),
    search_space = search_space,
    term_evals = 100
  )

  # best performing hyperparameter configuration
  instance$result
}
#>    nrounds       eta booster learner_param_vals  x_domain classif.ce
#> 1:       8 0.3942863  gbtree          <list[6]> <list[3]>  0.2317708