
The OptimizerAsyncSuccessiveHalving class implements the Asynchronous Successive Halving Algorithm (ASHA). It is the asynchronous version of OptimizerBatchSuccessiveHalving.

Source

Li L, Jamieson K, Rostamizadeh A, Gonina E, Ben-tzur J, Hardt M, Recht B, Talwalkar A (2020). “A System for Massively Parallel Hyperparameter Tuning.” In Dhillon I, Papailiopoulos D, Sze V (eds.), Proceedings of Machine Learning and Systems, volume 2, 230–246. https://proceedings.mlsys.org/paper_files/paper/2020/hash/a06f20b349c6cf09a6b171c71b88bbfc-Abstract.html.

Dictionary

This bbotk::Optimizer can be instantiated via the dictionary bbotk::mlr_optimizers or with the associated sugar function bbotk::opt():

mlr_optimizers$get("async_successive_halving")
opt("async_successive_halving")

Parameters

eta

numeric(1)
With every stage, the budget is increased by a factor of eta and only the best 1 / eta of the configurations are promoted to the next stage. Non-integer values are supported, but eta must be greater than 1.

sampler

paradox::Sampler
Object defining how the samples of the parameter space should be drawn. The default is uniform sampling.
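
For example, these control parameters can be set directly when constructing the optimizer. A minimal sketch, assuming the optimizer is registered in the dictionary by mlr3hyperband (eta = 2 is only an illustrative value):

library(bbotk)
library(mlr3hyperband)

# promote the best half of the configurations at each stage
optimizer = opt("async_successive_halving", eta = 2)
optimizer$param_set$values$eta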

Archive

The bbotk::Archive holds the following additional columns that are specific to SHA:

  • stage (integer(1))
    Stage index. Starts counting at 0.

  • asha_id (character(1))
    Unique identifier for each configuration across stages.
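
After a run, these columns can be inspected alongside the evaluated configurations. A short sketch, assuming an already optimized instance named instance:

library(bbotk)
library(data.table)

# convert the archive of an optimized instance to a data.table
archive = as.data.table(instance$archive)

# stage and asha_id are appended to the usual archive columns
archive[, .(asha_id, stage)]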

Custom Sampler

The optimizer supports a custom paradox::Sampler object for drawing the initial configurations. A custom sampler may look like this (the full example is given in the examples section):

library(paradox)

# `params` is the list of search space parameters (see the examples section)
# - beta distribution with alpha = 2 and beta = 5
# - categorical distribution with custom probabilities
sampler = SamplerJointIndep$new(list(
  Sampler1DRfun$new(params[[2]], function(n) rbeta(n, 2, 5)),
  Sampler1DCateg$new(params[[3]], prob = c(0.2, 0.3, 0.5))
))
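
The sampler is then passed to the optimizer via the sampler control parameter, e.g. (a sketch assuming the sampler object defined above):

optimizer = opt("async_successive_halving", sampler = sampler)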

Super classes

bbotk::Optimizer -> bbotk::OptimizerAsync -> OptimizerAsyncSuccessiveHalving

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.


Method optimize()

Performs the optimization on an OptimInstanceAsyncSingleCrit or OptimInstanceAsyncMultiCrit until termination. The individual evaluations are written to the ArchiveAsync and the result is written to the instance object.

Usage

OptimizerAsyncSuccessiveHalving$optimize(inst)

Arguments

inst

(OptimInstanceAsyncSingleCrit | OptimInstanceAsyncMultiCrit).
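
A minimal end-to-end sketch is given below. It assumes a running Redis server for the rush workers and uses a toy objective whose parameter tagged "budget" is controlled by the optimizer; all names besides the documented API are purely illustrative:

library(bbotk)
library(mlr3hyperband)
library(paradox)

# rush workers require a running Redis server
rush::rush_plan(n_workers = 2)

# toy objective: the parameter tagged "budget" is increased over the stages
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = xs$x^2 / xs$budget),
  domain = ps(
    x = p_dbl(-5, 5),
    budget = p_int(1, 16, tags = "budget")
  ),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

instance = OptimInstanceAsyncSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 20)
)

optimizer = opt("async_successive_halving", eta = 2)
optimizer$optimize(instance)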


Method clone()

The objects of this class are cloneable with this method.

Usage

OptimizerAsyncSuccessiveHalving$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.