
Asynchronous Hyperparameter Tuning with Successive Halving
Source: R/TunerAsyncSuccessiveHalving.R
mlr_tuners_async_successive_halving.Rd
The OptimizerAsyncSuccessiveHalving class implements the Asynchronous Successive Halving Algorithm (ASHA). It is the asynchronous version of OptimizerBatchSuccessiveHalving.
Source
Li L, Jamieson K, Rostamizadeh A, Gonina E, Ben-tzur J, Hardt M, Recht B, Talwalkar A (2020). “A System for Massively Parallel Hyperparameter Tuning.” In Dhillon I, Papailiopoulos D, Sze V (eds.), Proceedings of Machine Learning and Systems, volume 2, 230–246. https://proceedings.mlsys.org/paper_files/paper/2020/hash/a06f20b349c6cf09a6b171c71b88bbfc-Abstract.html.
Dictionary
This mlr3tuning::Tuner can be instantiated via the dictionary mlr3tuning::mlr_tuners or with the associated sugar function mlr3tuning::tnr():
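For instance, retrieval by dictionary key might look as follows (the key "async_successive_halving" is assumed here, matching the Rd file name):

```r
library(mlr3tuning)
library(mlr3hyperband)

# construct the tuner via the sugar function
tuner = tnr("async_successive_halving")

# equivalent lookup via the dictionary itself
tuner = mlr_tuners$get("async_successive_halving")
```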
Subsample Budget
If the learner lacks a natural budget parameter, mlr3pipelines::PipeOpSubsample can be applied to use the subsampling rate as the budget parameter. The resulting mlr3pipelines::GraphLearner is fitted on small proportions of the mlr3::Task in the first stage, and on the complete task in the last stage.
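A sketch of this construction, assuming the mlr3pipelines operators and a classif.rpart learner (the parameter ranges are illustrative):

```r
library(mlr3)
library(mlr3pipelines)
library(mlr3tuning)

# prepend a subsampling step so the subsampling rate acts as the budget
graph_learner = as_learner(po("subsample") %>>% lrn("classif.rpart"))

# tag subsample.frac as the budget parameter and define its range;
# early stages train on small fractions, the final stage on the full task
graph_learner$param_set$set_values(
  subsample.frac = to_tune(p_dbl(3^-3, 1, tags = "budget")),
  classif.rpart.cp = to_tune(1e-4, 1e-1, logscale = TRUE)
)
```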
Custom Sampler
The tuner supports a custom paradox::Sampler object for drawing the initial configurations. A custom sampler may look like this (the full example is given in the examples section):
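As an illustration, a joint sampler that draws one parameter from a beta distribution and the other uniformly might be sketched as below. The parameter names are hypothetical, and the exact constructor arguments of the Sampler1D classes depend on the installed paradox version:

```r
library(paradox)

# illustrative two-dimensional search space
search_space = ps(
  fraction = p_dbl(0, 1),
  minsplit = p_int(2, 128)
)

# draw `fraction` from a beta distribution instead of uniformly,
# and `minsplit` uniformly; combine the 1D samplers independently
sampler = SamplerJointIndep$new(list(
  Sampler1DRfun$new(search_space$subset("fraction"), function(n) rbeta(n, 2, 5)),
  Sampler1DUnif$new(search_space$subset("minsplit"))
))
```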
Parameters
eta
numeric(1)
With every stage, the budget is increased by a factor of eta and only the best 1 / eta configurations are promoted to the next stage. Non-integer values are supported, but eta must be greater than 1.

sampler
paradox::Sampler
Object defining how the samples of the parameter space should be drawn. The default is uniform sampling.
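For example, with eta = 2 the budget doubles at every stage and only the best half of the configurations is promoted; starting from a minimum budget of 1 and a maximum of 8, the stages run at budgets 1, 2, 4, and 8. Setting the parameter via the sugar function (dictionary key assumed):

```r
library(mlr3tuning)
library(mlr3hyperband)

# eta = 2: double the budget per stage, promote the top 1/2
tuner = tnr("async_successive_halving", eta = 2)
```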
Archive
The bbotk::Archive holds the following additional columns that are specific to SHA:
stage
(integer(1))
Stage index. Starts counting at 0.

asha_id
(character(1))
Unique identifier for each configuration across stages.
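After a tuning run, these columns allow a configuration to be traced through the stages. A hedged sketch, assuming a finished tuning instance `instance` and a classification error measure (column name `classif.ce`):

```r
library(data.table)

# one row per evaluation; group by asha_id to follow a configuration
# across stages (instance is a tuning instance produced by a prior run)
as.data.table(instance$archive)[, .(asha_id, stage, classif.ce)]
```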
Super classes
mlr3tuning::Tuner
-> mlr3tuning::TunerAsync
-> mlr3tuning::TunerAsyncFromOptimizerAsync
-> TunerAsyncSuccessiveHalving