Other Hyperparameter Estimation Algorithms

Hyperband:

The underlying principle of this algorithm is that if a hyperparameter configuration is destined to be the best after a large number of iterations, it is more likely to perform in the top half of configurations after a small number of iterations. Below is a step-by-step outline of Hyperband; a minimal code sketch follows the list.

  • Randomly sample n hyperparameter configurations from the search space.
  • Train each configuration for k iterations and evaluate its validation loss.
  • Discard the worst-performing half of the configurations.
  • Train the surviving configurations for another k iterations, evaluate again, and discard the bottom half.
  • Repeat until only one configuration remains.
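
The loop below is a minimal Python sketch of this successive-halving procedure. The `train` function and the one-dimensional learning-rate search space are hypothetical placeholders for a real training-and-validation routine, not part of the algorithm itself.

```python
import random

def train(config, iterations):
    """Placeholder: train a model for `iterations` steps and return its
    validation loss. Replace with a real training-and-evaluation loop."""
    # Hypothetical toy objective: loss improves with budget and a good lr.
    return (config["lr"] - 0.01) ** 2 + 1.0 / iterations

def successive_halving(n=16, k=10):
    # Randomly sample n configurations from the (assumed) search space.
    configs = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(n)]
    budget = k
    while len(configs) > 1:
        # Evaluate the validation loss of every surviving configuration.
        ranked = sorted(configs, key=lambda c: train(c, budget))
        # Keep the best half, discard the rest.
        configs = ranked[: len(ranked) // 2]
        budget += k  # survivors get another k iterations
    return configs[0]

print("best configuration:", successive_halving())
```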

Drawbacks:

Because configurations are judged on small budgets, good hyperparameter sets that need more time to converge may be discarded early in the optimization, especially when the number of sampled configurations is large.

Population Based Training (PBT):

Population Based Training (PBT) starts like random search, training many models in parallel. But rather than the networks training independently, it uses information from the rest of the population to refine the hyperparameters and to direct computational resources to models that show promise. It takes inspiration from genetic algorithms: each member of the population, referred to as a worker, can exploit information from the rest of the population. For instance, a worker might copy the model parameters from a better-performing worker. It can also explore new hyperparameters by randomly perturbing the current values.
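
The sketch below illustrates this exploit/explore cycle under strong simplifying assumptions: a toy `train_step` stands in for real training, and only a single hypothetical learning-rate hyperparameter is perturbed. A real PBT system would also copy network weights between workers, not just hyperparameters.

```python
import copy
import random

def train_step(worker):
    """Placeholder for one interval of real training; the score here is a
    toy function of the (hypothetical) learning rate."""
    worker["score"] += random.gauss(worker["hp"]["lr"], 0.01)

# Population of workers, each with its own hyperparameters and score.
population = [{"hp": {"lr": 10 ** random.uniform(-3, -1)}, "score": 0.0}
              for _ in range(8)]

for interval in range(20):
    for w in population:
        train_step(w)
    ranked = sorted(population, key=lambda w: w["score"], reverse=True)
    quarter = len(population) // 4
    for w in ranked[-quarter:]:              # the weakest workers...
        donor = random.choice(ranked[:quarter])
        # Exploit: inherit hyperparameters (and, in a real system, the
        # model weights) from a better-performing worker.
        w["hp"] = copy.deepcopy(donor["hp"])
        w["score"] = donor["score"]
        # Explore: randomly perturb the inherited hyperparameters.
        w["hp"]["lr"] *= random.choice([0.8, 1.2])

print(max(population, key=lambda w: w["score"])["hp"])
```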

Bayesian Optimization and HyperBand (BOHB):

BOHB (Bayesian Optimization and HyperBand) combines the Hyperband algorithm with Bayesian optimization. First, it uses Hyperband's ability to sample many configurations on a small budget, exploring the hyperparameter search space quickly and cheaply and surfacing promising configurations early; it then uses the predictive model of a Bayesian optimizer to propose sets of hyperparameters that are close to the optimum. Like Hyperband, the algorithm can also be run in parallel, which overcomes a major drawback of standard Bayesian optimization.
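
The following sketch shows the BOHB idea rather than a production implementation: a TPE-style kernel-density model (the kind of Bayesian model the BOHB paper uses) proposes configurations, and Hyperband-style brackets allocate the training budget. The one-dimensional log-learning-rate search space and the toy `objective` are assumptions for illustration; libraries such as HpBandSter implement the full algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde

def objective(lr, budget):
    """Placeholder: validation loss after `budget` iterations of training.
    Replace with a real train-and-evaluate call."""
    return (np.log10(lr) + 2.0) ** 2 + 1.0 / budget + np.random.normal(0, 0.01)

history = []  # (log10(lr), loss) pairs, pooled across budgets for simplicity

def propose():
    """TPE-style proposal: model 'good' and 'bad' observations with kernel
    density estimates and pick the candidate maximizing good/bad density."""
    if len(history) < 8:                       # too little data: random search
        return np.random.uniform(-4, -1)
    xs, losses = map(np.array, zip(*history))
    split = np.percentile(losses, 30)          # best ~30% count as "good"
    good = gaussian_kde(xs[losses <= split])
    bad = gaussian_kde(xs[losses > split])
    cands = good.resample(32).ravel()          # candidates from the good model
    return cands[np.argmax(good(cands) / np.maximum(bad(cands), 1e-12))]

# Hyperband-style brackets: many proposals on a small budget, survivors on more.
for bracket in range(4):
    configs = [propose() for _ in range(9)]
    budget = 3
    while len(configs) > 1:
        losses = [objective(10 ** x, budget) for x in configs]
        history.extend(zip(configs, losses))
        keep = np.argsort(losses)[: max(1, len(configs) // 3)]
        configs = [configs[i] for i in keep]
        budget *= 3

print("best log10(lr): %.3f (loss %.4f)" % min(history, key=lambda t: t[1]))
```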
