
Answer-first summary for fast verification
Answer: Random search can be more efficient than grid search, especially when some hyperparameters are more important than others, but it is less efficient than Bayesian optimization.
Random search samples hyperparameter configurations independently at random from specified ranges or distributions, rather than exhaustively evaluating every combination on a fixed grid. It can be more efficient than grid search, especially when some hyperparameters matter much more than others: each random trial tries a fresh value of every hyperparameter, whereas a grid repeatedly re-evaluates the same few values of the important ones, wasting evaluations on unpromising regions. However, random search is generally less sample-efficient than Bayesian optimization, which fits a surrogate model to past results and uses an acquisition function to choose the next configuration to evaluate. Random search is often the preferred choice when the parameter space is large and the cost of evaluating each configuration is low enough to afford many trials, or when trials must run in parallel without coordination.
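The loop below is a minimal sketch of random search, assuming a hypothetical `validation_score` function that stands in for training a model and measuring its validation performance. It samples each hyperparameter independently (log-uniform for the learning rate, uniform integers for the layer count) and keeps the best configuration seen.

```python
import random

# Hypothetical stand-in for "train a model with these hyperparameters and
# return its validation score"; real tuning would call a training pipeline.
# learning_rate is made far more influential than num_layers, mimicking the
# "some hyperparameters are more important than others" scenario.
def validation_score(learning_rate, num_layers):
    return -(learning_rate - 0.1) ** 2 - 0.001 * (num_layers - 3) ** 2

random.seed(0)  # fixed seed so the sketch is reproducible

best_score, best_params = float("-inf"), None
for _ in range(50):
    # Each trial samples every hyperparameter afresh from its range.
    params = {
        "learning_rate": 10 ** random.uniform(-3, 0),  # log-uniform in [0.001, 1]
        "num_layers": random.randint(1, 8),
    }
    score = validation_score(**params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params, best_score)
```

Because every trial is independent, all 50 evaluations could run in parallel, which is one practical reason to prefer random search over sequential methods like Bayesian optimization when compute is plentiful and evaluations are cheap.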
Author: LeetQuiz Editorial Team
Describe the process of hyperparameter tuning using random search and discuss the advantages and disadvantages of this method compared to other tuning methods like grid search and Bayesian optimization. Provide examples of scenarios where random search might be the preferred choice.
A
Random search is always inferior to grid search and Bayesian optimization and should not be used.
B
Random search can be more efficient than grid search, especially when some hyperparameters are more important than others, but it is less efficient than Bayesian optimization.
C
Random search is identical to grid search but with a different name.
D
Random search is only used for tuning hyperparameters of deep learning models, not other types of models.