
Answer-first summary for fast verification
Answer: B (`learning_rate`) and D (`num_hidden_layers`)
The `learning_rate` hyperparameter controls how much the model weights change in response to the estimated error on each update, while `num_hidden_layers` defines the depth of the network and therefore its capacity to learn from the data. Both directly affect model quality, which makes them the parameters to expose to the hyperparameter tuning service. Options A (`scaleTier`) and C (`parameterServerType`) configure the training infrastructure, not the model, so the tuning service has nothing to optimize there. Option E (`batch_size`), while important for training, is not typically specified as a tuning hyperparameter in this context. For more details, refer to the GCP documentation on hyperparameter tuning.
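On AI Platform, the tuned parameters are declared in the `hyperparameters` section of the training job's configuration file. The sketch below shows how the two answers could be declared; the metric tag, trial counts, and value ranges are illustrative assumptions, not values from the question.

```yaml
# hptuning_config.yaml -- illustrative sketch, not a definitive config.
trainingInput:
  hyperparameters:
    goal: MAXIMIZE
    hyperparameterMetricTag: accuracy   # assumed metric name reported by the trainer
    maxTrials: 20                       # assumed trial budget
    maxParallelTrials: 2
    params:
      - parameterName: learning_rate
        type: DOUBLE
        minValue: 0.0001                # assumed search range
        maxValue: 0.1
        scaleType: UNIT_LOG_SCALE
      - parameterName: num_hidden_layers
        type: INTEGER
        minValue: 1                     # assumed search range
        maxValue: 5
```

Each trial then receives a concrete value for both parameters, and the service searches the ranges for the combination that maximizes the reported metric.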
Author: LeetQuiz Editorial Team
Your team is developing a custom Deep Neural Network model using TensorFlow on AI Platform to predict medical diagnoses from diagnostic images. This task is complex and resource-intensive, requiring optimization for both accuracy and computational efficiency. Given the constraints of limited computational resources and the need for rapid iteration, you decide to utilize GCP's hyperparameter tuning feature to optimize your model. Which two parameters must you specify to effectively guide the hyperparameter tuning process towards achieving the best model performance under these constraints? (Choose two)
A. scaleTier
B. learning_rate
C. parameterServerType
D. num_hidden_layers
E. batch_size
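AI Platform delivers each trial's hyperparameter values to the trainer as command-line flags, so the training code must define matching arguments. A minimal sketch of that entry point (flag names match the tuning spec; defaults are illustrative):

```python
import argparse

def parse_args(argv=None):
    # AI Platform passes each tuned hyperparameter as a command-line flag,
    # e.g. --learning_rate=0.05 --num_hidden_layers=3, so the trainer
    # exposes one argument per parameterName in the tuning config.
    parser = argparse.ArgumentParser()
    parser.add_argument("--learning_rate", type=float, default=0.01,
                        help="step size for weight updates (tuned)")
    parser.add_argument("--num_hidden_layers", type=int, default=2,
                        help="depth of the DNN (tuned)")
    return parser.parse_args(argv)

# Simulate the flags one tuning trial would pass.
args = parse_args(["--learning_rate", "0.05", "--num_hidden_layers", "3"])
print(f"learning_rate={args.learning_rate}, num_hidden_layers={args.num_hidden_layers}")
```

The trainer would then build and fit the model with these values and report the evaluation metric back to the tuning service.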