
Answer-first summary for fast verification
Answer: Use feature crosses
With many training instances but only a few features, the model likely lacks the representational capacity to fit the data, so introducing synthetic features through feature crosses is a sound way to improve performance. Techniques like dropout address overfitting, not underfitting. Backpropagation is the algorithm for computing parameter gradients in neural networks, and gradient descent is the optimizer that applies them; both are already part of standard training, so neither fixes underfitting on its own. For more detail, see [Google's Machine Learning Crash Course on Feature Crosses](https://developers.google.com/machine-learning/crash-course/feature-crosses/encoding-nonlinearity).
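To make the idea concrete, here is a minimal sketch of the two common forms of feature cross: a numeric product of two features, and a one-hot cross of two binned categorical features. The feature names and bin counts are illustrative, not from the question.

```python
# Minimal sketch of feature crosses (illustrative feature names/bin counts).

def numeric_cross(x1, x2):
    """Numeric cross: the product of two features, letting a linear
    model capture their interaction."""
    return x1 * x2

def one_hot_cross(bin1, bin2, n_bins1, n_bins2):
    """Categorical cross: one-hot encode the (bin1, bin2) pair,
    yielding n_bins1 * n_bins2 synthetic features."""
    vec = [0] * (n_bins1 * n_bins2)
    vec[bin1 * n_bins2 + bin2] = 1
    return vec

# A row with features [x1, x2] gains a third synthetic feature x1 * x2:
row = [2.0, 3.0]
row.append(numeric_cross(row[0], row[1]))
print(row)  # [2.0, 3.0, 6.0]

# Binned latitude (bin 1 of 3) crossed with binned longitude (bin 2 of 4)
# becomes a single active slot in a 12-way one-hot vector:
print(one_hot_cross(1, 2, 3, 4))
```

Each cross expands the feature space, giving an underfitting model more capacity without collecting new data.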
Author: LeetQuiz Editorial Team
You're working on a deep learning model that has a small number of features but a large number of instances. Despite your efforts, the model's performance is below expectations, and you suspect it's underfitting. Which technique would you consider to enhance its performance?
A
Use backpropagation
B
Use feature crosses
C
Use gradient descent
D
Use dropout