
Answer-first summary for fast verification
Answer: 1. Use BigQuery to scale the numerical features. 2. Feed the features into Vertex AI Training. 3. Let TensorFlow perform the one-hot encoding of the categorical features.
Option B is correct because it balances scaling the numerical features and encoding the categorical ones while minimizing cost and effort. Scaling the numerical features directly in BigQuery with SQL is simpler and cheaper than standing up TFX components with Dataflow, since the data already lives in BigQuery and no separate preprocessing infrastructure is required. The scaled features can then be fed straight into Vertex AI Training, where TensorFlow performs the one-hot encoding of the categorical features (such as SKU names) inside the input pipeline. This leverages TensorFlow's optimized encoding layers and avoids the complexity and overhead of an additional preprocessing system.
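The min-max scaling step delegated to BigQuery can be expressed with analytic functions such as `MIN(x) OVER ()` and `MAX(x) OVER ()`. As a minimal sketch (not the graded answer's literal query), the transform computes `(x - min) / (max - min)` per column; the Python below mirrors that formula, with the column values passed as a plain list for illustration:

```python
def min_max_scale(values):
    """Min-max scale a numeric column into [0, 1].

    Mirrors what a BigQuery query of the form
        SELECT (x - MIN(x) OVER ()) / (MAX(x) OVER () - MIN(x) OVER ())
    computes for a numerical feature column.
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        # Constant column: every value maps to 0.0 by convention here.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_scale([10, 20, 30, 40]))
```

Doing this in SQL keeps the hundreds of millions of rows inside BigQuery, so no data is exported just for scaling.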
Author: LeetQuiz Editorial Team
You are tasked with developing a custom classification model using TensorFlow, based on a large tabular dataset stored in BigQuery. This dataset contains hundreds of millions of rows and includes both categorical features, such as SKU names, and numerical features. As part of your preprocessing steps, you must apply a MaxMin scaler to some of the numerical features and use one-hot encoding for some of the categorical features. Your goal is to train this model over multiple epochs efficiently, while also minimizing the effort and cost associated with your solution. Given these requirements, what approach should you take?
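The one-hot encoding step left to TensorFlow would typically use a layer such as `tf.keras.layers.StringLookup` with `output_mode="one_hot"` inside the input pipeline. The dependency-free sketch below shows the same transform conceptually, assuming a small hypothetical SKU vocabulary and reserving index 0 for out-of-vocabulary values, as the TensorFlow layer does by default:

```python
def one_hot_encode(skus, vocab):
    """One-hot encode categorical strings against a fixed vocabulary.

    Conceptual stand-in for TensorFlow's StringLookup with
    output_mode="one_hot"; index 0 is reserved for out-of-vocabulary
    (OOV) tokens, so each row has len(vocab) + 1 slots.
    """
    index = {token: i + 1 for i, token in enumerate(vocab)}  # 0 = OOV
    width = len(vocab) + 1
    rows = []
    for sku in skus:
        row = [0] * width
        row[index.get(sku, 0)] = 1  # unknown SKUs fall into slot 0
        rows.append(row)
    return rows

print(one_hot_encode(["sku_a", "sku_b", "sku_x"], ["sku_a", "sku_b"]))
```

Performing this encoding inside the training job means the categorical columns stream into Vertex AI Training as raw strings, with no intermediate encoded copy of the dataset to build or store.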