
Google Professional Machine Learning Engineer
You are tasked with developing a fraud detection model using Keras and TensorFlow. The records of customer transactions, which serve as your dataset, are stored in a large table in BigQuery. Before training your model, you need to preprocess these records in a way that is both cost-effective and efficient. Additionally, the ultimate goal is to use the trained model for batch inference directly in BigQuery. Considering these requirements, how should you implement the preprocessing workflow?
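A common pattern for this scenario is to push heavy, set-level preprocessing into BigQuery itself (SQL over the large table, or a BigQuery ML `TRANSFORM` clause with functions such as `ML.STANDARD_SCALER`), so the same transformation is applied automatically at batch-inference time. As a hedged illustration only — the feature values below are invented, and this pure-Python sketch merely mirrors what a standard-scaler transform computes, not any specific API from the question:

```python
from statistics import mean, pstdev

def standardize(values):
    """Z-score standardization: (x - mean) / population std.
    This is the same computation that a standard-scaler
    preprocessing step (e.g. BigQuery ML's ML.STANDARD_SCALER,
    or a Keras Normalization layer) applies to a numeric
    feature before training."""
    mu = mean(values)
    sigma = pstdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical transaction amounts, for illustration only
amounts = [10.0, 20.0, 30.0, 40.0, 50.0]
scaled = standardize(amounts)
```

Because the scaling is defined as part of the preprocessing workflow rather than hand-applied to the training set alone, the identical transform is available when the trained model is later used for batch inference over the BigQuery table.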
Exam-Like