
Answer-first summary for fast verification
Answer: Use MLflow to package the model and deploy it as a REST API endpoint.
The recommended approach is to use MLflow to package the model and deploy it as a REST API endpoint (Option D). MLflow encapsulates the model, its dependencies, and the configuration it needs into a single artifact, which simplifies management in a production environment and supports real-time predictions over HTTP. The other options fall short for this use case: saving the model to a Delta table (Option A) or exporting it as a serialized file to a separate server (Option C) scales poorly and is not suited to real-time serving, while scheduling a periodic Databricks Job (Option B) fits batch scoring rather than on-demand prediction.
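As a minimal sketch of the pattern: a model logged with MLflow (e.g. via `mlflow.sklearn.log_model`) can be served with `mlflow models serve -m <model_uri>`, which exposes a POST `/invocations` endpoint. The snippet below builds a request payload in MLflow's `dataframe_split` JSON format; the feature names and values are hypothetical, and the actual HTTP call is shown only as a comment since it requires a running model server.

```python
import json

# A model logged with MLflow, e.g.:
#   mlflow.sklearn.log_model(model, "model")
# can be served locally with:
#   mlflow models serve -m runs:/<run_id>/model -p 5000
# The server then accepts POST requests at /invocations.

# Build a scoring request in the "dataframe_split" payload format.
# Column names and rows here are hypothetical example features.
payload = {
    "dataframe_split": {
        "columns": ["feature_1", "feature_2"],
        "data": [[0.5, 1.2], [3.1, 0.7]],
    }
}
body = json.dumps(payload)

# With a server running, the request would look like:
#   requests.post("http://127.0.0.1:5000/invocations",
#                 headers={"Content-Type": "application/json"},
#                 data=body)
print(body)
```

This is what "real-time predictions through HTTP requests" looks like in practice: any client that can issue a JSON POST can score against the deployed model.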
Author: LeetQuiz Editorial Team
After successfully training and evaluating a machine learning model with Databricks MLlib, a data scientist is ready to deploy the model for real-time predictions in a production environment. What is the recommended approach?
A
Save the model to a Delta table and query it for predictions.
B
Schedule a Databricks Job to run the model periodically.
C
Export the model as a serialized file and deploy it on a separate server.
D
Use MLflow to package the model and deploy it as a REST API endpoint.