
Answer-first summary for fast verification
Answer: Add credentials using environment variables
The correct answer is C. Environment variables let you pass credentials and secrets to a custom MLflow Pyfunc model at serving time without hardcoding them: the model reads the values at runtime, so sensitive information stays separate from the application logic and out of version control and logs. On Databricks Model Serving, an endpoint's environment variables can also reference Databricks secrets, so the plaintext values never appear in the deployment configuration either. Option A (`spark.conf.set()`) sets Spark session configuration, not serving-endpoint credentials. Option B (the Databricks Feature Store API) manages features for training and inference, not credential passing. Option D (passing secrets in plain text) is a security anti-pattern that exposes credentials and should never be used.
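As a concrete illustration, the sketch below shows (1) a serving-endpoint configuration payload whose `environment_vars` entry references a Databricks secret, and (2) how the custom Pyfunc model would read that variable at runtime. The endpoint, model, scope, and key names (`pyfunc-demo`, `llm-creds`, `api-token`) are illustrative placeholders, not values from the question.

```python
import os

# Sketch of a Model Serving endpoint config that injects a secret as an
# environment variable. The "{{secrets/<scope>/<key>}}" reference is
# resolved by the platform at deploy time, so the plaintext value never
# appears in code, config, or logs. All names here are placeholders.
endpoint_config = {
    "name": "pyfunc-demo",
    "config": {
        "served_entities": [
            {
                "entity_name": "my_pyfunc_model",
                "entity_version": "1",
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
                "environment_vars": {
                    "API_TOKEN": "{{secrets/llm-creds/api-token}}"
                },
            }
        ]
    },
}


def load_token() -> str:
    """Inside the custom Pyfunc model, read the credential at runtime
    from the environment rather than from hardcoded source."""
    token = os.environ.get("API_TOKEN")
    if token is None:
        raise RuntimeError("API_TOKEN environment variable not set")
    return token
```

The key design point is that the code only ever names the variable (`API_TOKEN`); the binding between that name and the actual secret value lives in the endpoint configuration, which is what keeps credentials out of the model artifact.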
Author: LeetQuiz Editorial Team
How should a Generative AI Engineer configure a serving endpoint to pass secrets and credentials when deploying a custom MLflow Pyfunc model that returns intermediate results?
A
Use spark.conf.set()
B
Pass variables using the Databricks Feature Store API
C
Add credentials using environment variables
D
Pass the secrets in plain text