
Answer-first summary for fast verification
Answer: Inference Tables
Inference Tables is the correct answer because the feature is designed to automatically capture and manage the incoming requests and outgoing responses of a model serving endpoint, logging them to a Delta table without any custom micro-service. The community discussion shows full consensus on option D, with commenters noting that Inference Tables provide built-in request/response logging, support model-performance monitoring, and store prediction data in Delta tables. The other options are unsuitable: Vector Search performs similarity search, Lakeview builds dashboards and visualizations, and DBSQL executes SQL queries; none of these logs endpoint requests and responses.
Author: LeetQuiz Editorial Team
A Generative AI Engineer is using a provisioned throughput model serving endpoint in a RAG application and needs to monitor its incoming requests and outgoing responses. The current method involves a micro-service that sits between the endpoint and the user interface to log data to a remote server.
Which Databricks feature should be used to accomplish this same task without the micro-service?
A. Vector Search
B. Lakeview
C. DBSQL
D. Inference Tables
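To make the answer concrete, below is a minimal sketch of how inference tables can be enabled when configuring a serving endpoint through the Databricks REST API, via the endpoint's auto-capture configuration. The endpoint name, catalog, and schema used here are illustrative assumptions, and the served-model details are omitted; consult the Databricks Model Serving documentation for the authoritative payload shape.

```python
import json

def build_endpoint_config(endpoint_name: str, catalog: str, schema: str) -> dict:
    """Build a serving-endpoint payload with auto-capture (inference tables)
    enabled, so each request and response is logged to a Delta table
    managed by Databricks rather than by a custom micro-service."""
    return {
        "name": endpoint_name,
        "config": {
            # served-entities section omitted for brevity
            "auto_capture_config": {
                "catalog_name": catalog,            # Unity Catalog catalog
                "schema_name": schema,              # schema for the payload table
                "table_name_prefix": endpoint_name, # prefix for the Delta table
                "enabled": True,
            },
        },
    }

# Hypothetical RAG endpoint; names are placeholders, not from the question.
payload = build_endpoint_config("rag-endpoint", "ml", "serving_logs")
print(json.dumps(payload, indent=2))
```

Once capture is enabled, the logged requests and responses land in a Delta table under the configured catalog and schema and can be queried with ordinary SQL for monitoring, which is exactly the task the micro-service performed in the question.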