
Answer-first summary for fast verification
Answer: Configure Vertex AI Vector Search as the search platform’s backend.
The question asks for a proof-of-concept search platform that can be built quickly and can handle complex semantic queries over movie metadata such as actors, genre, and director. Vertex AI Vector Search (Option B) is the optimal choice: it is purpose-built for semantic search, performs fast vector-based similarity matching, and can be deployed rapidly without training a custom model. The community discussion supports this with 100% consensus and 4 upvotes, noting that vector search is more efficient for search-style queries than a foundational LLM (Option A), which is better suited to conversational applications; a BERT-based model (Option C), which adds unnecessary training and hosting complexity; and Vertex AI Agent Builder (Option D), which targets conversational agents and introduces higher latency.
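To make "vector-based similarity matching" concrete, here is a minimal, self-contained sketch of the underlying idea using NumPy. This is an illustration of the concept only, not the Vertex AI Vector Search API: the hand-made 3-dimensional vectors and movie titles are hypothetical stand-ins for the metadata embeddings that a real embedding model would produce.

```python
import numpy as np

# Toy vector search: each movie is represented by an embedding vector.
# In practice these embeddings come from a model applied to metadata
# (actors, genre, director); here they are hand-made for illustration.
movies = {
    "Heat":       np.array([0.9, 0.1, 0.0]),  # crime/thriller region
    "Toy Story":  np.array([0.0, 0.9, 0.1]),  # animation/family region
    "The Matrix": np.array([0.7, 0.0, 0.7]),  # sci-fi/action region
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query_vec: np.ndarray, k: int = 2) -> list[tuple[str, float]]:
    """Return the top-k movies ranked by similarity to the query embedding."""
    scored = sorted(
        ((title, cosine_similarity(query_vec, vec)) for title, vec in movies.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return scored[:k]

# A query embedding that lands near the "crime/thriller" region.
results = search(np.array([0.8, 0.2, 0.1]))
print(results[0][0])  # → Heat
```

A managed service such as Vector Search does the same ranking at scale with approximate nearest-neighbor indexes, which is why it delivers low-latency semantic matching without any model training on your part.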
Author: LeetQuiz Editorial Team
You work for a media company with a streaming movie platform. The current search uses simple keyword matching, but you are now seeing more complex semantic queries that include movie metadata like actors, genre, and director. You need to build a proof-of-concept for a revamped search solution that delivers better results as quickly as possible. How should you build this new search platform?
A. Use a foundational large language model (LLM) from Model Garden as the search platform’s backend.
B. Configure Vertex AI Vector Search as the search platform’s backend.
C. Use a BERT-based model and host it on a Vertex AI endpoint.
D. Create the search platform through Vertex AI Agent Builder.