
A company wants to use large language models (LLMs) with Amazon Bedrock to develop a chat interface for the company's product manuals. The manuals are stored as PDF files. Which solution meets these requirements MOST cost-effectively?
A
Use prompt engineering to add one PDF file as context to the user prompt when the prompt is submitted to Amazon Bedrock.
B
Use prompt engineering to add all the PDF files as context to the user prompt when the prompt is submitted to Amazon Bedrock.
C
Use all the PDF documents to fine-tune a model with Amazon Bedrock. Use the fine-tuned model to process user prompts.
D
Upload PDF documents to an Amazon Bedrock knowledge base. Use the knowledge base to provide context when users submit prompts to Amazon Bedrock.
Explanation:
Correct Answer: D - Upload PDF documents to an Amazon Bedrock knowledge base. Use the knowledge base to provide context when users submit prompts to Amazon Bedrock.
Cost-Effectiveness: Amazon Bedrock knowledge bases are designed specifically for Retrieval-Augmented Generation (RAG) applications. They provide an efficient way to retrieve relevant information from documents without needing to include entire documents in every prompt.
Efficient Context Management: Knowledge bases automatically chunk the documents, generate embeddings, store them in a vector index, and retrieve only the most relevant passages at query time, so each prompt carries a small amount of context instead of entire manuals.
Scalability: As the number of PDF files grows, knowledge bases handle the scaling efficiently without increasing costs linearly.
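As a rough sketch of how a chat interface would query such a knowledge base, the helper below builds the request payload for Bedrock's RetrieveAndGenerate operation. The knowledge base ID and model ARN are placeholders, not values from the question:

```python
def build_rag_request(question: str, kb_id: str, model_arn: str) -> dict:
    """Build kwargs for the Bedrock RetrieveAndGenerate API, which
    retrieves relevant chunks from the knowledge base and passes them
    to the model as context alongside the user's question."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,      # placeholder knowledge base ID
                "modelArn": model_arn,         # placeholder foundation-model ARN
            },
        },
    }

# With AWS credentials configured, the request would be sent via the
# bedrock-agent-runtime client, for example:
#   import boto3
#   client = boto3.client("bedrock-agent-runtime")
#   resp = client.retrieve_and_generate(**build_rag_request(
#       "How do I reset the device?", "MY_KB_ID",
#       "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0"))
#   print(resp["output"]["text"])
```

Only the retrieved passages travel with each prompt, which is what keeps per-request token usage low as the manual collection grows.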
Option A & B (Prompt engineering with PDF context): Adding a single PDF file as context omits information from all the other manuals, while adding every PDF file inflates each prompt with a very large number of input tokens (and may exceed the model's context window). Because input tokens are billed per request, costs grow with the size of the document set on every single prompt.
Option C (Fine-tuning a model): Fine-tuning incurs significant up-front training costs, typically requires hosting the custom model, and must be repeated whenever the manuals change. It is also poorly suited to factual lookup: fine-tuning shapes a model's style and behavior rather than reliably memorizing document content.
This approach provides the best balance of accuracy, performance, and cost for a chat interface with product manuals.