
A company wants to use large language models (LLMs) with Amazon Bedrock to develop a chat interface for the company's product manuals. The manuals are stored as PDF files. Which solution meets these requirements MOST cost-effectively?
Explanation:
Correct Answer: D - Upload PDF documents to an Amazon Bedrock knowledge base. Use the knowledge base to provide context when users submit prompts to Amazon Bedrock.
Cost-Effectiveness: Amazon Bedrock knowledge bases are designed specifically for Retrieval-Augmented Generation (RAG) applications. They provide an efficient way to retrieve relevant information from documents without needing to include entire documents in every prompt.
Efficient Context Management: Knowledge bases automatically chunk the ingested documents, generate embeddings, store them in a vector store, and retrieve only the passages relevant to each query. Every prompt therefore carries a small slice of context rather than the full manuals, as illustrated in the sketch below.
Scalability: As the number of PDF files grows, the knowledge base handles ingestion and retrieval efficiently, so per-query costs do not grow in proportion to the size of the document set.
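For illustration, here is a minimal sketch of how a chat interface could query a Bedrock knowledge base with the RetrieveAndGenerate API. The knowledge base ID and model ARN are placeholders, not values from the question.

```python
import boto3

# The Bedrock Agent Runtime client exposes the RetrieveAndGenerate API
# used to query a knowledge base and generate a grounded answer.
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

def ask_manuals(question: str) -> str:
    """Retrieve relevant manual passages and generate an answer from them."""
    response = client.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                # Placeholder knowledge base ID and model ARN.
                "knowledgeBaseId": "YOUR_KB_ID",
                "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/"
                            "anthropic.claude-3-haiku-20240307-v1:0",
            },
        },
    )
    return response["output"]["text"]

print(ask_manuals("How do I reset the device to factory settings?"))
```

Only the retrieved chunks are sent to the model, which is what keeps per-request token costs low compared with stuffing whole PDFs into every prompt.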
Option A & B (Prompt Engineering with PDF context): Including the PDF content directly in each prompt inflates token usage on every request, becomes expensive at scale, and can exceed the model's context window as the manuals grow.
Option C (Fine-tuning a model): Fine-tuning requires preparing training data, paying for training and for hosting a custom model, and repeating the process whenever the manuals change. That is far more costly than retrieval for a document Q&A use case.
This approach provides the best balance of accuracy, performance, and cost for a chat interface with product manuals.