
An HR assistant model must answer employee policy questions based on the company handbook stored in an S3 bucket. The system retrieves relevant policy paragraphs and includes them in the prompt before generating the answer. Which prompting approach is being used?
A
Few-shot Prompting
B
Contextual / Retrieval-Augmented Prompting (RAG)
C
Zero-shot Prompting
D
Chain-of-Thought Prompting
Explanation:
The correct answer is B. This scenario describes Retrieval-Augmented Generation (RAG), a contextual prompting approach. Here's why:
- External Knowledge Retrieval: the system retrieves relevant information from an external source (the company handbook stored in an S3 bucket).
- Context Injection: the retrieved content is included in the prompt before the answer is generated.
- Dynamic Information Access: unlike static prompts, RAG fetches relevant information dynamically based on the query.
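To make the retrieve-then-prompt flow concrete, here is a minimal, self-contained sketch. The HANDBOOK snippets, the keyword-overlap retrieve() helper, and the prompt template are all hypothetical stand-ins; a real system would index documents pulled from S3 and use embedding-based semantic search rather than word overlap.

```python
# Minimal sketch of RAG-style prompt assembly (illustrative only).

HANDBOOK = [
    "Employees accrue 1.5 vacation days per month of service.",
    "Remote work requires written manager approval.",
    "Expense reports must be filed within 30 days of purchase.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval over the handbook paragraphs."""
    q_words = set(question.lower().split())
    scored = sorted(
        HANDBOOK,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str) -> str:
    """Inject the retrieved paragraphs as context ahead of the question."""
    context = "\n".join(f"- {p}" for p in retrieve(question))
    return (
        "Answer the employee's question using only the policy excerpts below.\n"
        f"Policy excerpts:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

print(build_prompt("How many vacation days do employees get?"))
```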
Why the other options are incorrect:
- A) Few-shot Prompting: provides worked examples in the prompt; it does not retrieve external documents.
- C) Zero-shot Prompting: provides no examples or external context; the model answers solely from its pre-trained knowledge.
- D) Chain-of-Thought Prompting: asks the model to reason step by step; it does not retrieve external documents.
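The structural differences between the four options can be seen side by side in the sketch below. The question, example Q/A pairs, and policy excerpt are invented purely for illustration:

```python
# Illustrative prompt skeletons for each answer option.

QUESTION = "How many vacation days do employees get?"

zero_shot = QUESTION  # model answers from pre-trained knowledge alone

few_shot = (  # worked examples guide the output format; nothing is retrieved
    "Q: What is the expense filing deadline? A: 30 days.\n"
    "Q: Who approves remote work? A: The employee's manager.\n"
    f"Q: {QUESTION} A:"
)

chain_of_thought = (  # requests step-by-step reasoning; no external documents
    f"{QUESTION} Think through the relevant policy step by step "
    "before giving the final answer."
)

rag = (  # retrieved handbook text is injected as context (option B)
    "Policy excerpt: Employees accrue 1.5 vacation days per month.\n"
    f"Using only the excerpt above, answer: {QUESTION}"
)

print(rag)
```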
This approach is commonly used in enterprise applications where:
- Company policies, manuals, or documentation need to be referenced.
- The information is too large to fit in a single prompt.
- Documents are frequently updated and must be accessed dynamically.
The S3 bucket serves as the knowledge base, and the retrieval mechanism finds relevant policy paragraphs to provide context for accurate, up-to-date answers.
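As a rough sketch of that setup, the snippet below loads the handbook from S3 with boto3 and splits it into retrievable chunks. The bucket name hr-policy-docs and key employee-handbook.txt are hypothetical, and indexing and answer generation are deliberately omitted:

```python
import boto3

# Fetch the handbook text from the (hypothetical) S3 knowledge base.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="hr-policy-docs", Key="employee-handbook.txt")
handbook_text = obj["Body"].read().decode("utf-8")

# Split into paragraphs so the retriever can select only what fits in
# the prompt; real systems would index these chunks for semantic search.
paragraphs = [p.strip() for p in handbook_text.split("\n\n") if p.strip()]
print(f"Loaded {len(paragraphs)} policy paragraphs from S3.")
```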