
## Answer

**B. Few-shot prompting**
## Explanation

**Few-shot prompting** is the correct technique because:

1. **Definition**: Few-shot prompting involves providing the model with a few examples (typically 2-5) of input-output pairs that demonstrate the desired pattern or format before asking it to generate a response for a new input.

2. **Why it fits the scenario**: The team wants to "show a few example inputs and expected outputs within the prompt" to help the model produce consistent outputs. This is exactly what few-shot prompting does: it gives the model concrete examples to learn from.

3. **Comparison with other options**:
   - **Zero-shot prompting (A)**: The model is asked to perform a task without any examples. This wouldn't provide the consistency the team is looking for.
   - **Chain-of-thought prompting (C)**: This technique asks the model to explain its reasoning step by step, which is useful for complex reasoning tasks but not specifically for providing example patterns.
   - **Self-reflection prompting (D)**: This involves asking the model to critique or improve its own responses, which is different from providing example patterns.

4. **Practical application**: In Amazon Bedrock, few-shot prompting would involve structuring the prompt like:

   ```
   Example 1:
   Input: "Show me all customers from New York"
   Output: "SELECT * FROM customers WHERE city = 'New York'"

   Example 2:
   Input: "Find orders placed in January 2024"
   Output: "SELECT * FROM orders WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'"

   Now generate SQL for: "Get products with price over $100"
   ```

This approach helps the model understand the specific format and style the team wants for SQL queries, leading to more consistent and accurate outputs.
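As a minimal sketch, a few-shot prompt like the one above can be assembled programmatically before being sent to a Bedrock model. The helper name and example data here are illustrative assumptions, not part of any Bedrock API:

```python
# Illustrative sketch: build a few-shot prompt from (input, output) example
# pairs. The function name and structure are hypothetical, not a Bedrock API.

def build_few_shot_prompt(examples, new_input):
    """Format example input/output pairs, then append the new request."""
    parts = []
    for i, (question, sql) in enumerate(examples, start=1):
        parts.append(f'Example {i}:\nInput: "{question}"\nOutput: "{sql}"')
    parts.append(f'Now generate SQL for: "{new_input}"')
    return "\n\n".join(parts)

examples = [
    ("Show me all customers from New York",
     "SELECT * FROM customers WHERE city = 'New York'"),
    ("Find orders placed in January 2024",
     "SELECT * FROM orders WHERE order_date "
     "BETWEEN '2024-01-01' AND '2024-01-31'"),
]

prompt = build_few_shot_prompt(examples, "Get products with price over $100")
print(prompt)
```

The resulting string would then be sent as the user message in a Bedrock invocation (for example, via the `Converse` API in `boto3`), so every request carries the same demonstration pattern and the model's SQL output stays consistent.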
Author: Ritesh Yadav
A data analytics team is using Amazon Bedrock to generate SQL queries from natural language prompts. They want the model to produce consistent outputs by showing a few example inputs and expected outputs within the prompt. Which prompting technique should they use?
A
Zero-shot prompting
B
Few-shot prompting
C
Chain-of-thought prompting
D
Self-reflection prompting