
## Answer

Provide examples of text passages with corresponding positive or negative labels in the prompt, followed by the new text passage to be classified.
## Explanation

**Option A is correct** because this approach uses **few-shot learning** (also called **in-context learning**), an effective prompt engineering strategy for classification tasks such as sentiment analysis. By providing examples of text passages with their corresponding labels (positive/negative), you teach the LLM the pattern it should follow for the new text passage.

**Why the other options are incorrect:**

- **Option B**: While context about sentiment analysis might be helpful, it gives the LLM no concrete examples to learn from. LLMs benefit more from demonstration than from explanation.
- **Option C**: Providing no context or examples is **zero-shot prompting**, which is less effective for classification tasks where the model needs to understand the specific format and criteria for labeling.
- **Option D**: Examples of unrelated tasks (such as text summarization or question answering) would confuse the model rather than help it learn the sentiment analysis task.

**Key concept:** In prompt engineering for classification tasks, **few-shot learning** (providing labeled examples) is typically more effective than zero-shot prompting or unrelated examples because it demonstrates the exact pattern the model should follow.
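As a minimal sketch of the few-shot pattern Option A describes, the prompt can be assembled as plain text: labeled examples first, then the new passage with its label left blank for the model to complete. The helper name and the example passages below are hypothetical; the resulting string would then be sent to a model on Amazon Bedrock (for example via the `bedrock-runtime` client's `InvokeModel` operation).

```python
def build_few_shot_prompt(examples, new_text):
    """Assemble a few-shot sentiment-classification prompt.

    examples: list of (passage, label) pairs, where label is
              "positive" or "negative".
    new_text: the passage the model should classify.
    """
    lines = ["Classify the sentiment of each text as positive or negative.", ""]
    # Demonstrations: each labeled example shows the pattern to follow.
    for passage, label in examples:
        lines.append(f"Text: {passage}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # The new passage ends with an open "Sentiment:" slot for the model to fill.
    lines.append(f"Text: {new_text}")
    lines.append("Sentiment:")
    return "\n".join(lines)

examples = [
    ("I loved this product; it exceeded my expectations.", "positive"),
    ("The service was slow and the staff were rude.", "negative"),
]
prompt = build_few_shot_prompt(examples, "The delivery arrived two weeks late.")
print(prompt)
```

Because the labeled demonstrations establish both the output vocabulary ("positive"/"negative") and the response format, the model's completion after the final `Sentiment:` is constrained to the desired label, which is exactly why this strategy outperforms the zero-shot and unrelated-example prompts in Options C and D.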
Author: Ritesh Yadav
## Question

A company wants to use a large language model (LLM) on Amazon Bedrock for sentiment analysis. The company wants to classify the sentiment of text passages as positive or negative.

Which prompt engineering strategy meets these requirements?

- **A.** Provide examples of text passages with corresponding positive or negative labels in the prompt, followed by the new text passage to be classified.
- **B.** Provide a detailed explanation of sentiment analysis and how LLMs work in the prompt.
- **C.** Provide the new text passage to be classified without any additional context or examples.
- **D.** Provide the new text passage with a few examples of unrelated tasks, such as text summarization or question answering.