
Answer-first summary for fast verification
**Answer:** Allows the model to consider a broader range of likely next words, increasing variability
**Explanation:** Top-p (nucleus sampling) is a parameter that controls the diversity of generated text by selecting from the smallest set of tokens whose cumulative probability exceeds the top-p value.

- **Increasing the top-p value** (e.g., from 0.7 to 0.9) means the model considers a broader range of likely next words, as it includes more tokens in the sampling pool. This increases variability and creativity in the output.
- **Decreasing the top-p value** (e.g., from 0.9 to 0.5) restricts the model to only the most probable tokens, making the output more focused and deterministic.

For marketing slogans, you want creativity but not complete randomness, so adjusting top-p appropriately helps balance focused creativity against excessive randomness.
Author: Jin H
You are configuring a Bedrock model for generating marketing slogans. To ensure the output remains creative but not too random, you adjust the top-p (nucleus sampling) parameter. What effect does increasing the top-p value have?
A
Limits the model to only the top 1% of probable words
B
Allows the model to consider a broader range of likely next words, increasing variability
C
Makes the model more deterministic and repetitive
D
Forces the model to only select the most probable token each time