
A finance company fine-tunes a Bedrock model for client reports but finds the text too rigid. They want a slightly broader vocabulary without losing meaning. What should they modify?
A. Decrease temperature
B. Increase top-p
C. Reduce top-k
D. Increase max-tokens
Explanation:
Correct Answer: B (Increase top-p)
Why this is correct:
Top-p (Nucleus Sampling): This parameter controls the cumulative probability threshold for token selection. When you increase top-p, you allow the model to consider a broader set of possible tokens that collectively reach the specified probability threshold. This results in more diverse vocabulary while still maintaining coherence and meaning.
The problem context: The company wants "a slightly broader vocabulary without losing meaning." This is exactly what increasing top-p achieves: it expands the pool of considered tokens while still filtering out low-probability, nonsensical options.
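To make the mechanism concrete, here is a minimal sketch of nucleus (top-p) token selection over a hypothetical next-token distribution (the tokens and probabilities are invented for illustration, not taken from any real model). Raising top-p admits more candidate words, which is why it broadens vocabulary:

```python
def nucleus_pool(probs, top_p):
    """Return the tokens kept by top-p (nucleus) sampling: the smallest
    set of highest-probability tokens whose cumulative probability
    reaches the top_p threshold."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    pool, cumulative = [], 0.0
    for token, p in ranked:
        pool.append(token)
        cumulative += p
        if cumulative >= top_p:
            break
    return pool

# Hypothetical distribution for "The quarterly results were ___"
probs = {"strong": 0.50, "solid": 0.25, "robust": 0.15,
         "encouraging": 0.07, "stellar": 0.03}

print(nucleus_pool(probs, 0.60))  # ['strong', 'solid'] -- narrow vocabulary
print(nucleus_pool(probs, 0.95))  # four tokens -- slightly broader vocabulary
```

Note that even at top-p = 0.95 the lowest-probability token ("stellar") is still excluded, which is how the model broadens word choice without drifting into nonsensical output.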
Why other options are incorrect:
A. Decrease temperature: Temperature controls randomness. Decreasing it makes output more deterministic and focused on high-probability tokens, which would make the text even MORE rigid, not less.
C. Reduce top-k: Top-k limits the number of tokens considered to the k most probable ones. Reducing top-k would make the output MORE constrained, not broader.
D. Increase max-tokens: This parameter controls the maximum length of generated text, not vocabulary diversity. Increasing it would simply allow longer responses, not affect vocabulary breadth.
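For context, these parameters appear together in an Amazon Bedrock request. The fragment below shows an illustrative `inferenceConfig` as used by the Bedrock Converse API (the specific values are examples, not recommendations; top-k is model-specific and is passed separately via `additionalModelRequestFields` for models that support it):

```json
{
  "inferenceConfig": {
    "temperature": 0.7,
    "topP": 0.95,
    "maxTokens": 1024
  }
}
```

For the scenario in this question, the company would raise `topP` slightly while leaving `temperature` and `maxTokens` alone.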
Key concepts for AWS Certified Cloud Practitioner: