A finance company fine-tunes a Bedrock model for client reports but finds the text too rigid. They want a slightly broader vocabulary without losing meaning. What should they modify?
A. Decrease temperature
B. Increase top-p
C. Reduce top-k
D. Increase max-tokens
Explanation:
Increasing top-p (nucleus sampling) is the correct approach because:
Top-p sampling controls the cumulative probability threshold for token selection
Higher top-p values (e.g., 0.9 instead of 0.5) allow the model to consider more diverse tokens while maintaining quality
This enables a broader vocabulary without drifting into random or nonsensical output
The model still focuses on high-probability tokens, preserving meaning while increasing variety
Why other options are incorrect:
Decreasing temperature (A): Makes outputs more deterministic and rigid, the opposite of what is needed
Reducing top-k (C): Narrows vocabulary by restricting selection to fewer candidate tokens at each step, making outputs more constrained
Increasing max-tokens (D): Controls response length, not vocabulary diversity
Top-p sampling is ideal for this use case as it balances creativity with coherence, allowing the finance company to get more varied language while maintaining the professional tone and accuracy required for client reports.
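For reference, here is a minimal boto3 sketch of how top-p might be raised when calling the Bedrock Runtime Converse API. The model ID, prompt, and parameter values are illustrative assumptions, not part of the question; in practice, a fine-tuned custom model would typically be referenced by its provisioned model ARN rather than a base model ID.

```python
import boto3

# Bedrock Runtime client (region is an assumption for this sketch)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    # Example base model ID; a fine-tuned model would use its provisioned model ARN
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize Q3 performance for the client report."}],
        }
    ],
    inferenceConfig={
        "temperature": 0.7,  # left unchanged; not the lever being adjusted here
        "topP": 0.9,         # raised (e.g., from 0.5) to widen the pool of candidate tokens
        "maxTokens": 512,    # caps response length only; does not affect vocabulary diversity
    },
    # Note: top-k, where a model supports it, is passed via additionalModelRequestFields
    # and is model-specific; it is intentionally not used here.
)

print(response["output"]["message"]["content"][0]["text"])
```

Raising topP while leaving temperature alone is the targeted change the question describes: the model can draw from a larger slice of the probability mass, broadening word choice without abandoning high-probability, meaning-preserving tokens.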