
Answer-first summary for fast verification
Answer: Increase top-p
## Explanation

**Correct Answer: B (Increase top-p)**

**Why this is correct:**

1. **Top-p (nucleus sampling)** controls vocabulary diversity by restricting sampling to the smallest set of tokens whose cumulative probability exceeds the threshold p.
2. **Increasing top-p** (e.g., from 0.7 to 0.9) lets the model consider a broader range of candidate tokens while still favoring high-probability ones, which yields a slightly broader vocabulary without losing meaning.
3. This approach is more controlled than a temperature adjustment because it keeps selection tied to probability mass rather than flattening the entire distribution.

**Why the other options are incorrect:**

- **A. Decrease temperature**: Lowering temperature makes the model more deterministic and less creative, which would make the text even more rigid (the opposite of what's needed).
- **C. Reduce top-k**: Reducing top-k limits the number of tokens considered at each step, which would constrain the vocabulary even further.
- **D. Increase max-tokens**: This only raises the maximum length of the generated text; it has no effect on vocabulary diversity or creativity.

**Key Concepts:**

- **Temperature**: Controls randomness (higher = more random, lower = more deterministic)
- **Top-p (nucleus sampling)**: Controls vocabulary diversity via a cumulative-probability threshold
- **Top-k**: Limits the number of tokens considered at each step
- **Max-tokens**: Controls maximum output length

For the specific requirement of "slightly broader vocabulary without losing meaning," increasing top-p is the most appropriate adjustment, as it allows more lexical variety while maintaining semantic coherence.
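The mechanism in point 1 can be sketched in a few lines of plain Python. This is a toy illustration of nucleus sampling, not Bedrock's actual implementation; the logits and thresholds are made up for the example:

```python
import math
import random

def top_p_sample(logits, p=0.9, temperature=1.0):
    """Sample a token index using nucleus (top-p) sampling."""
    # Softmax with temperature scaling
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Sort token indices by probability, descending
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)

    # Keep the smallest set whose cumulative probability reaches p
    nucleus, cum = [], 0.0
    for i in order:
        nucleus.append(i)
        cum += probs[i]
        if cum >= p:
            break

    # Sample within the nucleus, weighted by the original probabilities
    weights = [probs[i] for i in nucleus]
    return random.choices(nucleus, weights=weights, k=1)[0]
```

With a very small p, only the single most likely token survives, so output is nearly deterministic; raising p admits more of the tail, broadening the vocabulary while still excluding low-probability tokens that would break coherence.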
Author: Jin H
A finance company fine-tunes a Bedrock model for client reports but finds the text too rigid. They want a slightly broader vocabulary without losing meaning. What should they modify?
A. Decrease temperature
B. Increase top-p
C. Reduce top-k
D. Increase max-tokens
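In practice, these knobs map to fields in the Bedrock Converse API's `inferenceConfig`. A minimal sketch with boto3, assuming the Converse API; the model ID and parameter values are illustrative, and the actual call requires AWS credentials:

```python
def build_converse_request(prompt, top_p=0.9, temperature=0.7, max_tokens=512):
    """Build keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": "anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {
            "maxTokens": max_tokens,    # output length cap only (option D)
            "temperature": temperature,
            "topP": top_p,              # raise this for a broader vocabulary (option B)
        },
    }

request = build_converse_request("Draft the quarterly client report summary.")
# To send: boto3.client("bedrock-runtime").converse(**request)
```

Note that top-k is not part of the standard `inferenceConfig`; for models that support it, it is passed via `additionalModelRequestFields` instead.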