
Answer-first summary for fast verification
Answer: Increase the temperature value.
## Explanation

To increase the diversity and creativity of outputs from a large language model (LLM), the AI practitioner should adjust the **temperature** parameter.

### Why Option A (Increase the temperature value) is correct

1. **Temperature controls randomness**: In LLM inference, temperature is a hyperparameter that directly influences the probability distribution during token generation. Higher temperature values (typically >1.0) flatten the distribution, making less likely tokens more probable to be sampled.
2. **Mechanism of action**: When temperature is increased:
   - The model becomes less deterministic
   - It explores a wider range of possible continuations
   - Outputs become more varied and less predictable
   - This leads to more creative and diverse responses
3. **Practical application**: For creative writing, brainstorming, idea generation, or situations where multiple diverse solutions are needed, increasing the temperature is the standard approach.

### Why the other options are less suitable

**Option B (Decrease the Top K value)**:
- Top K sampling limits the model to the K most probable tokens at each step
- Decreasing Top K makes the model more constrained, not more diverse
- This would actually reduce creativity by narrowing the selection pool

**Option C (Increase the response length)**:
- Response length controls how many tokens the model generates
- While longer responses might contain more content, they do not inherently increase diversity or creativity
- The model could simply generate more of the same predictable content

**Option D (Decrease the prompt length)**:
- Prompt length affects the input context but does not directly control output diversity
- Shorter prompts might lead to less specific outputs, but this does not systematically increase creativity
- The relationship between prompt length and creativity is indirect and unreliable

### Best Practice Considerations

When adjusting temperature for increased creativity, practitioners should be aware that:

- Very high temperatures (>1.5–2.0) can lead to incoherent or nonsensical outputs
- The optimal temperature depends on the specific use case and model
- It is often beneficial to experiment with temperature values between 0.7 and 1.3 for balanced creativity
- Temperature is typically the first parameter to modify when seeking more diverse outputs, since it provides the most direct control over randomness while maintaining coherence
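The effect of temperature and Top K described above can be sketched in a few lines of Python. This is a simplified illustration, not a real inference stack: the `logits` values are made up, and `softmax` / `top_k_filter` are hypothetical helper names. It shows that dividing logits by a higher temperature flattens the resulting probabilities (option A increases diversity), while a smaller K shrinks the candidate pool (option B reduces it).

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities; temperature rescales the logits first."""
    # Dividing by a higher temperature flattens the distribution,
    # giving lower-probability tokens a better chance of being sampled.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_filter(probs, k):
    """Zero out all but the k most probable tokens, then renormalize."""
    keep = set(sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:k])
    filtered = [p if i in keep else 0.0 for i, p in enumerate(probs)]
    total = sum(filtered)
    return [p / total for p in filtered]

logits = [4.0, 2.0, 1.0, 0.5]  # hypothetical logits for a 4-token vocabulary

cool = softmax(logits, temperature=0.5)  # sharper: the top token dominates
warm = softmax(logits, temperature=1.5)  # flatter: rarer tokens gain mass

print(f"T=0.5 top-token prob: {cool[0]:.3f}")
print(f"T=1.5 top-token prob: {warm[0]:.3f}")

# Decreasing Top K narrows the candidate pool, the opposite of more diversity:
print(top_k_filter(warm, k=2))  # only the two most likely tokens survive
```

Running this shows the top token's probability dropping as temperature rises, which is exactly why sampling at higher temperature visits a wider range of continuations.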
Author: LeetQuiz Editorial Team
To increase the diversity and creativity of outputs from a large language model (LLM), how should an AI practitioner modify the inference parameters?
A. Increase the temperature value.
B. Decrease the Top K value.
C. Increase the response length.
D. Decrease the prompt length.