
A team notices that their LLM responses are too random and inconsistent. They want answers to be more deterministic and stable while still allowing slight variability. Which adjustment should they make?
A
Increase the temperature to 1.2
B
Decrease the temperature to 0.2
C
Increase top-k to 200
D
Increase top-p to 0.95
Explanation:
Temperature is a parameter that controls the randomness of LLM responses: lower values sharpen the probability distribution over next tokens, making outputs more deterministic, while higher values flatten it, making outputs more varied.
Why B is correct: Decreasing the temperature to 0.2 makes the model strongly favor its highest-probability tokens, producing stable, consistent answers while still leaving slight variability (unlike a temperature of 0, which is fully deterministic greedy decoding).
Why other options are incorrect: Increasing the temperature to 1.2 (A) makes outputs more random, not less. Increasing top-k to 200 (C) or top-p to 0.95 (D) widens the pool of candidate tokens at each step, which increases variability rather than reducing it.
Key Concept: Temperature is the primary parameter for controlling randomness vs. determinism in LLM outputs.
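To make the effect concrete, here is a minimal sketch of how temperature rescales a model's next-token probabilities. The logit values are made-up illustrative scores, not from any real model; the point is that dividing logits by a low temperature (0.2) concentrates probability on the top token, while a high temperature (1.2) spreads it out.

```python
import math

def softmax(logits, temperature=1.0):
    # Divide logits by temperature before normalizing:
    # T < 1 sharpens the distribution (more deterministic),
    # T > 1 flattens it (more random).
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical next-token scores

low_t = softmax(logits, temperature=0.2)   # sharp: top token dominates
high_t = softmax(logits, temperature=1.2)  # flat: probability spreads out

print([round(p, 3) for p in low_t])
print([round(p, 3) for p in high_t])
```

Sampling from the low-temperature distribution almost always picks the same top token, matching the "deterministic and stable, with slight variability" behavior the team wants.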