
An HR assistant built on Bedrock starts giving imaginative but off-topic answers. The team wants more factual responses.
What should they change?
A. Decrease temperature
B. Increase top-p
C. Raise max-tokens
D. Disable stop-sequences
Explanation:
The temperature parameter controls the trade-off between output creativity and factuality:
Lower temperature (0.0-0.3): More focused, deterministic, and factual responses
Higher temperature (0.7-1.0): More creative, imaginative, and potentially off-topic
Why A is correct:
The problem describes "imaginative but off-topic answers," a classic symptom of a high temperature setting
Decreasing temperature makes the model more focused and factual
This reduces randomness and keeps responses grounded in the actual query
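A minimal sketch of the fix, assuming a boto3 Bedrock Runtime client and the Converse API (the region, model ID, and prompt below are placeholders, not values from the question):

    import boto3

    # Bedrock Runtime client; region and model ID are illustrative assumptions
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = client.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        messages=[{"role": "user", "content": [{"text": "What is our parental leave policy?"}]}],
        # Lowering temperature reduces sampling randomness, keeping answers focused and factual
        inferenceConfig={"temperature": 0.2},
    )
    print(response["output"]["message"]["content"][0]["text"])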
Why others are incorrect:
B: Increasing top-p widens the pool of tokens the model samples from, making responses more diverse and creative rather than more factual
C: Raising max-tokens only increases the maximum response length; it does not improve content quality
D: Disabling stop-sequences could make responses run on without a proper ending, and does nothing to improve factuality
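For contrast, a hedged sketch of a full inferenceConfig for the same converse() call as above, with each parameter mapped to the option it corresponds to (values are illustrative, not recommendations):

    # Plugs into the converse() call shown earlier as inferenceConfig=inference_config
    inference_config = {
        "temperature": 0.2,        # A: lower value = more focused, factual output (the correct fix)
        "topP": 0.9,               # B: raising this widens token selection, favoring diversity over factuality
        "maxTokens": 512,          # C: caps response length only; no effect on factuality
        "stopSequences": ["END"],  # D: strings where generation halts; removing them risks run-on answers
    }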