
An HR assistant built on Bedrock starts giving imaginative but off-topic answers. The team wants more factual responses. What should they change?
A
Decrease temperature
B
Increase top-p
C
Raise max-tokens
D
Disable stop-sequences
Explanation:
Temperature controls the randomness of the model's token sampling. Lower values (closer to 0) make the model favor the most likely tokens, producing more deterministic, factual, on-topic responses; higher values flatten the distribution and encourage creative but less predictable output. Decreasing temperature directly addresses the imaginative, off-topic answers the team is seeing.
Why other options are incorrect:
B: Increasing top-p widens the pool of candidate tokens considered during sampling, which increases diversity rather than factual accuracy.
C: Raising max-tokens only allows longer responses; it has no effect on how random or factual the content is.
D: Stop-sequences control where generation ends. Disabling them lets the model run longer, but does nothing to keep answers grounded.
Best Practice: For factual, HR-related applications where accuracy and relevance are critical, use a lower temperature setting (0.1-0.3) to ensure the model stays focused on providing accurate, on-topic information.
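As a sketch of that best practice, the snippet below builds an Anthropic Messages request for Bedrock's `InvokeModel` API with `temperature` lowered to 0.2. The model ID and prompt are illustrative assumptions; the point is where the temperature parameter goes in the request body.

```python
import json

# Hypothetical model ID for illustration; any Bedrock text model that
# accepts a temperature parameter works the same way.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

def build_request_body(prompt: str, temperature: float = 0.2) -> str:
    """Build an Anthropic Messages request body for Bedrock InvokeModel.

    A low temperature (0.1-0.3) biases sampling toward the most likely
    tokens, keeping HR answers factual and on-topic.
    """
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "temperature": temperature,  # lowered from a typical default near 1.0
        "messages": [{"role": "user", "content": prompt}],
    })

if __name__ == "__main__":
    import boto3  # requires AWS credentials and Bedrock model access

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_request_body("How many vacation days do new hires get?"),
    )
    print(json.loads(response["body"].read())["content"][0]["text"])
```

Only the temperature change is needed here; top-p, max-tokens, and stop-sequences can stay at their defaults.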