
A social-media company uses Amazon Bedrock to draft unique post captions. The current outputs are too similar. What parameter should the team adjust?
A. Lower temperature to 0.2
B. Raise temperature to 0.9
C. Reduce max-tokens to 50
D. Set stop-sequence to "End of post"
Correct answer: B (Raise temperature to 0.9)
Explanation:
Temperature is a parameter that controls how random the model's token sampling is, and therefore how creative and varied its outputs are:
Lower temperature (e.g., 0.2): makes outputs more deterministic, focused, and consistent; this would make the captions even more similar.
Higher temperature (e.g., 0.9): increases randomness and creativity, producing more diverse and unique outputs, which is exactly what the team needs.
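As a concrete illustration, here is a minimal sketch of raising the temperature on a Bedrock text model through the boto3 Converse API. The region, model ID, and prompt are assumptions for illustration; any Bedrock text-generation model the account has access to works the same way.

```python
import boto3

# Assumes AWS credentials and Bedrock model access are already configured;
# the region and model ID below are placeholders for illustration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

prompt = "Write a short, catchy caption for a photo of a sunrise hike."

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    # Raising temperature (e.g., from 0.2 to 0.9) increases sampling randomness,
    # so repeated calls return more varied, unique captions.
    inferenceConfig={"temperature": 0.9},
)

print(response["output"]["message"]["content"][0]["text"])
```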
Why the other options are incorrect:
Option A (Lower temperature to 0.2): This would make outputs even more similar and less creative
Option C (Reduce max-tokens to 50): This limits the length of the output but doesn't affect creativity or uniqueness
Option D (Set stop-sequence to "End of post"): This controls when the model stops generating text, not the creativity level
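To make the contrast concrete, the sketch below (reusing the client and prompt from the example above, with illustrative values mirroring options C and D) shows where max tokens and stop sequences sit in the same request. Both shape when generation stops; neither changes how random or creative the sampling is.

```python
# Reuses the `bedrock` client and `prompt` defined in the sketch above.
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={
        "temperature": 0.9,                # randomness/creativity: the lever this question asks about
        "maxTokens": 50,                   # caps output length only (option C)
        "stopSequences": ["End of post"],  # halts generation at this string (option D)
    },
)
```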
Amazon Bedrock Context:
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies. When generating text with these models, adjusting the temperature inference parameter is the correct way to control how creative and unique the outputs are.