
**Answer (for fast verification):** B — Raise temperature to 0.9
## Explanation

**Temperature** controls the randomness, or creativity, of the model's output:

- **Lower temperature (e.g., 0.2)**: makes outputs more deterministic, focused, and consistent, leading to more similar responses
- **Higher temperature (e.g., 0.9)**: increases randomness and creativity, producing more diverse and unique outputs

Since the current outputs are "too similar," the team needs to increase diversity by **raising the temperature**. Option B (Raise temperature to 0.9) is correct because:

1. Higher temperature introduces more randomness into token selection
2. This creates more varied and unique post captions
3. Temperature values typically range from 0 to 1, with 0.9 being high enough to increase diversity without making outputs completely random

**Why the other options are incorrect:**

- **A**: Lowering temperature would make outputs even more similar
- **C**: Reducing max-tokens only limits response length, not diversity
- **D**: A stop-sequence controls when generation stops, not creativity
Author: Jin H
## Question

A social-media company uses Amazon Bedrock to draft unique post captions. The current outputs are too similar. Which parameter should the team adjust?

- **A.** Lower temperature to 0.2
- **B.** Raise temperature to 0.9
- **C.** Reduce max-tokens to 50
- **D.** Set stop-sequence to "End of post"
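In practice, the fix is a one-field change in the Bedrock request. Below is a minimal sketch of an `InvokeModel` request body for an Anthropic model on Bedrock (the prompt and parameter values are illustrative, and no API call is made here):

```python
import json

# Request body in the Bedrock Anthropic messages format: max_tokens caps the
# response length, temperature controls diversity. Raising temperature to 0.9
# is the change that makes the generated captions less similar.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 200,
    "temperature": 0.9,  # raised to diversify the generated captions
    "messages": [
        {"role": "user",
         "content": "Write a catchy caption for a beach sunset photo."}
    ],
})
print(body)
```

With boto3, this body would be passed to the `bedrock-runtime` client's `invoke_model` call along with the chosen model ID.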