
**Answer-first summary for fast verification**

**Answer:** Increase the temperature to around 0.9 to encourage creative variation
## Explanation

When using language models like those in Amazon Bedrock, the **temperature** parameter controls the randomness of the output:

- **Lower temperature (e.g., 0.2)**: Makes the model more deterministic and focused on the most probable tokens, which can lead to more repetitive outputs
- **Higher temperature (e.g., 0.9)**: Increases randomness and creativity by allowing the model to consider less probable tokens, resulting in more varied and original outputs

In this scenario, the marketing agency is experiencing **repetitive outputs lacking originality**, which indicates the model is being too conservative. Increasing the temperature to around 0.9 would encourage more creative variation in the generated brand-tagline ideas.

**Why the other options are incorrect:**

- **A (Lower temperature to 0.2)**: This would make outputs even more repetitive and deterministic
- **C (Reduce top-p to 0.3)**: Top-p (nucleus sampling) controls token selection based on cumulative probability. Lowering it restricts the model to only the most probable tokens, reducing variety
- **D (Add stop sequences)**: Stop sequences control when generation stops, not the creativity or variety of content

**Key takeaway**: For creative tasks like marketing taglines, higher temperature values (0.7-0.9) typically yield better results by introducing more randomness and originality.
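To make the effect concrete, here is a minimal sketch in plain Python (using hypothetical logit values, not the actual Bedrock API) of how temperature rescales a token probability distribution, and how a low top-p cutoff shrinks the candidate pool:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by the temperature before softmax.
    Lower temperature sharpens the distribution; higher flattens it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_candidates(probs, p):
    """Return indices of the smallest set of tokens (by descending
    probability) whose cumulative probability reaches p."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= p:
            break
    return kept

# Hypothetical logits for four candidate next tokens
logits = [2.0, 1.0, 0.5, 0.1]

low = softmax_with_temperature(logits, 0.2)   # near-deterministic
high = softmax_with_temperature(logits, 0.9)  # more even spread

# At temperature 0.2, nearly all probability mass lands on the top token,
# so the same tagline keeps coming back; at 0.9, lower-ranked tokens get
# a real chance of being sampled.
print(max(low) > max(high))

# A top-p of 0.3 on the high-temperature distribution still leaves only
# the single most probable token in play, which is why option C would
# *increase* determinism rather than variety.
print(len(top_p_candidates(high, 0.3)))
```

This mirrors the standard sampling math; the exact implementation inside any given Bedrock model is not exposed, so treat the numbers as illustrative only.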
Author: Ritesh Yadav
A marketing agency is using Amazon Bedrock to generate brand-tagline ideas. The outputs feel repetitive and lack originality. What adjustment should the data-science team make?
- **A.** Lower the temperature to 0.2 for more precise outputs
- **B.** Increase the temperature to around 0.9 to encourage creative variation
- **C.** Reduce the top-p to 0.3 to increase determinism
- **D.** Add stop sequences to control token length