
To achieve more consistent responses from an LLM on Amazon Bedrock for sentiment analysis, which inference parameter should be adjusted?
A. Decrease the temperature value.
B. Increase the temperature value.
C. Decrease the length of output tokens.
D. Increase the maximum generation length.
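
Temperature controls the randomness of token sampling: lowering it makes the model favor the highest-probability tokens, so outputs become more deterministic and repeatable, which is what a classification-style task like sentiment analysis calls for (option A). As a rough illustration, the sketch below lowers the temperature in a Bedrock Converse API call via boto3; the model ID, region, and prompt are placeholder assumptions, not part of the question.

```python
import boto3

# Minimal sketch: lowering temperature for more consistent sentiment labels.
# The model ID, region, and review text below are illustrative placeholders.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # any Bedrock text model would work here
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "text": (
                        "Classify the sentiment of this review as POSITIVE, "
                        "NEGATIVE, or NEUTRAL:\n"
                        "'The checkout process was quick and painless.'"
                    )
                }
            ],
        }
    ],
    inferenceConfig={
        "temperature": 0.1,  # low temperature -> less random sampling, more repeatable output
        "maxTokens": 20,     # a short label only needs a few tokens
    },
)

print(response["output"]["message"]["content"][0]["text"])
```

Note that shortening or lengthening the output (options C and D) only caps how much text the model generates; it does not change how deterministically the tokens are chosen.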