
## Answer

**Set temperature = 0.7**
## Explanation

**Temperature** is a parameter that controls the randomness of AI model outputs:

- **Temperature = 0.0**: Produces deterministic, predictable responses by always selecting the most probable next token. This eliminates randomness but can make responses repetitive and less creative.
- **Temperature = 0.7**: Provides a good balance between randomness and relevance. It allows some variability while keeping responses mostly on-topic.

**Why temperature = 0.7 is correct**:

1. The chatbot is currently producing **overly random responses with off-topic tokens**, which indicates the temperature is likely set too high (e.g., 1.0 or higher).
2. The team wants to **keep responses relevant**, and lowering the temperature helps with this.
3. The team still wants to **allow some variability**; temperature = 0.0 would eliminate all variability, making responses too predictable.
4. **Temperature = 0.7** is a commonly recommended value that provides a good balance between creativity and coherence.

**Temperature scale**:

- **0.0–0.3**: Very low randomness, highly predictable
- **0.4–0.7**: Balanced randomness (recommended for most applications)
- **0.8–1.0**: High randomness, more creative but less focused
- **>1.0**: Very high randomness, often produces nonsensical or off-topic responses

By setting temperature to 0.7, the chatbot will produce more relevant responses while maintaining enough variability to avoid sounding robotic or repetitive.
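The effect of temperature on token selection can be sketched as temperature-scaled softmax sampling. This is a minimal illustration in plain Python; the logit values are made up for the example, and real models apply this over a vocabulary of many thousands of tokens:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by the temperature, then apply softmax.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more random).
    """
    if temperature <= 0:
        # Temperature = 0 is conventionally treated as greedy argmax:
        # all probability mass goes to the single most likely token.
        probs = [0.0] * len(logits)
        probs[max(range(len(logits)), key=lambda i: logits[i])] = 1.0
        return probs
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits for four candidate tokens.
logits = [2.0, 1.0, 0.5, -1.0]

for t in (0.0, 0.7, 1.5):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {[round(p, 3) for p in probs]}")
```

Running this shows why 0.7 is the balanced choice: at T=0 the top token always wins (no variability), at T=1.5 low-probability off-topic tokens gain noticeable mass, and at T=0.7 the distribution favors relevant tokens while still leaving room for variation.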
Author: Ritesh Yadav
**Question**: A chatbot is producing overly random responses with off-topic tokens. The team wants to keep responses relevant but still allow some variability. Which action provides the best balance?
- **A.** Set temperature = 0.7
- **B.** Set temperature = 0.0