
Answer-first summary for fast verification
Answer: Low temperature and fixed random seed
**Explanation:** To achieve consistent, reproducible outputs across multiple calls with the same input, the correct configuration is **low temperature and a fixed random seed**. Here's why:

1. **Temperature** controls the randomness of predictions. A low temperature (e.g., 0.1) makes the model more deterministic, concentrating probability on the highest-likelihood tokens, while a high temperature (e.g., 0.9) increases randomness and creativity.
2. **Random seed**: a fixed seed makes the random number generator produce the same sequence of draws on every call, so sampling decisions are reproducible.
3. **Why the other options are incorrect**:
   - **A) High temperature** increases randomness, making outputs less consistent.
   - **B) Randomized seed**: a different seed on each call produces different outputs even with identical input.
   - **D) High top-p value**: top-p (nucleus sampling) sets the cumulative probability threshold for token selection; a high top-p admits more candidate tokens, increasing variability rather than consistency.

**Key concept:** For reproducible model outputs, combine a low temperature (to reduce randomness) with a fixed random seed (to make the sampling itself deterministic). This is essential for testing, debugging, and production applications where consistency is required.
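The interaction between temperature and a fixed seed can be illustrated with a small toy sketch (not a real LLM API; `sample_token` and the example logits are hypothetical): temperature scaling reshapes the token distribution, and the seeded generator makes the draw itself repeatable.

```python
import math
import random

def sample_token(logits, temperature, seed):
    """Toy illustration: sample one token index from logits using
    temperature scaling and a fixed random seed."""
    rng = random.Random(seed)  # fixed seed -> identical draws every call
    # Low temperature sharpens the distribution toward the top token;
    # high temperature flattens it, admitting more randomness.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling with the seeded generator.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1

logits = [2.0, 1.0, 0.5, 0.1]  # hypothetical next-token scores
# Same input + low temperature + fixed seed -> identical result per call.
a = sample_token(logits, temperature=0.1, seed=42)
b = sample_token(logits, temperature=0.1, seed=42)
print(a == b)  # True
```

With temperature 0.1 the softmax puts nearly all mass on the top-scoring token, and the fixed seed makes the remaining randomness repeatable, so repeated calls agree.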
Author: Jin H
Q5. A product manager wants consistent, reproducible outputs across multiple calls to the same model using the same input. Which configuration should they use?
A) High temperature
B) Randomized seed
C) Low temperature and fixed random seed
D) High top-p value