
Answer-first summary for fast verification

**Answer:** Short-term working memory
## Explanation

The **context window** in AI models refers to the amount of text, measured in tokens, that a model can process and "remember" during a single inference or generation. This is most similar to **short-term working memory** because:

- **Short-term working memory** in humans is temporary storage that holds a limited amount of information for immediate processing
- Similarly, a model's context window temporarily holds recent inputs and outputs during a conversation or task
- Both have capacity limitations and are used for immediate processing rather than long-term storage
- Information outside the context window is "forgotten" by the model, just as information fades from short-term memory

**Why the other options are incorrect:**

- **Long-term memory** refers to permanent storage, while the context window is temporary
- **Neural embedding space** is about vector representations, not memory capacity
- **Batch processing unit** relates to computational processing, not memory functionality

The context window determines how much contextual information a model can maintain for coherent responses, making it analogous to short-term working memory in cognitive systems (see the sketch after this explanation).
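As a loose illustration of the capacity limit described above, the sketch below models a context window as a fixed-size buffer that silently evicts the oldest tokens when new ones arrive. It is a minimal sketch, not any real model's tokenizer or API: the whitespace tokenization, the `CONTEXT_WINDOW_SIZE` value, and the `add_turn` helper are all hypothetical choices made for this example.

```python
from collections import deque

# Hypothetical capacity: real models count tokens from a learned tokenizer
# and have far larger windows; 8 is chosen only to make eviction visible.
CONTEXT_WINDOW_SIZE = 8

# A deque with maxlen drops the oldest items automatically, mimicking how
# text that falls outside the context window is no longer "seen" by the model.
window = deque(maxlen=CONTEXT_WINDOW_SIZE)

def add_turn(text: str) -> None:
    """Append new (toy) tokens; the oldest tokens are evicted if the buffer is full."""
    for token in text.split():
        window.append(token)

add_turn("The quick brown fox jumps over the lazy dog")
print(list(window))
# ['quick', 'brown', 'fox', 'jumps', 'over', 'the', 'lazy', 'dog']
# "The" has already been evicted: it is outside the window.

add_turn("and keeps running")
print(list(window))
# ['jumps', 'over', 'the', 'lazy', 'dog', 'and', 'keeps', 'running']
# Earlier tokens keep falling out as the conversation grows.
```

The point of the sketch is only the eviction behavior: whatever leaves the buffer can no longer influence the output, which is the sense in which the context window resembles short-term working memory rather than long-term storage.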
Author: Ritesh Yadav