
Which component of the Transformer architecture enables the model to capture relationships between all words in a sentence simultaneously?
A
Recurrent loops
B
Self-Attention mechanism
Explanation:
The correct answer is B. Self-Attention mechanism.
The self-attention mechanism is the core component of the Transformer architecture that allows the model to weigh the importance of each word in a sentence relative to every other word. It processes all words in parallel and captures contextual relationships simultaneously, unlike sequential models.
Recurrent loops (Option A) are used in RNNs and LSTMs, which process tokens one step at a time and therefore cannot capture all pairwise relationships simultaneously.
Self-attention also lets Transformers handle long-range dependencies efficiently and has become fundamental to modern NLP models such as BERT and GPT.
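To make the idea concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. The token count, embedding size, and random weight matrices are illustrative assumptions, not part of the quiz; the point is that a single matrix product scores every word against every other word at once.

```python
# Minimal sketch of scaled dot-product self-attention (illustrative only).
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Compute self-attention for all tokens in X at once."""
    Q = X @ W_q            # queries, shape (seq_len, d_k)
    K = X @ W_k            # keys,    shape (seq_len, d_k)
    V = X @ W_v            # values,  shape (seq_len, d_v)
    d_k = Q.shape[-1]
    # Every token is scored against every other token in one matrix product,
    # so all pairwise relationships are captured simultaneously.
    scores = Q @ K.T / np.sqrt(d_k)                      # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over keys
    return weights @ V                                   # context vectors

# Example: 4 tokens with 8-dimensional embeddings (random, for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8)
```

Because the attention weights form a full seq_len x seq_len matrix, each output vector mixes information from the entire sentence in one parallel step, which is exactly what recurrent loops cannot do.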