
Which component in Transformer architecture enables the model to capture relationships between all words in a sentence simultaneously?
A. Recurrent loops
B. Self-Attention mechanism
Explanation:
The correct answer is B: the Self-Attention mechanism.
The Self-Attention mechanism lets every word in a sentence attend to every other word in a single parallel step, rather than processing tokens one at a time as recurrent loops do. This enables Transformers to handle long-range dependencies efficiently, and it has become fundamental in modern NLP models such as BERT and GPT.
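To make the "all words simultaneously" point concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. The weight matrices and toy inputs are illustrative assumptions, not part of the quiz; the key observation is that one matrix product computes pairwise scores between every pair of words at once.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of word vectors.

    X has shape (seq_len, d_model), one row per word. A single matrix
    product Q @ K.T yields scores between every pair of words at once,
    which is how the Transformer captures all relationships simultaneously.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv              # project to queries/keys/values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_len, seq_len) pairwise scores
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # each output row mixes all words' values

# Toy example: 3 "words", model dimension 4 (random weights, illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # each of the 3 word outputs depends on all 3 inputs
```

Because there is no recurrence, the whole computation is a handful of matrix multiplications, which is also why it parallelizes so well on modern hardware.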