Q1 – Which component in Transformer architecture enables the model to capture relationships between all words in a sentence simultaneously?
A. Recurrent loops
B. Self-Attention mechanism
Explanation:
The correct answer is B. Self-Attention mechanism.
The Transformer architecture, introduced in the paper "Attention Is All You Need" (Vaswani et al., 2017), revolutionized natural language processing by replacing recurrence with self-attention mechanisms, enabling more efficient parallel processing and better capture of contextual relationships.
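The "all words simultaneously" part can be made concrete: in scaled dot-product self-attention, a single matrix product lets every token score its relationship to every other token in parallel, rather than stepping through the sequence as a recurrent loop would. Below is a minimal NumPy sketch of single-head self-attention; the weight matrices and dimensions are illustrative assumptions, not taken from any specific model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the inputs into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # One matrix product scores every token pair at once:
    # scores[i, j] = how much token i attends to token j
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    # Each output is a weighted mix of all value vectors
    return weights @ V

# Toy example: 4 tokens, model dimension 8 (hypothetical sizes)
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.standard_normal((seq_len, d_model))
Wq = rng.standard_normal((d_model, d_model))
Wk = rng.standard_normal((d_model, d_model))
Wv = rng.standard_normal((d_model, d_model))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-mixed vector per token
```

Because the score matrix is computed for all token pairs in one step, there is no sequential dependency between positions, which is what allows parallel training on modern hardware.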