
Q3 – Why do AI models count tokens instead of characters?
A. Tokens provide language-independent representation
B. Tokens are always shorter than characters
C. Characters cannot be encoded numerically
D. Tokens reduce context-window cost to zero
Explanation:
AI models count tokens instead of characters because:
Language Independence: Tokens provide a language-independent representation that works across different languages and scripts. Characters can vary significantly between languages, making them less consistent for processing.
Semantic Units: Tokens represent meaningful semantic units (words, subwords, or characters) rather than just individual characters. This allows models to better understand the structure and meaning of text.
Efficiency: Tokenization breaks text into manageable units of roughly uniform information content, which neural networks can process more efficiently than raw character streams.
Vocabulary Management: Tokenization lets models handle large vocabularies by breaking rare words into subword units. This keeps the vocabulary at a fixed, manageable size while preserving the ability to represent any word, including words never seen during training.
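The subword idea above can be sketched with a toy greedy longest-match tokenizer. This is a simplified illustration, not a real production algorithm like BPE or WordPiece, and the vocabulary below is entirely hypothetical:

```python
# Hypothetical subword vocabulary; real tokenizers learn this from a corpus.
VOCAB = {"token", "ization", "iz", "ation", "un", "believ", "able"}

def tokenize(word: str) -> list[str]:
    """Greedily match the longest vocabulary entry at each position,
    falling back to single characters so any word is representable."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try longest match first
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character becomes its own token
            i += 1
    return tokens

print(tokenize("tokenization"))  # ['token', 'ization']
print(tokenize("unbelievable"))  # ['un', 'believ', 'able']
```

Note how a word absent from the vocabulary ("unbelievable") is still represented by composing known subword units, which is why a fixed, modest vocabulary can cover open-ended text.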
Why the other options are incorrect:
B is incorrect because tokens are typically longer than a single character, not shorter: one token often covers several characters or an entire word.
C is incorrect because characters can be encoded numerically (for example, as Unicode code points); that is not the reason tokens are used.
D is incorrect because tokens still consume context-window space. Tokenization shortens sequences compared with character-level input, but it does not reduce the cost to zero.
The correct answer is A: tokenization provides a language-independent way to represent text that works consistently across different languages and scripts.