
A company is developing a chatbot using Amazon Bedrock. Before sending user input to a foundation model, the text must be broken down into smaller pieces that the model understands. What is this process called?
A
Stemming
B
Tokenization
C
Vectorization
D
Stopword removal
Explanation:
Tokenization is the process of breaking text into smaller units called tokens (words, subwords, or characters) that a language model can process. For foundation models such as those available through Amazon Bedrock, tokenization is the essential first step: user input is converted into tokens before the model's architecture can work with it. The other options describe different techniques: stemming reduces words to their root form, stopword removal filters out common words, and vectorization converts tokens into numerical representations, a step that happens after tokenization.
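To make the idea concrete, here is a minimal sketch of word-level tokenization in Python. This is an illustrative toy, not the actual tokenizer Amazon Bedrock models use; production foundation models rely on subword schemes such as byte-pair encoding (BPE).

```python
import re

def simple_tokenize(text):
    # Naive word-level tokenizer: captures runs of word characters
    # and individual punctuation marks as separate tokens.
    # Real foundation models use subword tokenizers (e.g. BPE),
    # which can split rare words into smaller pieces.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = simple_tokenize("Chatbots answer user questions.")
print(tokens)  # ['Chatbots', 'answer', 'user', 'questions', '.']
```

The model never sees the raw string; each token is subsequently mapped to a numeric ID (and then to a vector), which is why tokenization must come first.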