Q2. An e-commerce platform wants to build a product recommendation system that finds similar items based on text descriptions. They plan to convert text into numeric vectors first. Which technique should they use?
A. Tokenization
B. Bag-of-Words
C. Text Embeddings
D. Stemming
Explanation:
Text Embeddings is the correct answer because:
Purpose: Text embeddings convert text into dense numeric vectors (embeddings) that capture semantic meaning and relationships between words and phrases.
Similarity Search: Embeddings create vector representations where similar items have vectors that are close together in the vector space, making them ideal for finding similar products based on text descriptions.
Comparison with other options:
- Tokenization (A) splits text into units such as words or subwords; it is a preprocessing step and does not by itself produce numeric vectors that capture meaning.
- Bag-of-Words (B) does produce numeric vectors, but they are sparse word counts that ignore word order and semantics, so descriptions with no overlapping words look completely unrelated.
- Stemming (D) reduces words to a root form (e.g., "running" to "run") to normalize vocabulary; it does not convert text into vectors.
Real-world application: For e-commerce product recommendations, text embeddings (like Word2Vec, GloVe, or BERT embeddings) can capture that "laptop" and "notebook computer" are similar, even if they don't share exact words.
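The similarity search described above can be sketched with a small example. The three-dimensional vectors below are hypothetical toy embeddings chosen for illustration; in a real system they would be high-dimensional vectors produced by a model such as Word2Vec or BERT.

```python
import numpy as np

# Hypothetical pre-computed embeddings for three product descriptions.
# Real embeddings would come from a trained model (Word2Vec, BERT, etc.).
embeddings = {
    "laptop": np.array([0.9, 0.8, 0.1]),
    "notebook computer": np.array([0.85, 0.75, 0.15]),
    "coffee mug": np.array([0.1, 0.05, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine similarity: close to 1.0 for vectors pointing the same way."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Recommend products most similar to "laptop" by ranking the other
# items by their cosine similarity to its embedding.
query = embeddings["laptop"]
ranked = sorted(
    (name for name in embeddings if name != "laptop"),
    key=lambda name: cosine_similarity(query, embeddings[name]),
    reverse=True,
)
print(ranked[0])  # prints: notebook computer
```

Even though "laptop" and "notebook computer" share no words, their embedding vectors point in nearly the same direction, so cosine similarity ranks them as closely related, which is exactly what Bag-of-Words cannot do.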
AWS context: AWS offers services like Amazon SageMaker with built-in algorithms for text embeddings, and Amazon Kendra for semantic search capabilities.