
Answer-first summary for fast verification
Answer: D. Categorical Cross Entropy. It is ideal for multi-class classification problems where each input belongs to exactly one class, measuring the difference between one-hot encoded true labels and the predicted probability distribution.
**Correct Option:**

D. Categorical Cross Entropy: This is the correct choice because it is designed for multi-class classification problems where each input belongs to exactly one class. It measures the difference between the one-hot encoded true labels and the predicted probability distribution, making it suitable for this three-class scenario. Note that the loss choice alone does not address the class imbalance; that is typically handled alongside it, for example with per-class weights during training.

**Incorrect Options:**

A. Categorical hinge: Less common for image classification tasks; used mainly where a margin-based (hinge) loss is specifically required.

B. Binary Cross Entropy: Intended for binary classification tasks, not for problems with more than two classes.

C. Sparse Categorical Cross Entropy: Also suitable for multi-class classification, but it expects integer labels rather than one-hot encoded vectors, making it less appropriate here.
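As a concrete illustration, here is a minimal NumPy sketch of categorical cross entropy over one-hot labels, with an optional per-class weight to up-weight the minority classes. The function name, the example probabilities, and the weight values are illustrative assumptions, not part of the question; in Keras this roughly corresponds to `keras.losses.CategoricalCrossentropy`, with imbalance commonly handled via the `class_weight` argument of `Model.fit`.

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, class_weight=None, eps=1e-12):
    """Mean categorical cross entropy over a batch.

    y_true: one-hot labels, shape (n, k)
    y_pred: predicted class probabilities (softmax output), shape (n, k)
    class_weight: optional length-k per-class weights to counter imbalance
    """
    y_pred = np.clip(y_pred, eps, 1.0)          # avoid log(0)
    per_sample = -np.sum(y_true * np.log(y_pred), axis=1)
    if class_weight is not None:
        # weight each sample by the weight of its true class
        per_sample = per_sample * np.asarray(class_weight)[y_true.argmax(axis=1)]
    return per_sample.mean()

# Three classes: driver's license, passport, credit card
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]], dtype=float)
y_pred = np.array([[0.8, 0.1, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.1, 0.2, 0.7]], dtype=float)

loss = categorical_cross_entropy(y_true, y_pred)
# Up-weight the two minority classes (1,000 samples each vs. 10,000 licenses)
weighted = categorical_cross_entropy(y_true, y_pred,
                                     class_weight=[1.0, 10.0, 10.0])
```

With the weights above, mistakes on passports and credit cards contribute ten times more to the loss, which is one common way to keep the minority classes from being ignored during training.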
Author: LeetQuiz Editorial Team
Your team is developing a machine learning model to classify images into three distinct categories: driver's licenses, passports, and credit cards. The dataset provided consists of 10,000 images of driver's licenses, 1,000 passports, and 1,000 credit cards, each accurately labeled. Given the imbalance in the dataset and the requirement for high accuracy in classification, which loss function would be most suitable for training this model? Consider the need for efficient training and the model's performance on the minority classes. Choose the best option from the following:
A
Categorical hinge: Suitable for problems where hinge loss is specifically required, but less common for image classification tasks.
B
Binary Cross Entropy: Designed for binary classification tasks, making it unsuitable for this multi-class problem.
C
Sparse Categorical Cross Entropy: Appropriate for multi-class classification when labels are integers, not one-hot encoded vectors.
D
Categorical Cross Entropy: Ideal for multi-class classification problems where each input belongs to exactly one class, measuring the difference between true labels (one-hot encoded) and predictions effectively.