
Answer-first summary for fast verification
Answer: Sparse categorical cross-entropy
The correct answer is D: Sparse categorical cross-entropy. This loss function is appropriate because the classes (driver's license, passport, credit card) are mutually exclusive and each sample belongs to exactly one class. Sparse categorical cross-entropy is used when the labels are provided as integers rather than one-hot encoded vectors, which saves memory and avoids the encoding step. For instance, the labels here could be encoded as 0 for driver's license, 1 for passport, and 2 for credit card. This makes it a better fit than categorical cross-entropy, which computes the same loss but requires one-hot encoded labels. Binary cross-entropy is intended for binary or multi-label problems, not mutually exclusive multi-class classification, and categorical hinge is a margin-based loss that also expects one-hot targets.
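To see why the two cross-entropy variants differ only in label format, here is a minimal NumPy sketch (the probability values are illustrative, not from a trained model): sparse categorical cross-entropy indexes the predicted probabilities with the integer label, while categorical cross-entropy multiplies by a one-hot vector, and both produce the same loss.

```python
import numpy as np

# Hypothetical predicted probabilities for 3 samples over the 3 classes
# ['drivers_license', 'passport', 'credit_card'] (illustrative values).
probs = np.array([
    [0.7, 0.2, 0.1],   # predicted mostly drivers_license
    [0.1, 0.8, 0.1],   # predicted mostly passport
    [0.2, 0.3, 0.5],   # predicted mostly credit_card
])

# Integer labels, as sparse categorical cross-entropy expects.
labels = np.array([0, 1, 2])

# Sparse categorical cross-entropy: pick out each sample's true-class
# probability directly via the integer label, then take -log.
sparse_ce = -np.log(probs[np.arange(len(labels)), labels])

# Categorical cross-entropy: same math, but the labels must first be
# expanded into one-hot vectors.
one_hot = np.eye(3)[labels]
categorical_ce = -np.sum(one_hot * np.log(probs), axis=1)

print(np.allclose(sparse_ce, categorical_ce))  # True: identical losses
```

Because the losses are identical, the only practical difference is label storage: integer labels take one value per sample instead of one value per class, which matters as the number of classes grows.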
Author: LeetQuiz Editorial Team
Your team is tasked with developing an image classification model to identify whether an image contains a driver's license, passport, or credit card. The data engineering team has already constructed the data pipeline and prepared a dataset with 10,000 images of driver's licenses, 1,000 images of passports, and 1,000 images of credit cards. You are required to use this dataset to train a machine learning model, with labels mapped as ['drivers_license', 'passport', 'credit_card']. Considering these classes are mutually exclusive, which appropriate loss function should you use?
A. Categorical hinge
B. Binary cross-entropy
C. Categorical cross-entropy
D. Sparse categorical cross-entropy