
You are a junior Data Scientist working on a project that requires the development of a multi-class classification model using the Keras Sequential API. The model will classify images into one of 10 possible categories. Given the project's requirement for high accuracy and the need for the model to output probabilities for each class that sum to 1, which activation function should you use for the output layer? (Choose one correct option)
A
ReLU, as it helps in mitigating the vanishing gradient problem and is commonly used in hidden layers of neural networks.
B
Tanh, because it outputs values between -1 and 1, making it suitable for models requiring normalized outputs.
C
Softmax, as it assigns probabilities to each class such that their sum equals 1, making it ideal for multi-class classification problems.
D
Sigmoid, since it outputs a value between 0 and 1, suitable for binary classification tasks.
E
None of the above, as a custom activation function is required for this specific use case.
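Why option C is correct: softmax exponentiates each raw score (logit) and divides by the sum of the exponentials, so every output is positive and the outputs sum to exactly 1 — a valid probability distribution over the 10 classes. A minimal pure-Python sketch of the computation (the example logits are made up for illustration; in Keras the output layer would simply be `Dense(10, activation="softmax")`):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for 10 classes, as a final Dense layer might emit.
logits = [2.0, 1.0, 0.1, -1.5, 0.0, 3.2, 0.5, -0.3, 1.7, 0.9]
probs = softmax(logits)

print(round(sum(probs), 6))  # the 10 probabilities sum to 1
```

Note that ReLU (option A) is unbounded, tanh (option B) can be negative, and sigmoid (option D) constrains each output to (0, 1) independently without making the outputs sum to 1 across classes, which is why none of them yields a class probability distribution.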