
Answer-first summary for fast verification
Answer: To introduce non-linearity in the model
The primary role of an activation function in neural networks, including the multilayer perceptron available in Spark ML, is to introduce non-linearity into the model. A neural network is composed of layers of interconnected nodes (neurons), and an activation function is applied to the output of each node. Without activation functions, the entire network would collapse into a single linear function: the composition of linear layers is itself linear, so adding depth would not increase the model's capacity to capture intricate patterns in the data. Non-linear activation functions such as ReLU (Rectified Linear Unit) or the sigmoid transform each node's output, enabling the network to learn and represent complex, non-linear relationships in the data.
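The collapse of stacked linear layers can be seen directly with a small numpy sketch. The weight matrices and input below are hypothetical toy values chosen only for illustration; the point is that without an activation the two-layer output equals a single matrix product, while inserting ReLU between the layers breaks that equivalence.

```python
import numpy as np

# Toy weights for two "layers" (hypothetical values for illustration).
W1 = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([1.0, 2.0])

# Without an activation, stacking layers collapses into one linear map:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x for every input x.
stacked = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(stacked, collapsed))  # True

# Inserting ReLU between the layers breaks that equivalence,
# so the two-layer network is no longer a single linear function.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)
print(stacked)    # [0.]
print(nonlinear)  # [1.]
```

Here `W1 @ x` is `[-1, 1]`, so the purely linear stack cancels to zero, while ReLU zeroes the negative component and yields a different output, which is exactly the extra representational power a non-linearity provides.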
Author: LeetQuiz Editorial Team