
Answer-first summary for fast verification
Answer: $\frac{1}{1 + e^{y_i}}$
## Explanation

In logistic regression, the probability that the dependent variable equals 1 is given by the logistic function applied to the linear predictor $y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \beta_3 x_{3i}$:

$$P(\text{class } 1) = \frac{1}{1 + e^{-(\alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \beta_3 x_{3i})}} = \frac{1}{1 + e^{-y_i}}$$

Since this is a binary classification problem, the probability of class 0 is the complement:

$$P(\text{class } 0) = 1 - \frac{1}{1 + e^{-y_i}} = \frac{e^{-y_i}}{1 + e^{-y_i}} = \frac{1}{1 + e^{y_i}}$$

(the last step multiplies numerator and denominator by $e^{y_i}$). Therefore, the correct answer is **$\frac{1}{1 + e^{y_i}}$**.

**Key Points:**
- Logistic regression transforms a linear combination of the features into a probability using the logistic function
- The probability of class 0 is the complement of the probability of class 1
- The transformation guarantees the output is bounded between 0 and 1
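The complement identity above can be checked numerically. Below is a minimal sketch; the coefficient and feature values are hypothetical, chosen only to illustrate the calculation, not taken from the question.

```python
import math

def sigmoid(z):
    """Logistic function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted coefficients and a hypothetical observation
alpha, b1, b2, b3 = -0.5, 0.8, -1.2, 0.3
x1, x2, x3 = 1.0, 0.5, 2.0

y = alpha + b1 * x1 + b2 * x2 + b3 * x3  # linear predictor y_i
p1 = sigmoid(y)       # P(class 1) = 1 / (1 + e^{-y})
p0 = 1.0 - p1         # P(class 0)

# The complement equals 1 / (1 + e^{y}), matching the derivation
assert abs(p0 - 1.0 / (1.0 + math.exp(y))) < 1e-12
print(f"y = {y:.3f}, P(class 1) = {p1:.4f}, P(class 0) = {p0:.4f}")
```

Running the check confirms that $1 - \sigma(y) = \sigma(-y) = \frac{1}{1 + e^{y}}$ for any real $y$.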
Author: LeetQuiz
Logistic regression is well recognized for its capability in dealing with a binary dependent variable. It uses a cumulative logistic function transformation to bound the output between 0 and 1; in other words, the logistic model effectively turns real values into probabilities. Suppose we now have 3 features, and the functional form is estimated as $y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \beta_3 x_{3i}$. What is the probability that the dependent variable equals 0?