
Financial Risk Manager Part 1
Consider the following confusion matrices. Model A
| | Predicted: No default | Predicted: Default |
|---|---|---|
| Actual: No default | TN = 100 | FP = 50 |
| Actual: Default | FN = 50 | TP = 900 |
Explanation:
Based on the confusion matrix provided:
- TN (True Negative) = 100: Correctly predicted no default when there was no default
- FP (False Positive) = 50: Incorrectly predicted default when there was no default
- FN (False Negative) = 50: Incorrectly predicted no default when there was default
- TP (True Positive) = 900: Correctly predicted default when there was default
To determine which value represents the correct predictions of defaults, we look at TP (True Positive) = 900, which corresponds to option D.
Key Metrics:
- Accuracy = (TP + TN) / Total = (900 + 100) / (100 + 50 + 50 + 900) = 1000 / 1100 ≈ 90.9%
- Precision = TP / (TP + FP) = 900 / (900 + 50) = 900 / 950 ≈ 94.7%
- Recall/Sensitivity = TP / (TP + FN) = 900 / (900 + 50) = 900 / 950 ≈ 94.7%
- Specificity = TN / (TN + FP) = 100 / (100 + 50) = 100 / 150 ≈ 66.7%
The model performs well at detecting defaults (high recall and precision) but is weaker at correctly identifying non-defaults (lower specificity).
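The metric calculations above can be checked with a short script; it simply plugs the four cell counts from the confusion matrix into the standard formulas:

```python
# Confusion-matrix cell counts from the question (Model A)
TN, FP, FN, TP = 100, 50, 50, 900

total = TN + FP + FN + TP

# Standard classification metrics
accuracy = (TP + TN) / total       # 1000 / 1100
precision = TP / (TP + FP)         # 900 / 950
recall = TP / (TP + FN)            # 900 / 950  (sensitivity)
specificity = TN / (TN + FP)       # 100 / 150

print(f"Accuracy:    {accuracy:.1%}")     # 90.9%
print(f"Precision:   {precision:.1%}")    # 94.7%
print(f"Recall:      {recall:.1%}")       # 94.7%
print(f"Specificity: {specificity:.1%}")  # 66.7%
```

Note that recall and precision happen to be equal here only because FP = FN = 50; in general they differ.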