
Financial Risk Manager Part 1
Joel Matip, FRM, is running a regression model to forecast in-sample data. He's worried about data mining and over-fitting the data. The criterion that provides the highest penalty factor based on degrees of freedom is the:
Explanation
The Schwarz Information Criterion (SIC), also known as the Bayesian Information Criterion (BIC), provides the highest penalty factor based on degrees of freedom among the listed criteria.
Key Points:
- SIC/BIC has a penalty term of k·ln(n), where k is the number of parameters and n is the sample size
- Akaike Information Criterion (AIC) has a penalty term of 2k
- Mean Squared Error (MSE) includes no penalty for model complexity; the Unbiased Mean Squared Error (s²) applies only a mild degrees-of-freedom correction (dividing by n − k rather than n), the smallest penalty of the four
Why SIC has the highest penalty:
- Since e² ≈ 7.39, ln(n) > 2 for any sample size n ≥ 8, making SIC's penalty per parameter larger than AIC's in virtually all practical samples
- SIC is more conservative and less prone to overfitting
- SIC tends to select simpler models compared to AIC
- This makes SIC particularly useful when concerned about data mining and overfitting
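The crossover point above can be checked directly: a quick sketch (sample sizes chosen arbitrarily for illustration) comparing AIC's per-parameter penalty of 2 with SIC/BIC's ln(n):

```python
import math

# AIC adds 2 per extra parameter; SIC/BIC adds ln(n).
# The crossover is at n = e^2 ~ 7.39, so for integer n >= 8
# the SIC penalty per parameter exceeds the AIC penalty.
for n in (7, 8, 50, 1000):
    winner = "SIC > AIC" if math.log(n) > 2 else "AIC >= SIC"
    print(f"n={n:5d}  ln(n)={math.log(n):.3f}  {winner}")
```

For n = 7, ln(7) ≈ 1.946 < 2, so the AIC penalty is still (barely) larger; from n = 8 onward SIC penalizes each parameter more heavily, and the gap widens with sample size.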
Mathematical Comparison:
- AIC: -2ln(L) + 2k
- SIC/BIC: -2ln(L) + k·ln(n)
- Since ln(n) grows with sample size, SIC imposes a stronger penalty for additional parameters
Therefore, when worried about overfitting, SIC is the preferred criterion due to its higher penalty for model complexity.