
Financial Risk Manager Part 1
An analyst runs a simulation to estimate the future value, over a 40-year period, of a $10,000 investment made today. He models the investment using random monthly returns drawn from a normal distribution. How does the analyst's setup create discretization error bias?
Explanation:
Discretization error bias arises when a continuous process is approximated by one that evolves in discrete steps. Here, the analyst assumes that returns are generated at monthly intervals, a discrete time grid, whereas in reality asset returns accrue continuously. Simulating the continuous process on a monthly grid introduces a systematic approximation error, known as discretization error bias, which can distort the simulated distribution of outcomes and lead to misestimation of the investment's future value. A minimal simulation sketch below illustrates the effect.
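The Python sketch below is illustrative, not the analyst's actual model: it assumes the investment follows geometric Brownian motion with a 7% annual drift and 15% annual volatility (values chosen purely for demonstration), and compares a monthly-stepped Euler discretization against the exact continuous-time solution driven by the same random shocks, so any gap between the two is pure discretization error.

```python
import numpy as np

# A minimal sketch of discretization error, under assumed parameters.
rng = np.random.default_rng(42)

principal = 10_000.0    # initial investment from the question
mu, sigma = 0.07, 0.15  # assumed annualized drift and volatility
years, dt = 40, 1 / 12  # monthly time step: the source of the error
n_steps = years * 12
n_paths = 100_000

euler = np.full(n_paths, principal)  # monthly-step (discretized) paths
W_T = np.zeros(n_paths)              # accumulated Brownian motion

for _ in range(n_steps):
    dW = rng.standard_normal(n_paths) * np.sqrt(dt)
    # Euler step: S <- S * (1 + mu*dt + sigma*dW)
    euler *= 1 + mu * dt + sigma * dW
    W_T += dW

# Exact continuous-time GBM solution driven by the same shocks:
# S_T = S_0 * exp((mu - sigma^2 / 2) * T + sigma * W_T)
exact = principal * np.exp((mu - 0.5 * sigma**2) * years + sigma * W_T)

print(f"Monthly-step mean terminal value:    {euler.mean():,.0f}")
print(f"Continuous-time mean terminal value: {exact.mean():,.0f}")
print(f"Relative gap in means: {euler.mean() / exact.mean() - 1:+.2%}")
```

Because both estimators share the same random shocks, the printed gap isolates the discretization error; shrinking the time step (for example, to daily intervals) shrinks the gap, which is the standard diagnostic for this bias.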
Why other options are incorrect:
- Choice A is incorrect: Using normally distributed returns does not, by itself, cause discretization error bias. Assuming normal returns is common in financial analysis and is motivated by the Central Limit Theorem, which states that the sum of a large number of independent, identically distributed variables is approximately normally distributed (see the sketch after this list).
- Choice B is incorrect: The length of the simulation period (40 years in this case) does not contribute to discretization error bias. Discretization error arises from approximating a continuous process with discrete steps, not from the overall length of time being simulated.
- Choice C is incorrect: Assuming that returns are random does not cause discretization error bias either. Randomness is an inherent characteristic of financial markets, and many financial models, including Monte Carlo simulations like this one, explicitly incorporate it.
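To illustrate the Central Limit Theorem point made for Choice A, here is a small sketch under assumed inputs (uniform daily returns over 252 trading days, with values chosen purely for illustration): even though each individual draw is clearly non-normal, the sum of many iid draws is approximately normal.

```python
import numpy as np

# CLT illustration with assumed inputs: daily returns are uniform
# (clearly non-normal), yet their 252-day sums look normal.
rng = np.random.default_rng(0)

daily = rng.uniform(-0.01, 0.01, size=(10_000, 252))  # 10,000 "years"
annual = daily.sum(axis=1)

# Standardize and check the moments a normal distribution would have.
z = (annual - annual.mean()) / annual.std()
print("sample skewness:       ", round(float(np.mean(z**3)), 3))      # ~0
print("sample excess kurtosis:", round(float(np.mean(z**4)) - 3, 3))  # ~0
```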