A risk analyst is evaluating a company's assumptions for pricing foreign-currency options. They observe an unusual pattern in implied volatility: it is relatively low for at-the-money options but increases significantly as options move deeper in-the-money or out-of-the-money. Given this volatility smile, how does the probability distribution of the future exchange rate implied by market option prices compare to the lognormal distribution with the same mean and standard deviation that the Black-Scholes-Merton model assumes?
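The link between the smile and the implied distribution can be illustrated numerically via the Breeden-Litzenberger result, which recovers the risk-neutral density as the discounted second derivative of call prices with respect to strike. The sketch below is illustrative only: the spot, rates, maturity, and the quadratic smile function are hypothetical values chosen to reproduce the pattern described (low at-the-money volatility rising away from the money), not data from the scenario.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, r, q, sigma, T):
    """Black-Scholes-Merton (Garman-Kohlhagen) price of a European FX call.
    r = domestic rate, q = foreign rate (acts like a dividend yield)."""
    d1 = (np.log(S / K) + (r - q + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * np.exp(-q * T) * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical market parameters for illustration
S0, r, q, T = 1.00, 0.02, 0.01, 0.5
atm_vol = 0.10

def smile_vol(K):
    # Hypothetical smile: volatility rises as strikes move away from the money
    return atm_vol + 0.8 * np.log(K / S0) ** 2

K = np.linspace(0.70, 1.30, 601)
dK = K[1] - K[0]

def implied_density(price_fn):
    # Breeden-Litzenberger: risk-neutral density = e^{rT} * d^2C/dK^2
    C = price_fn(K)
    return np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)

# Density implied by the smile vs. the flat-vol (pure lognormal) density
dens_smile = implied_density(lambda K: bs_call(S0, K, r, q, smile_vol(K), T))
dens_flat = implied_density(lambda K: bs_call(S0, K, r, q, atm_vol, T))

# Compare probability mass far from the money: the smile puts more weight
# in both tails than the lognormal with the same at-the-money volatility
left, right = K < 0.85, K > 1.15
print("left tail :", dens_smile[left].sum() * dK, "vs", dens_flat[left].sum() * dK)
print("right tail:", dens_smile[right].sum() * dK, "vs", dens_flat[right].sum() * dK)
```

The tail masses under the smile exceed the lognormal tail masses on both sides, which is the expected answer: a smile of this shape means the implied distribution has heavier tails than the lognormal distribution with the same mean and standard deviation, so the BSM model underprices deep out-of-the-money and deep in-the-money options relative to the market.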