 # Summary Part 4: Valuation and Risk Models

##### Course
- Part 4: Valuation and Risk Models
- N/A
- 2016 - 2017
- Tilburg University (Tilburg University, Tilburg)
- FRM - Level 1
- 182 Flashcards & Notes

## Reading 1 - Quantifying Volatility in VaR Models

• What about the stochastic behavior of returns?
Measuring VaR involves identifying the tail of the distribution of asset returns, usually via a parametric approach (distributional assumptions) or else a non-parametric approach. In reality asset returns are not normal and deviate from the Gaussian distribution in three ways:
- Fat-tailed: more probability weight in the tails than under normality
- Skewed: declines are more severe than increases of the same size
- Unstable: varying market conditions lead to stochastic (time-varying) parameters
• What are the effects of volatility changes?
Volatility may follow a regime-switching model: it is either high or low, rarely in between, and the regime is usually related to the state of the economy. The typical picture is that estimated volatility trails (lags) true volatility, producing both estimation error and estimation lag.
• Can conditional normality be salvaged?
Can we assume returns are conditionally normally distributed? Generally asset returns are non-normal both unconditionally and conditionally, even with sophisticated estimation models. Large moves usually occur "out of the blue".

A joint model of conditional normality and time-varying volatility still fails to account for fat tails. Asymmetries and sharp movements remain, so stress testing and scenario analysis are needed in addition to VaR-based systems.
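The fat tails described above can be illustrated with excess kurtosis. Below is a minimal sketch (all figures illustrative): a regime-switching mixture of a calm and a turbulent Gaussian regime produces clearly positive excess kurtosis, while a single Gaussian does not.

```python
# Sketch: fat tails via excess kurtosis. The two-regime mixture is an
# illustrative stand-in for real asset returns, not actual market data.
import random
import statistics

random.seed(42)

def excess_kurtosis(xs):
    """Sample excess kurtosis: 4th central moment / variance^2, minus 3."""
    m = statistics.fmean(xs)
    var = statistics.fmean([(x - m) ** 2 for x in xs])
    m4 = statistics.fmean([(x - m) ** 4 for x in xs])
    return m4 / var ** 2 - 3.0

normal_rets = [random.gauss(0.0, 0.01) for _ in range(100_000)]
# Crude regime-switching series: calm (sigma = 1%) 90% of the time,
# turbulent (sigma = 4%) 10% of the time.
fat_rets = [random.gauss(0.0, 0.01 if random.random() < 0.9 else 0.04)
            for _ in range(100_000)]

print(excess_kurtosis(normal_rets))  # near 0 for a Gaussian
print(excess_kurtosis(fat_rets))     # clearly positive: fat tails
```

Even though each regime is conditionally normal, the unconditional mixture is fat-tailed, which is exactly why conditional normality alone cannot rescue the Gaussian assumption.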
• What about VaR estimation approaches?
- Historical-based approaches: parametric (indirect, via distributional assumptions), non-parametric (directly from the empirical distribution), or a hybrid combination
- Implied-volatility-based approach: use the Black-Scholes option pricing model, backing volatility out of the price of at-the-money option(s)
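The implied-volatility approach can be sketched as inverting Black-Scholes numerically. Since the Black-Scholes call price is monotone increasing in volatility, a simple bisection recovers the implied volatility; all input figures below are illustrative.

```python
# Sketch: implied volatility of an at-the-money call via bisection.
# Inputs (spot, strike, maturity, rate, quoted price) are illustrative.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisection works because the BS price is increasing in sigma."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# ATM call: S = K = 100, 3 months to expiry, 1% rate, quoted at 4.00
iv = implied_vol(4.00, S=100, K=100, T=0.25, r=0.01)
print(round(iv, 4))  # roughly 0.20, i.e. about 20% implied volatility
```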
Volatility is not only stochastic/time-varying but also sticky/predictable: the magnitude of recent changes is more informative than their sign. Arguments on the choice of estimation window:
- Shorter windows are more volatile; the longer the period, the smoother the series of estimators (and the lower the estimation error), because of sensitivity to extreme observations.
- Although less precise, short windows adapt better and allow for innovations.
- A rolling estimation window can produce shocks when an extreme observation rolls out of the window. A gradual decline in weights (EWMA) is preferable.
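The roll-out shock and the EWMA smoothing can be demonstrated side by side. The sketch below (toy data, illustrative parameters; lambda = 0.94 is the common RiskMetrics choice) injects one extreme return and compares an equal-weight rolling-window estimator with the EWMA recursion.

```python
# Sketch: rolling-window vs EWMA volatility on a toy return series.
import random

random.seed(7)
returns = [random.gauss(0.0, 0.01) for _ in range(400)]
returns[200] = -0.08  # one extreme observation

def rolling_vol(rets, window=50):
    """Equal-weight sample volatility over the last `window` returns."""
    out = []
    for t in range(window, len(rets)):
        w = rets[t - window:t]
        m = sum(w) / window
        out.append((sum((x - m) ** 2 for x in w) / window) ** 0.5)
    return out

def ewma_vol(rets, lam=0.94):
    """RiskMetrics recursion: var_t = lam * var_{t-1} + (1 - lam) * r_t^2."""
    var = rets[0] ** 2
    out = []
    for r in rets:
        var = lam * var + (1 - lam) * r ** 2
        out.append(var ** 0.5)
    return out

rv, ev = rolling_vol(returns), ewma_vol(returns)
# The rolling estimate jumps when the shock enters the window and drops
# abruptly 50 days later when it rolls out; the EWMA estimate jumps once
# and then decays smoothly, since the weight (1 - lam) * lam^k shrinks
# gradually with the age k of the observation.
```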
• What about MDE, multivariate density estimation?
Estimate the joint probability density function of a set of variables; an intuitive alternative to standard volatility forecasts. The weights in MDE depend on how the current state of the world compares to past states of the world, measured through a kernel function. A serious problem is that the method is data-intensive. In the end, however, MDE puts high weight on relevant information regardless of how far in the past that information lies.
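The MDE weighting idea can be sketched with a single conditioning variable and a Gaussian kernel. Everything below (the interest-rate state variable, the bandwidth, the numbers) is an illustrative assumption, not the textbook's specification.

```python
# Sketch of MDE: weight past squared returns by how close the past state
# (here a single variable, e.g. the interest-rate level) is to today's
# state, via a Gaussian kernel. All inputs are illustrative.
from math import exp

def kernel_weights(past_states, current_state, bandwidth):
    """Normalized Gaussian kernel weight for each past observation."""
    raw = [exp(-0.5 * ((s - current_state) / bandwidth) ** 2)
           for s in past_states]
    total = sum(raw)
    return [w / total for w in raw]

def mde_variance(past_sq_returns, past_states, current_state, bandwidth=0.5):
    """Kernel-weighted average of past squared returns."""
    w = kernel_weights(past_states, current_state, bandwidth)
    return sum(wi * r2 for wi, r2 in zip(w, past_sq_returns))

# A distant day with a similar state gets far more weight than a day
# whose state was very different, regardless of recency:
states = [2.0, 5.0, 2.1]       # hypothetical interest-rate levels (%)
sq_rets = [1e-4, 9e-4, 1.2e-4]
print(mde_variance(sq_rets, states, current_state=2.0))
```

Note how the middle observation (state 5.0) receives almost no weight even though its squared return is largest: relevance is judged by state similarity, not by age.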
• What about a hybrid approach?
Combines the two simplest approaches, Historical Simulation and RiskMetrics, by estimating percentiles of the return distribution directly while using exponentially declining weights on past data. The advantages are that real observations are used and that the impact of an extreme observation gradually drops out of the sample (ever smaller weight). Linearly interpolated returns are taken in between observations!

Biggest advantage: if returns become more stable, the effect of past extreme observations smoothly declines!
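The hybrid estimator described above can be sketched directly: exponentially declining weights on past returns, then the VaR cutoff read off the reweighted empirical distribution with linear interpolation. The decay factor and data below are illustrative assumptions.

```python
# Sketch of the hybrid approach: exponential weights + empirical quantile
# with linear interpolation between adjacent sorted observations.
import random

def hybrid_var(returns, lam=0.98, level=0.05):
    """Return the `level` quantile of returns under exponential weights."""
    n = len(returns)
    raw = [lam ** (n - 1 - i) for i in range(n)]  # i = 0 is the oldest
    total = sum(raw)
    pairs = sorted(zip(returns, (w / total for w in raw)))
    # walk up the cumulative weight until `level` is crossed, then
    # interpolate linearly between the adjacent observations
    cum = 0.0
    prev_r, prev_cum = pairs[0][0], 0.0
    for r, w in pairs:
        cum += w
        if cum >= level:
            frac = (level - prev_cum) / (cum - prev_cum)
            return prev_r + frac * (r - prev_r)
        prev_r, prev_cum = r, cum
    return pairs[-1][0]

random.seed(1)
rets = [random.gauss(0.0, 0.01) for _ in range(250)]
print(round(hybrid_var(rets), 4))  # the 5% VaR cutoff return
```

With lam = 1.0 this collapses to plain Historical Simulation (equal weights); the smaller lam is, the faster old extremes fade out of the estimate.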
• What about long-horizon volatility and VaR?
Can the square-root rule be used? It relies on two assumptions:
- Non-predictability, i.e. zero autocovariance of returns (generally holds well). Equity returns are close to unpredictable, but there is some predictability in fixed-income securities. Note that mean reversion has an important effect on long-term volatility because it implies autocorrelation: reversion means negative autocovariance, so volatility is OVERSTATED when scaled by the square root of time.
- Volatility is the same in every period. In practice volatilities are stochastic but have a steady state (note the difference from the long-run mean!). Depending on today's volatility relative to the long-run level, square-root scaling either overstates (if today's is higher) or understates (if lower) long-horizon volatility.
Do similar issues arise when estimating correlations? Yes, and exponentially declining weights provide the same benefits. Two specific issues:
- Correlation breakdown: in times of turmoil, correlations increase.
- Non-synchronous data: markets do not close at the exact same time (e.g. Japan vs. the US), so closing prices capture news and events differently. A natural extension of the random-walk assumption treats consecutive hourly returns as independent; an alternative is to assume the intensity of the information flow is constant intraday.

Basically: the assumption of independence is extended to intraday independence.