Which of the following are valid approaches for extreme value analysis given a dataset:
1. The Block Maxima approach
2. Least squares approach
3. Maximum likelihood approach
4. Peak-over-thresholds approach
For EVT, we use the block maxima or the peaks-over-threshold method. These provide us with the extreme data points to be fitted: block maxima are fitted to a generalized extreme value (GEV) distribution, while threshold exceedances are fitted to a generalized Pareto distribution (GPD).
Least squares and maximum likelihood are curve-fitting methods used to estimate a distribution's parameters once the data points have been selected, and they have a variety of applications across risk management.
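The two data-selection approaches can be sketched in a few lines. The loss series, block size, and threshold below are illustrative assumptions, not values from the question:

```python
# Sketch: extracting extreme-value data points from a series of daily losses.
# The series, block size, and threshold are illustrative assumptions.

def block_maxima(losses, block_size):
    """Block maxima: split the series into consecutive blocks and keep each block's maximum."""
    return [max(losses[i:i + block_size])
            for i in range(0, len(losses) - block_size + 1, block_size)]

def peaks_over_threshold(losses, threshold):
    """Peaks over threshold: keep every observation exceeding the chosen threshold."""
    return [x for x in losses if x > threshold]

losses = [1.2, 0.4, 2.8, 0.9, 3.5, 0.2, 1.1, 4.0, 0.7, 2.2, 0.3, 1.8]

print(block_maxima(losses, 4))          # one maximum per 4-observation block
print(peaks_over_threshold(losses, 2.5))  # all losses above 2.5
```

The resulting block maxima would then be fitted to a GEV distribution (and the threshold exceedances to a GPD), typically by maximum likelihood.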
Financial institutions need to take volatility clustering into account:
1. To avoid taking on an undesirable level of risk
2. To know the right level of capital they need to hold
3. To meet regulatory requirements
4. To account for mean reversion in returns
Volatility clustering leads to levels of current volatility that can be significantly different from long-run averages. When volatility is running high, institutions need to shed risk, and when it is running low, they can afford to increase returns by taking on more risk for a given amount of capital. An institution's response to changes in volatility can be to adjust risk, or capital, or both. Accounting for volatility clustering therefore helps institutions manage their risk and capital, so statements 1 and 2 are correct.
Regulatory requirements do not require volatility clustering to be taken into account (at least not yet), so statement 3 is not correct; neither is statement 4, as mean reversion in returns is unrelated to volatility clustering.
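The idea that current volatility can drift well away from its long-run average is easy to see with an EWMA volatility estimate (the RiskMetrics-style recursion with decay factor 0.94). The return series and long-run variance below are illustrative assumptions:

```python
# Sketch: EWMA variance update, sigma2_t = L * sigma2_{t-1} + (1 - L) * r_{t-1}^2.
# LAMBDA = 0.94 is the classic RiskMetrics decay factor; the returns are made up.

LAMBDA = 0.94

def ewma_variance(returns, initial_var):
    """Run the EWMA recursion over a return series and return the final variance."""
    var = initial_var
    for r in returns:
        var = LAMBDA * var + (1 - LAMBDA) * r * r
    return var

calm   = [0.001] * 20   # quiet period: small daily returns
stress = [0.03] * 20    # turbulent period: large returns clustering together

long_run_var = 0.0001   # assumed long-run daily variance (1% volatility)

print(ewma_variance(calm, long_run_var))    # drifts below the long-run level
print(ewma_variance(stress, long_run_var))  # rises well above it
```

In the stressed scenario the estimated variance climbs far above the long-run level, which is exactly the situation where an institution would need to shed risk or hold more capital.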
Which of the following belong to the family of generalized extreme value distributions:
1. Frechet
2. Gumbel
3. Weibull
4. Exponential
Extreme value theory focuses on extreme and rare events; in the case of VaR calculations, the focus is on the right tail of the loss distribution. In very simple and non-technical terms, EVT says the following:
1. Pull a number of large iid random samples from the population,
2. For each sample, find the maximum,
3. Then the distribution of these maximum values will follow a Generalized Extreme Value distribution.
(In some ways, it is parallel to the central limit theorem, which says that the mean of a large number of random samples pulled from any population follows a normal distribution, regardless of the distribution of the underlying population.)
Generalized Extreme Value (GEV) distributions have three parameters: ξ (shape parameter), μ (location parameter) and σ (scale parameter). Based upon the value of ξ, a GEV distribution may be a Frechet (ξ > 0), a Weibull (ξ < 0) or a Gumbel (ξ = 0). These are the only three types of extreme value distributions, so the exponential distribution does not belong to the family.
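The three steps above can be sketched with a quick simulation. Drawing iid samples from an exponential distribution and keeping each sample's maximum produces values that follow the Gumbel member of the GEV family (ξ = 0); for Exp(1) data the Gumbel location parameter sits near log(sample_size). The sample sizes and seed below are illustrative choices:

```python
# Sketch of the EVT steps: (1) pull iid random samples, (2) take each sample's
# maximum, (3) the collected maxima follow a GEV distribution.
# For Exp(1) data the limiting member is the Gumbel (shape parameter xi = 0).
import math
import random

random.seed(42)

def sample_maxima(n_samples, sample_size):
    """Steps 1-3: pull iid Exp(1) samples and record each sample's maximum."""
    return [max(random.expovariate(1.0) for _ in range(sample_size))
            for _ in range(n_samples)]

maxima = sample_maxima(n_samples=2000, sample_size=500)

# The Gumbel location for Exp(1) maxima is about log(sample_size);
# the mean of the maxima sits slightly above it (by the Euler-Mascheroni constant).
mean_max = sum(maxima) / len(maxima)
print(round(mean_max, 2), "vs log(500) =", round(math.log(500), 2))
```

In practice one would then fit a GEV distribution to these maxima (e.g. by maximum likelihood) to estimate ξ, μ and σ.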