Chapter 1. Improved measures of financial risk for hedge funds. During the current financial crisis, several US and foreign banks and investment firms have failed due to excessive losses on some of their investments. Many of these financial institutions relied on a widely used risk model, Value-at-Risk (VaR), to gauge the risks taken by their businesses. VaR does not properly account for the joint risks of investments based on more than one asset (i.e., the risk measure is not subadditive). Also, VaR computations are commonly based on the assumption that asset returns are normally (Gaussian) distributed, which understates the probability of large losses for some investments. To overcome the subadditivity flaw of VaR, researchers in financial economics have proposed an alternative measure, the Conditional Value-at-Risk (CVaR), defined as the expected loss given that the loss exceeds the VaR. The CVaR measure may more appropriately compute the potential losses associated with holding two or more assets.
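The relationship between the two measures can be sketched with a simple historical-simulation calculation. The sample below is hypothetical (simulated heavy-tailed losses, not the hedge fund data studied in the dissertation); VaR is taken as a quantile of the loss distribution and CVaR as the average of the losses beyond it:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily losses (positive = loss), drawn from a heavy-tailed t distribution
losses = rng.standard_t(df=3, size=10_000)

alpha = 0.95
var = np.quantile(losses, alpha)       # Value-at-Risk: the alpha-quantile of losses
cvar = losses[losses > var].mean()     # CVaR: expected loss, given the loss exceeds VaR

print(f"VaR(95%)  = {var:.3f}")
print(f"CVaR(95%) = {cvar:.3f}")
```

By construction CVaR is at least as large as VaR at the same confidence level, which is one reason it gives a more conservative picture of tail risk.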
The purpose of this paper is to evaluate two recent innovations in the financial economics literature that may help banks and investment firms to properly assess the risks they face. First, we employ Extreme Value Theory (EVT) to estimate non-normal models of the return distribution tails. In particular, we use the peaks-over-threshold (POT) method in which extremes are defined as excesses over a threshold, and we estimate the marginal (univariate) return distributions. The POT observations are used to estimate the Generalized Pareto (GP) model of the upper and lower tail areas of the return distributions. Second, we use the estimated GP models to compare the relative performance of the VaR and CVaR for assessing forward-looking risk in observed hedge fund returns. The main objective of this analysis is to evaluate competing claims from the financial economics literature about the relative importance of the VaR flaws (e.g., subadditivity) and probability model specification errors in risk measurement.
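A minimal sketch of the POT idea follows, using simulated heavy-tailed data rather than the returns analyzed in the dissertation, and a simple method-of-moments estimator of the Generalized Pareto parameters in place of the maximum-likelihood fitting a full study would use. The threshold choice (95th percentile) and target level q are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=20_000)   # hypothetical heavy-tailed loss sample

u = np.quantile(losses, 0.95)                # threshold: 95th percentile of losses
excess = losses[losses > u] - u              # peaks-over-threshold exceedances

# Method-of-moments estimates of the Generalized Pareto (GP) parameters
m, v = excess.mean(), excess.var()
xi = 0.5 * (1.0 - m**2 / v)                  # shape (tail index); xi > 0 => heavy tail
sigma = 0.5 * m * (1.0 + m**2 / v)           # scale

# EVT-based VaR at a level q beyond the threshold, via the GP tail approximation
n, n_u = losses.size, excess.size
q = 0.99
var_evt = u + (sigma / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)
print(f"xi = {xi:.3f}, sigma = {sigma:.3f}, EVT VaR(99%) = {var_evt:.3f}")
```

Because the GP model is fit only to the tail observations, the implied quantiles beyond the threshold reflect the actual tail heaviness rather than a normality assumption.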
Chapter 2. Optimal hedging under copula models with high frequency data. The current financial disturbance is partly due to the failure of risk management tools to warn of rapidly-evolving market events. One important lesson gained from this experience is that we must improve our ability to manage such financial risks by developing a better understanding of the microstructure of financial markets. Accordingly, we may be able to use models based on high frequency (e.g., intra-day) financial data to better assess the current risk of financial positions and to improve our predictions of future price movements. However, there are special problems associated with modeling high frequency data—for example, previous research shows that high frequency returns might be correlated in a nonlinear fashion. To handle these problems, we may use copula-based probability models, which separately represent the dependence structure and the univariate marginal properties of the risky asset returns. This approach is useful because multivariate non-normality is readily modeled and the dependence parameter can be updated to reflect time-varying structure. We estimate these models in order to determine optimal hedge ratios for currency futures positions used to manage price return risks in spot exchange rates. A dynamic hedging strategy with futures contracts allows the hedge position to be adjusted over time. Various GARCH models capture the volatility of short futures positions coupled with foreign exchange rate fluctuations, and copula-based GARCH models measure the conditional dependence between the two asset returns.
Chapter 3. Copula model selection based on non-nested testing. The copula approach adopted in Chapter 2 may be used to model any multivariate probability distribution by separately estimating the marginal distributions and the dependence structure. In practice, one must choose an appropriate copula model, from among many candidates, that provides the “best” fit to the observed data. We propose a non-nested test procedure for copula model selection based on the Cox test statistic, which is a centered version of the standard likelihood ratio (LR) statistic. The Cox test and related non-nested testing methods hold conceptual advantages over alternative model selection tools, but they are not widely used in practice due to computational difficulties. To resolve some of these practical challenges, we use Monte Carlo sampling methods to compute the Cox test statistic and evaluate its distributional properties. The objective of this research is to propose a model selection procedure that is computationally feasible and statistically reliable, in order to facilitate practical applications of these improved risk models. (Abstract shortened by UMI.)
Advisor: Miller, Douglas J.
School: University of Missouri - Columbia
School Location: United States -- Missouri
Source: DAI-A 73/10(E), Dissertation Abstracts International
Keywords: CVaR, Copula method, Financial risk, GARCH, Model selection, Risk measure, VaR
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved.