RiskNET eLibrary


Every entrepreneurial decision inevitably involves taking risks. These risks grow as business activity becomes more multifaceted and the product or service to be sold more complex. Small and medium-sized technology-oriented enterprises in particular, which face high innovation pressure due to ever-present resource scarcity and industry-specific conditions (such as high technological dynamics), often fail to recognise the significance of this fact and attempt, if at all, to introduce a risk management system whose sustainability is very much open to question.
weissi13 3405 Downloads 05.02.2010
To perform risk and portfolio management, we must represent the distribution of the risk factors that affect the market. The most flexible approach is in terms of scenarios and their probabilities, which includes historical scenarios, pure Monte Carlo and importance sampling (see Glasserman, 2004). Here, we present a simple method to generate scenarios from elliptical distributions with given sample means and covariances. This is very important in applications such as mean-variance portfolio optimisation, which are heavily affected by incorrect representations of the first two moments.
[Author: Attilio Meucci is head of research at Bloomberg ALPHA, Portfolio Analytics and Risk, in New York]
Meucci 2980 Downloads 20.01.2010
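The moment-matching the abstract describes can be sketched in a few lines (a generic illustration, not Meucci's published code; the Gaussian draws, dimensions and target moments are invented): draw raw elliptical scenarios, then apply an affine transform so that the sample mean and sample covariance hit the targets exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 10_000, 3  # number of scenarios, number of risk factors

# Target first two moments (invented for illustration)
mu = np.array([0.01, 0.02, 0.00])
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

# Raw elliptical (here: Gaussian) Monte Carlo scenarios
x = rng.multivariate_normal(np.zeros(k), sigma, size=n)

# Affine transform so the SAMPLE mean and covariance match the targets exactly
m_hat = x.mean(axis=0)
s_hat = np.cov(x, rowvar=False, bias=True)
l_hat = np.linalg.cholesky(s_hat)
l_tgt = np.linalg.cholesky(sigma)
y = (x - m_hat) @ np.linalg.inv(l_hat).T @ l_tgt.T + mu

assert np.allclose(y.mean(axis=0), mu)
assert np.allclose(np.cov(y, rowvar=False, bias=True), sigma)
```

Equally weighted scenarios transformed this way feed a mean-variance optimiser without the sampling noise in the first two moments that the abstract warns about.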
Enterprise risk management (ERM) has been the topic of increased media attention in recent years. Many organizations have implemented ERM programs, consulting firms have established specialized ERM units, and universities have developed ERM-related courses and research centers. Despite the heightened interest in ERM by academics and practitioners, there is an absence of empirical evidence regarding the impact of such programs on firm value. The objective of this study is to measure the extent to which specific firms have implemented ERM programs and, then, to assess the value implications of these programs. We focus our attention in this study on U.S. insurers in order to control for differences that might arise from regulatory and market differences across industries. We use a maximum-likelihood treatment effects framework to simultaneously model the determinants of ERM and the effect of ERM on firm value. In our ERM-choice equation we find ERM usage to be positively related to factors such as firm size and institutional ownership, and negatively related to reinsurance use, leverage, and asset opacity. By focusing on publicly-traded insurers we are able to estimate the effect of ERM on Tobin’s Q, a standard proxy for firm value. We find a positive relation between firm value and the use of ERM. The ERM premium of 16.5% is statistically and economically significant and is robust to a range of alternative specifications of both the ERM and value equations.
[Authors: Ron S. Kenett, KPA Ltd., University of Torino, Torino and Center for Risk Engineering, NYU Polytechnic Institute, New York (USA) and Charles S. Tapiero, Center for Risk Engineering, NYU Polytechnic Institute, New York (USA). To be presented at the 2009 Quality and Productivity Research Conference, IBM T. J. Watson Research Ctr., Yorktown Heights, NY, 3-5 June 2009.]
RSK 2575 Downloads 14.01.2010
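Tobin's Q, the firm-value proxy on which the study estimates the ERM premium, is a simple ratio; a common approximation divides the market value of equity plus the book value of liabilities by the book value of assets (the balance-sheet figures below are invented):

```python
# Tobin's Q proxy: (market value of equity + book liabilities) / book assets.
# All figures are hypothetical, in USD millions.
market_cap = 4_200.0        # market value of equity
book_liabilities = 6_300.0  # book value of liabilities
book_assets = 9_800.0       # book value of total assets

tobins_q = (market_cap + book_liabilities) / book_assets
# Q > 1: the market values the firm above the book value of its assets
assert tobins_q > 1
```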
Quantitative risk management relies on a constellation of tools that are used to analyze portfolio risk. We develop the standard toolkit, which includes betas, risk budgets and correlations, in a general, coherent, mnemonic framework centered around marginal risk contributions. We apply these tools to generate side-by-side analyses of volatility and expected shortfall, which measures the average portfolio loss in excess of value-at-risk. We focus on two examples whose importance is highlighted by the current economic crisis. By examining the downside protection provided by an out-of-the-money put option, we show that the diversification benefit of the option is greater for a risk measure that is more highly concentrated in the tail of the distribution. By comparing two-asset portfolios that are distinguished only by the likelihood of coincident extreme events, we show that expected shortfall measures market contagion in a way that volatility cannot.
[Authors: Lisa R. Goldberg, Michael Y. Hayes, Jose Menchero, Indrajit Mitra]
Goldberg 1467 Downloads 06.01.2010
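The marginal-contribution arithmetic can be illustrated with a small simulation (a generic Euler decomposition with invented numbers, not the authors' code): both volatility and expected shortfall decompose exactly into per-asset contributions that sum to the portfolio risk measure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
w = np.array([0.5, 0.5])                        # portfolio weights
cov = np.array([[0.04, 0.012], [0.012, 0.09]])  # hypothetical covariance
r = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # scenario returns
p = r @ w                                        # portfolio P&L scenarios

# Volatility and its Euler decomposition into marginal contributions
c_hat = np.cov(r, rowvar=False, bias=True)
vol = p.std()
mc_vol = w * (c_hat @ w) / vol                   # contributions sum to vol

# Expected shortfall at 95% and its per-asset decomposition
alpha = 0.95
var = -np.quantile(p, 1 - alpha)                 # value-at-risk
tail = p <= -var                                 # worst 5% of scenarios
es = -p[tail].mean()                             # expected shortfall
mc_es = np.array([-(w[i] * r[tail, i]).mean() for i in range(2)])

assert np.isclose(mc_vol.sum(), vol)
assert np.isclose(mc_es.sum(), es)
```

The side-by-side view the abstract mentions is exactly this: the same weights produce different contribution profiles under `mc_vol` and `mc_es` when extreme events coincide.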
How well does options pricing theory really work, and how dependent is it on the notion of dynamic replication? In this note we describe what many practitioners know from long and practical experience: (i) dynamic replication doesn’t work as well as students are taught to believe; (ii) most derivatives traders rely on it as little as possible; and (iii) there is a much simpler way to derive many option pricing formulas: many of the results of dynamic option replication can be obtained more simply, by regarding (as many practitioners do) an options valuation model as an interpolating formula for a hybrid security that correctly matches the boundary values of the ingredient securities that constitute the hybrid.
[Authors: Emanuel Derman; Nassim Nicholas Taleb / Source: Quantitative Finance, Vol. 5, No. 4, August 2005, 323–326]
TBS 1235 Downloads 05.01.2010
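The "interpolating formula" view can be made concrete with a standard textbook check (not code from the paper; parameters invented): the Black–Scholes call value matches the boundary values of its ingredient securities, going to zero deep out of the money and to the discounted forward deep in the money.

```python
import math

def bs_call(s, k, t, r, vol):
    """Black-Scholes call value, viewed here as an interpolating formula."""
    d1 = (math.log(s / k) + (r + 0.5 * vol**2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    n = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
    return s * n(d1) - k * math.exp(-r * t) * n(d2)

k, t, r, vol = 100.0, 1.0, 0.03, 0.2
# Deep out of the money: the option is worthless
assert bs_call(1e-6, k, t, r, vol) < 1e-6
# Deep in the money: the option is worth the discounted forward, S - K e^(-rT)
deep_itm = bs_call(1e4, k, t, r, vol)
assert abs(deep_itm - (1e4 - k * math.exp(-r * t))) < 1e-6
```

Between these boundaries the formula interpolates smoothly, which is the sense in which the authors say practitioners can use it without leaning on dynamic replication.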
The Black Swan: The Impact of the Highly Improbable (hence TBS) is only critical of statistics, statisticians, or users of statistics in a very narrow (but consequential) set of circumstances. It was written by a veteran practitioner of uncertainty whose profession (a mixture of quantitative research, derivatives pricing, and risk management) estimates and deals with exposures to higher order statistical properties. Derivatives depend on some nonlinear function of random variables (often square or cubes) and are therefore extremely sensitive to estimation errors of the higher moments of probability distributions. This is the closest to applied statistician one can possibly get. Furthermore, TBS notes the astonishing success of statistics as an engine of scientific knowledge in (1) some well-charted domains such as measurement errors, gambling theory, thermodynamics, and quantum mechanics (these fall under the designation of "mild randomness"), or (2) some applications in which our vulnerability to errors is small. Indeed, statistics has been very successful in "low moment" applications such as "significance testing" for problems based on probability, not expectation or higher moments. In psychological experiments, for instance, the outlier counts as a single observation, and does not cause a high impact beyond its frequency.
[Authors: Nassim Nicholas Taleb / Source: The American Statistician, August 2007, Vol. 61, No. 3]
TBS 1270 Downloads 05.01.2010
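The sensitivity to higher moments that the review emphasises is easy to demonstrate (a toy illustration with invented parameters, not from TBS): under a fat-tailed distribution, an estimate of a cubed exposure fluctuates far more across repeated samples than an estimate of the mean.

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n = 2_000, 1_000

# Fat-tailed returns: Student-t with 4 degrees of freedom
samples = rng.standard_t(4, size=(trials, n))

# Estimate a "low moment" (the mean) and a "higher-order" exposure (the
# mean cube, as for a derivative payoff nonlinear in the underlying)
mean_est = samples.mean(axis=1)
cube_est = (samples**3).mean(axis=1)

# The higher-moment estimate is far less stable across samples
assert cube_est.std() > 5 * mean_est.std()
```

This is the asymmetry the review draws: "significance testing" on means is robust in exactly the settings where cubed exposures are not.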
Several generalisations of the Black–Scholes (BS) model have been made in the literature to overcome its well-known empirical inadequacies. In this work I perform an empirical comparison of the stochastic volatility models with jumps in the volatility established by Duffie et al. (2000) and four special cases derived from them. In addition I include the model of Schoebel/Zhu (1999), with volatility driven by an Ornstein–Uhlenbeck process instead of a Cox–Ingersoll–Ross process. As Zhu (2000) suggested, this model can easily be combined with a jump component in the underlying. I examine the resulting model empirically and stress its good properties. The comparison embeds out-of-sample pricing performance as an important element in a model performance study based on model risk. The main result in terms of fit performance is that the most complex models are not always the best ones. It is important to quantify model risk, as in e.g. Cont (2004), and to examine the sensitivity of exotic options in terms of moneyness, maturity and market condition. For this comparison the model risk measure of Cont (2004) is extended and applied to various exotic options.
manuelaender 2720 Downloads 15.11.2009
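The idea behind Cont's (2004) measure is the spread of prices for the same exotic across a set of plausible models. A deliberately crude sketch (the contract terms are invented, and "model uncertainty" is reduced to three volatility inputs for one up-and-out call, far cruder than the calibrated model sets used in the thesis):

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def mc_up_out_call(vol, s0=100.0, k=100.0, barrier=130.0, r=0.02, t=1.0,
                   steps=252, paths=20_000):
    """Monte Carlo price of an up-and-out call under GBM with the given vol."""
    dt = t / steps
    z = rng.standard_normal((paths, steps))
    log_s = np.log(s0) + np.cumsum((r - 0.5 * vol**2) * dt
                                   + vol * math.sqrt(dt) * z, axis=1)
    s = np.exp(log_s)
    alive = s.max(axis=1) < barrier          # knocked out if barrier touched
    payoff = np.where(alive, np.maximum(s[:, -1] - k, 0.0), 0.0)
    return math.exp(-r * t) * payoff.mean()

# Price spread across candidate models = a crude model risk figure
prices = [mc_up_out_call(v) for v in (0.18, 0.20, 0.22)]
model_risk = max(prices) - min(prices)
```

Exotics near a barrier are exactly where this spread blows up relative to vanillas, which is why the thesis applies the measure to exotic options by moneyness, maturity and market condition.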
Developing a rating strategy to safeguard the company is methodologically closely tied to other challenges of corporate management, a connection that is often overlooked. This article explains the relationships between rating strategies and rating forecasts on the one hand, and dividend policy and the optimisation of risk transfer (especially insurance solutions) on the other. It shows that neither sound planning of a company's dividend policy nor the determination of adequate insurance cover (in particular of appropriate deductibles) is possible without considering the implications for the future rating. The article also highlights the importance of these instruments for corporate crisis prevention, e.g. by means of stress tests.
[Source: Ratingprognose, Solvenztest und Rating-Impact-Analyse: Neue Instrumente für Krisenprävention und Ratingstrategie, in: KRP Kredit & Rating Praxis, issue 03/2009, pp. 38-40]
Gleissner 1622 Downloads 22.10.2009
The current financial crisis shows that crisis diagnosis and preventive crisis management should, as a matter of principle, be taken into account in corporate management. This article describes how simulation methods can be used to produce rating forecasts and to determine a company's equity capital requirement for risk coverage, and thus for crisis prevention. It also outlines strategic measures and provides guidance on cost management and securing liquidity.
[Source: Werner Gleißner/Armin Schaller: Krisendiagnose und Krisenmanagement - Maßnahmenpalette: Von der Ratingprognose bis zur Liquiditätssicherung, in: KSI 04/2009, pp. 153-161]
Gleissner 2749 Downloads 22.10.2009
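The simulation-based determination of an equity capital requirement can be sketched as follows (a generic Monte Carlo illustration with invented risk factors and parameters, not the authors' method in detail): aggregate the risk effects on planned cash flow and read the capital need off the loss quantile at the chosen confidence level.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical planning model: cash flow = planned value + risk effects
plan_cf = 10.0                              # EUR m, planned result
demand_shock = rng.normal(0.0, 4.0, n)      # market/demand risk
cost_shock = 3.0 - rng.gamma(2.0, 1.5, n)   # skewed cost risk, mean approx. 0
cf = plan_cf + demand_shock + cost_shock

# Equity capital requirement: the loss not exceeded with 99% confidence,
# i.e. the capital cushion that keeps the insolvency probability below 1%
alpha = 0.99
capital_requirement = -np.quantile(cf, 1 - alpha)
```

The same simulated distribution also supports a rating forecast: the probability of breaching covenant or insolvency thresholds maps to a rating grade.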
Looking at the behaviour of commercial and investment banks in the US real-estate and securitisation bubble up to the summer of 2007, the decisive question is: why did the banks fall for the herd instinct? Individual managers behaved quite rationally, even though the collective behaviour was flawed, a contradiction that can be explained with the help of game theory. Markowitz's conventional portfolio theory is of no use here, because it has a serious design flaw: according to Markowitz, by definition, actions of an individual decision-maker are never followed by reactions of other market participants. Herd behaviour is simply defined away. Game theory, in contrast, is a method of analysis that systematically links the actions and reactions of many decision-makers.
[Source: Bieta, V./Milde, H.: Herdenverhalten – Konsequenzen für Märkte, Banken und Aufseher, in: SCHWEIZER BANK, November 2008, pp. 42-43]
Bieta 5332 Downloads 29.09.2009
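The contradiction the article describes, individually rational but collectively flawed, is the hallmark of a coordination game. A minimal sketch (payoff numbers invented, not from the article): when managers are judged on relative performance, a lone deviator is punished, so mutual herding is a Nash equilibrium even though both banks would be better off deviating together.

```python
import itertools

# payoffs[(a, b)] = (payoff of bank 1, payoff of bank 2); illustrative only
payoffs = {
    ("herd", "herd"): (-1, -1),       # shared losses, but no one underperforms peers
    ("herd", "deviate"): (0, -3),     # the lone deviator is punished
    ("deviate", "herd"): (-3, 0),
    ("deviate", "deviate"): (2, 2),   # collectively best outcome
}

def is_nash(a, b):
    """No bank can gain by unilaterally changing its strategy."""
    u1, u2 = payoffs[(a, b)]
    best1 = all(payoffs[(x, b)][0] <= u1 for x in ("herd", "deviate"))
    best2 = all(payoffs[(a, y)][1] <= u2 for y in ("herd", "deviate"))
    return best1 and best2

nash = [s for s in itertools.product(("herd", "deviate"), repeat=2)
        if is_nash(*s)]
# ("herd", "herd") survives as an equilibrium: no bank gains by deviating
# alone, even though ("deviate", "deviate") leaves both better off.
```

This action-reaction linkage is precisely what the Markowitz framework, with its passive market participants, cannot represent.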