Abstracts
Beatrice Acciaio - "Optimal Risk Allocation when Agents have Different Reference Probability Measures":
We study the problem of optimal sharing of an aggregate risk between agents whose preferences are represented by cash-invariant convex functionals. Under the hypothesis of law-invariance with respect to a given probability measure, the existence of optimal solutions is proved by Jouini, Schachermayer and Touzi (2005) and Acciaio (2006) (on the space of essentially bounded financial positions) and by Filipovic and Svindland (2007) (on Lp, for any p in [1,infinity]). Here we consider choice functionals which are law-invariant with respect to different probability measures. In this context we show the exactness of the optimal allocation problem on Lp, for any p in (1,infinity). Moreover, we provide a representation result for the convolution of such functionals on the space of essentially bounded financial positions.
Takuji Arai - "Optimal Hedging Strategies on Asymmetric Functions":
We treat optimal hedging problems for contingent claims in an incomplete financial market, where the hedging criteria are based on asymmetric functions. More precisely, the optimal hedging strategy which we consider in this talk minimizes the expectation of an asymmetric function of the difference between the underlying contingent claim and the value of the portfolio at maturity. In particular, under some assumptions, we prove the existence of a unique solution and discuss its mathematical properties.
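As an illustration of the kind of criterion described above, here is a minimal sketch with a hypothetical piecewise-linear loss function that penalizes shortfalls (claim exceeding portfolio value) more heavily than surpluses; the specific asymmetric function used in the talk is not given in the abstract:

\[
\min_{c,\,\vartheta}\; \mathbb{E}\big[\,\ell\big(H - V_T(c,\vartheta)\big)\,\big],
\qquad
\ell(x) = a\,x^{+} + b\,x^{-}, \quad a > b > 0,
\]

where $H$ is the contingent claim, $V_T(c,\vartheta)$ the terminal portfolio value for initial capital $c$ and strategy $\vartheta$, and $x^{+} = \max(x,0)$, $x^{-} = \max(-x,0)$.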
Teitur Arnarson - "Early Exercise Boundary Regularity Close to Expiry in Indifference Setting":
We investigate the early exercise boundary near expiry for the indifference prices of the American call and put options. We show that, in this setting, the behavior of this boundary depends primarily on the regularity of the option payoff function at the point b_0, where b_0 is the limit point of the early exercise boundary as we approach the expiration time. The analysis is based on scaling and the so-called blow-up technique rather than series expansion, which is the common approach to this kind of problem.
Ole E. Barndorff-Nielsen - "Matrix Subordinators and Multivariate OU-based Volatility Models":
The concept of matrix subordinators is introduced and exemplified. A brief account of the relation to Upsilon transformations will be given. The second part of the talk discusses the use of matrix subordinators in multivariate generalisation of the one-dimensional OU-based stochastic volatility models.
Giovanni Barone-Adesi - "Barrier Option Pricing Using Adjusted Transition Probabilities":
In the existing literature on barrier options, much effort has been exerted to ensure convergence through placing the barrier in close proximity to, or directly onto, the nodes of the tree lattice. In this paper we show that this may not be necessary to achieve accurate option price approximations. Using the Cox/Ross/Rubinstein binomial tree model and a suitable transition probability adjustment we demonstrate that our "probability-adjusted" model exhibits increased convergence to the analytical option price. We study the convergence properties of various types of options including (but not limited to) double knock-out, exponential barrier, double (constant) linear barriers and linear time-varying barriers. For options whose strike price is close to the barrier we are able to obtain numerical results where other models fail and, although convergence tends to be slow, we are able to calculate reasonable approximations to the analytical option price without having to reposition the lattice nodes.
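As a point of reference for the lattice method discussed above, here is a minimal sketch of a plain (unadjusted) Cox-Ross-Rubinstein tree for a down-and-out call; the transition probability adjustment that accelerates convergence is the paper's contribution and is not reproduced, and all parameter values are illustrative.

```python
# Plain Cox-Ross-Rubinstein lattice for a down-and-out call, shown only as the
# unadjusted baseline that the "probability-adjusted" model improves upon.
import math

def crr_down_and_out_call(S0, K, B, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1.0 / u                               # down factor
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)

    # terminal payoffs, knocked out if the terminal node lies at or below the barrier
    values = []
    for j in range(n + 1):
        S = S0 * (u ** j) * (d ** (n - j))
        values.append(max(S - K, 0.0) if S > B else 0.0)

    # backward induction, knocking out any node at or below the barrier
    for i in range(n - 1, -1, -1):
        new_values = []
        for j in range(i + 1):
            S = S0 * (u ** j) * (d ** (i - j))
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            new_values.append(cont if S > B else 0.0)
        values = new_values
    return values[0]

print(crr_down_and_out_call(S0=100, K=100, B=90, r=0.05, sigma=0.2, T=1.0, n=500))
```

The slow, oscillating convergence of this plain lattice as n grows and as the barrier moves between layers of nodes is precisely the behaviour the probability-adjusted model is designed to improve.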
Nicole Bäuerle - "Dependence Properties and Comparison Results for Lévy Processes with Applications to Option Prices and Credit Risk":
We investigate dependence properties and comparison results for multidimensional Lévy processes. Association, positive orthant dependence and positive supermodular dependence of Lévy processes are characterized in terms of the Lévy measure as well as in terms of the Lévy copula, a concept which has been introduced recently. As far as comparisons of Lévy processes are concerned we consider the supermodular and the concordance order and characterize them by orders of the Lévy measures and by orders of the Lévy copulas respectively. These results can be applied to compare option prices or default time points of credit risks and credit swap rates. (The talk is based on joint work with A. Blatter, A. Müller and U. Schmock.)
Evren Baydar - POSTER "Optimal Portfolios of Options with Credit Risk":
In this paper, we study optimization problems of portfolios consisting of risky options. The framework of Korn and Trautmann (1999) is applied for the optimization problem, where we model the credit risk with a firm-value based approach. Since the underlying in the portfolio is a European type option on the risky bond written by the firm, the compound option formula of Geske (1979) is adapted for pricing reasons. (Joint work with Ralf Korn)
Christian Bayer - POSTER "Cubature on Wiener Space for Infinite Dimensional Problems":
We generalize the method of Cubature on Wiener space, introduced by T. Lyons and N. Victoir, to stochastic partial differential equations understood as stochastic differential equations on an infinite dimensional Hilbert space. Cubature on Wiener space seems to be well-suited for SPDEs because it fits well with the concept of mild solutions. We present a few promising theoretical results and numerical examples.
Jörg Behrens - "Strategic Risk Management: Ideas and Questions":
While risk management is highly sophisticated and forms an integral part of the financial services industry, most firms still struggle to benefit from their analytical know-how when it comes to strategic planning. We discuss problems and ideas to bridge the gap between the two worlds.
Tomas Björk - "Optimal Investments Under Partial Information":
In this paper we consider optimal investment problems where the local rates of return of the assets under consideration cannot be observed directly. The problem is that of finding the optimal portfolio strategy which maximizes the utility of terminal wealth. The constraint is that the portfolio has to be adapted to the filtration generated by observations of the asset price processes. This leads to an optimal control problem under partial information. For various special cases, problems of this kind have earlier been successfully studied by Brennan, Baeuerle, Rieder, Brendle, Nagai, Runggaldier and others. The contribution of the present paper is that we study a fairly general problem without any assumptions of a Markovian structure. Within this framework we obtain surprisingly explicit formulas for optimal wealth and optimal portfolio strategies. The existing results in the literature then come out as special cases of the general theory. (Joint work with Mark Davis and Camilla Landen)
Dorje C. Brody - "Information-Based Asset Pricing":
A new framework for asset price dynamics (due to Brody, Hughston, and Macrina) is introduced in which the concept of noisy information about future cash flows is used to derive the corresponding price processes. In this framework an asset is defined by its cash-flow structure. Each cash flow is modelled by a random variable that can be expressed as a function of a collection of independent random variables called market factors. With each such "X-factor" we associate a market information process, the values of which are accessible to market participants. Each information process consists of a sum of two terms; one contains true information about the value of the associated market factor, and the other represents "noise". The noise term is modelled by an independent Brownian bridge that spans the interval from the present to the time at which the value of the factor is revealed. The market filtration is generated by the aggregate of the independent information processes. The price of an asset is given by the expectation of the discounted cash flows in the risk-neutral measure, conditional on the information provided by the market filtration. In the case where the cash flow is the payment of a defaultable bond, an explicit model is obtained for the price process of a credit-risky bond, for which the associated derivative prices and hedging strategies are also obtained. In the case where the cash flows are the dividend payments associated with equities, an explicit model is obtained for the share-price process. The prices of options on dividend-paying assets are derived. Remarkably, the resulting formula for the price of a European-style call option is of the Black-Scholes type. The information-based framework allows for a reasonable way to accommodate dependencies across different assets, and also generates a natural explanation for the origin of stochastic volatility in financial markets, without the need for specifying on an ad hoc basis the dynamics of the volatility. (Lecture Notes will be made available before the workshop.) (Download Slides (PDF))
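In symbols, a minimal single-factor version of the framework described above (the notation $\sigma$, $\beta_{tT}$, $P_{tT}$ is introduced here for illustration and follows the usual conventions of this literature): the information process carrying news about a cash flow $X$ paid at time $T$ is

\[
\xi_t \;=\; \sigma\, t\, X \;+\; \beta_{tT}, \qquad 0 \le t \le T,
\]

where $\beta_{tT}$ is a Brownian bridge over $[0,T]$ independent of $X$ and $\sigma$ governs the rate at which true information is revealed; the asset price is then the discounted conditional expectation

\[
S_t \;=\; P_{tT}\, \mathbb{E}^{\mathbb{Q}}\big[\, X \mid \sigma(\xi_s,\, s\le t) \,\big],
\]

with $P_{tT}$ the discount factor and $\mathbb{Q}$ the risk-neutral measure.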
Dorje C. Brody - "Dam Rain and Cumulative Gain":
Consider a financial contract that delivers a single cash flow given by the terminal value of a cumulative gains process. How can one model the dynamics of the price of such an asset, and determine the price processes of associated options and derivatives? This problem is important, for example, in the determination of optimal insurance claims reserve policies, and in the pricing of reinsurance contracts. In the insurance setting, the aggregate claims play the role of the cumulative gains, and the terminal cash flow represents the totality of the claims payable for the given accounting period. A similar example arises when we consider the accumulation of losses in a credit portfolio, and try to value a contract that pays an amount equal to the totality of the losses over a given time interval. The cumulative gains process is modelled by the product of the terminal cash flow and an independent gamma bridge process, and the market filtration is generated by this process. An explicit expression for the value process is obtained. The price of an elementary Arrow-Debreu security on the value of the cumulative gains process is determined, and is used to obtain closed-form expressions for the price of a European-style option on the value of the asset. The results obtained make use of various remarkable mathematical properties of the gamma bridge process, and are applicable to a wide variety of financial products based on cumulative gains processes such as aggregate claims, credit portfolio losses, gross domestic product, emissions, rainfall, and defined benefit pension funds. (Work carried out in collaboration with L.P. Hughston and A. Macrina.)
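In symbols, a minimal version of the construction described above (notation introduced here for illustration): with $X_T$ the terminal cash flow and $(\gamma_{tT})_{0\le t\le T}$ an independent gamma bridge satisfying $\gamma_{0T}=0$ and $\gamma_{TT}=1$, the cumulative gains process and the value process are

\[
\xi_t \;=\; X_T\, \gamma_{tT},
\qquad
S_t \;=\; P_{tT}\, \mathbb{E}\big[\, X_T \mid \sigma(\xi_s,\, s \le t) \,\big],
\]

where $P_{tT}$ is the discount factor; the remarkable properties of the gamma bridge are what make this conditional expectation explicitly computable.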
Luciano Campi - "Hedging with European or American Vanilla Options":
We consider a general incomplete financial market where the agents are allowed to trade not only in the underlying but also in European or American vanilla options so enlarging the set of contingent claims that can be hedged. We study the spanning properties of such a larger set of hedging opportunities.
Umut Cetin - "Joint Conditional Density of a Markov Process and Its Local Time with Applications to Default Risk Modelling":
The stochastic partial differential equation satisfied by the conditional joint density of a Markov process and its local time given a Markovian observation process is found. The results are applied to a credit risk model in order to price a defaultable bond. In particular it is shown that the H-Hypothesis assumed in some credit-risk literature does not hold in general.
Jean-François Chassagneux - "Discrete-Time Approximation of American Option and Game Option Price":
American Option prices and Game Option prices can be represented respectively by simply reflected BSDEs (El Karoui et al. 1997) and doubly reflected BSDEs (Cvitanic and Karatzas, 1996). We then study the discrete time approximation of the solution (Y,Z,K) of such equations. As in Ma and Zhang (2005), we consider a Markovian setting with reflecting barriers of the form h(X) where X solves a forward SDE. We first focus on the discretely reflected case. Based on a new representation for the Z component, which is directly linked to the Delta of the options, we retrieve the convergence result of Ma and Zhang (2005) without their uniform ellipticity condition on X. These results are then extended to the case where the reflection operates continuously both for American Options and Game Options. We also improve the bound on the convergence rate when h \in C^2_b with Lipschitz second derivative.
Delia Coculescu - "Valuation of Default-Sensitive Claims Under Imperfect Information":
We propose an evaluation method for financial assets subject to default risk, where investors cannot observe the state variable triggering the default, but do observe a correlated price process. The model is sufficiently general to encompass a large class of structural models and can be seen as a generalization of the model of Duffie and Lando (2001). In this setting we prove that the default time is totally inaccessible in the market's filtration and derive the conditional default probabilities and the intensity process. Finally, we provide pricing formulas for default-sensitive claims and illustrate on particular examples the shapes of the credit spreads.
José Manuel Corcuera - "Hedging and Optimization in a Geometric Additive Model":
In our market model the stock price process S = {St, t >= 0} is a geometric additive process. Except for simple cases such models are incomplete; we complete the market by a series of assets related to the power-jump processes of the underlying additive process. We also show how these artificial assets can be related to call options with different strikes, so that the market can be completed by using complex portfolios that include call options. Finally, exploiting the completeness of the enlarged market, we obtain the optimal portfolio by the martingale method.
Stéphane Crépey - "About the Pricing Equation in Finance":
We establish the well-posedness of reflected BSDE problems with jumps arising from pricing problems in finance, and we derive the related variational inequality approach. We first construct a rather generic Markovian model made of a jump diffusion X interacting with a pure jump process N (which in the simplest case reduces to a continuous-time Markov chain). The jump process N defines the so-called regime of the coefficients of X, whence the name Jump-Diffusion Setting with Regimes for this model. Motivated by optimal stopping and optimal stopping game problems (pricing equations of American or Game Contingent Claims, in terms of financial applications), we introduce the related reflected and doubly reflected Markovian BSDEs, showing that they are well-posed in the sense that they have unique solutions, which depend continuously on their input data. We then introduce the system of partial integro-differential obstacle problems formally associated to our reflected BSDE problems. We show that the state processes (first components Y) of the solutions to our reflected BSDEs can be characterized in terms of the value functions of the related optimal stopping or game problems, given as viscosity solutions with polynomial growth to the related obstacle problems. We further establish a comparison principle (or maximum principle) for discontinuous viscosity semisolutions of our problems, which implies in particular uniqueness of viscosity solutions for these problems. This maximum principle is subsequently used for proving the convergence of stable, monotone and consistent approximation schemes to our value functions.
Delphine David - "On the Optimal Control of Stochastic Delayed Systems with Jumps":
We consider the optimal control of stochastic delayed systems with jumps, in which both the state and controls can depend on the past history of the system. In particular we derive necessary and sufficient maximum principles for such problems, and give some applications to financial economics.
Griselda Deelstra - "Bounds for Asian Basket Options":
In this paper we propose some pricing methods for European-style discrete arithmetic Asian basket options in a Black-Scholes framework. An Asian basket option is an option whose payoff depends on the average value of the prices of a portfolio (or basket) of assets (stocks) at different dates. Determining the price of the Asian basket option is not a trivial task, because we do not have an explicit analytical expression for the distribution of the weighted sum of the assets. By assuming that the assets follow correlated geometric Brownian motion processes, one can use Monte-Carlo simulation techniques to obtain a numerical estimate of the price. In the literature, other techniques have been suggested whose main goal is to approximate the real distribution of the payoff by another one which is easier to treat mathematically (see e.g. Beisser (Ph.D. Thesis, Johannes Gutenberg University Mainz, 2001) and references therein). In this paper, we start from methods used for basket options and Asian options. First we use the general approach for deriving upper and lower bounds for stop-loss premiums of sums of dependent random variables as in Kaas et al. (IME, 2000) or Dhaene et al. (IME, 2002). We generalize the methods in Deelstra et al. (IME, 2004) and Vanmaele et al. (JCAM, 2006). Afterwards we show how to derive an analytical closed-form expression for a lower bound in the non-comonotonic case. Finally, we derive upper bounds for Asian basket options by generalizing techniques as in Thompson (Working paper, University of Cambridge, 1999) and Lord (JCF, 2007). Numerical results are included and, on the basis of our numerical tests, we explain which method we recommend depending on moneyness and time-to-maturity. (Joint work with Ibrahima Diallo and Michèle Vanmaele)
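A sketch of the type of lower bound referred to above, in the style of Kaas et al. (2000) and Dhaene et al. (2002): for the weighted sum $\mathbb{S}$ of asset prices entering the Asian basket payoff, strike $K$, and any conditioning random variable $\Lambda$,

\[
\mathbb{E}\big[(\mathbb{S}-K)^{+}\big] \;\ge\; \mathbb{E}\big[(\mathbb{E}[\mathbb{S}\mid\Lambda]-K)^{+}\big],
\]

which follows from Jensen's inequality applied conditionally on $\Lambda$. In the lognormal setting a convenient choice of $\Lambda$ (for instance a linear combination of the driving Brownian motions, an assumption made here purely for illustration) makes the right-hand side explicit, which is the kind of closed-form lower bound the paper generalizes.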
Lucia Del Chicca - POSTER "On the Individual Expectations of Non-Average Investors":
In his article “Options and Expectations” Leland motivates the investigation of the individual expectations of investors who have an average risk aversion but whose investment strategies differ from the average. We give explicit formulas for these individual expectations in a binomial model and we discuss the implications for several cases and examples, in both the path-dependent and the path-independent case. For example, we completely determine the trading strategies suitable for strictly mean-averting investors in the path-independent case.
Freddy Delbaen - "Monetary Time Consistent Utility Functions and the Viscous Hamilton-Jacobi Quasi-Linear PDE":
Time consistent monetary utility functions (based on Brownian filtrations) can be characterised by their duality properties. Using the theory of BSDEs, the duality theory allows one to characterise the cases where the related BSDEs have or do not have solutions. In the Markovian case the theory of viscous Hamilton-Jacobi quasi-linear PDEs allows one to prove that the BSDE has nice solutions for all terminal random variables that only depend on the final value of the Brownian motion. I will try to explain why the answer is different in the general case and in the Markovian case. This is joint work with Shige Peng, Rosazza-Gianin, Ying Hu and Bao.
Giuseppe Di Graziano - "A Dynamic Approach to the Modelling of Correlation Credit Derivatives Using Markov Chains":
The modelling of credit events is in effect the modelling of the times to default of various names. The distribution of individual times to default can be calibrated from CDS quotes, but for more complicated instruments, such as CDOs, the joint law is needed. Industry practice is to model this correlation using a copula or base correlation approach, both of which suffer significant deficiencies. We present a new approach to default correlation modelling, where defaults of different names are driven by a common continuous-time Markov process. Individual default probabilities and default correlations can be calculated in closed form. As illustrations, CDO tranches with name-dependent random losses are computed using Laplace transform techniques. The model is calibrated to standard tranche spreads with encouraging results.
Alexandra Dias - "Semi-parametric Estimation of Portfolio Tail Probabilities":
In this paper we estimate the probability of occurrence of a large portfolio loss. This amounts to estimating the probability of an event in the far joint tail of the portfolio loss distribution. These are rare extreme events and we use a semi-parametric procedure from extreme value theory to estimate their probability. We find that the univariate loss distribution is heavy tailed for three market indexes and that there is dependence between large losses in the different indexes. We estimate the probability of having a large loss in a portfolio composed of these indexes and compute the portfolio component weights which minimize the probability of a large portfolio loss. With this procedure we are able to estimate the probability of portfolio losses never incurred before, without the need of specifying a parametric dependence model and where increasing the number of portfolio components does not bring complications to the estimation.
Giulia Di Nunno - "Events of Small But Positive Probability and a Version of the Fundamental Theorem of Asset Pricing":
The market modelling is based on a probability space with a probability measure P which is determined in relation to statistical data. On the other hand, in mathematical finance the idealization of a "fair" market is based on the belief that the market's stochastic behaviour is described by a martingale measure Q. Thus we would like the original measure P to be equivalent to some martingale measure Q, i.e. the two measures should give null weight to the same events. This topic has been investigated for quite a long time, yielding the fundamental theorem of asset pricing.
Actually, in many applications, mostly related to the securitization of insurance products, extreme events of very small probability play a crucial role, e.g. bankruptcies, earthquakes, floods. Thus we are not only interested in a market model where the measure P admits an equivalent martingale measure, but we would also like the two measures to give comparable weight to these extreme events of small probability.
In the framework of a continuous time (incomplete) market model, we present a version of the fundamental theorem of asset pricing in which we consider a necessary and sufficient condition for the existence of an equivalent martingale measure with a density lying within pre-considered bounds.
Ernst Eberlein - "Lévy Driven Equity, FX- and Interest Rate Models":
Empirical analysis of data from the financial markets reveals that standard diffusion models do not generate return distributions with a sufficient degree of accuracy. To reduce model risk we study models which are driven by Lévy processes or, more generally, by semimartingales. Analytical properties of this model class are investigated. For implementation, in particular the class of generalized hyperbolic Lévy processes is considered. Plain vanilla as well as exotic options are priced on the basis of these models. As a further application in risk management we show that estimates of the value at risk of a portfolio of securities are improved.
In the second part we discuss Lévy term structure models. Three basic approaches to model interest rates are introduced: the forward rate model, the forward process model, and the LIBOR or market model. As an application pricing formulae for caps and floors are derived. Efficient algorithms to evaluate these formulae numerically are given. The LIBOR model can be extended to a multi-currency setting. Closed form pricing formulae for cross-currency derivatives such as foreign caps and floors and cross-currency swaps are studied in detail. The LIBOR model can also be extended to include defaultable instruments. (long version)
Irmingard Eder - POSTER "The Quintuple Law for Sums of Dependent Lévy Processes":
We prove the quintuple law for a general Lévy process $X = X_1 + X_2$ for possibly dependent processes $X_1, X_2$. The dependence between $X_1$ and $X_2$ is modeled by a Lévy copula. The quintuple law describes the ruin event of a Lévy process by five quantities: the time of first passage relative to the time of the last maximum at first passage, the time of the last maximum at first passage, the overshoot at first passage, the undershoot at first passage and the undershoot of the last maximum at first passage. We calculate these quantities for some examples and present an application in insurance risk theory. This is joint work with Claudia Klüppelberg.
Romuald Elie - "Optimal Consumption Investment Strategy Under Drawdown Constraint":
We consider the optimal consumption-investment problem under the drawdown constraint, i.e. the wealth process never falls below a fixed fraction of its running maximum. We assume that the risky asset is driven by the constant-coefficients Black and Scholes model and we consider a general class of utility functions. On an infinite time horizon, we provide the value function in explicit form, and we derive closed-form expressions for the optimal consumption and investment strategy. The solution is obtained by a duality argument. On a finite time horizon, we interpret the value function as the unique viscosity solution of its corresponding Hamilton-Jacobi-Bellman equation. This leads to a numerical approximation scheme and allows for a comparison with the explicit solution in infinite horizon.
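In symbols, the drawdown constraint described above requires the wealth process $X$ to satisfy, for a fixed fraction $\alpha \in (0,1)$,

\[
X_t \;\ge\; \alpha \max_{0 \le s \le t} X_s \qquad \text{for all } t \ge 0 .
\]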
Christina Erlwein - "Filtering and Optimal Parameter Estimation of a Hidden Markov Model for Electricity Spot Prices":
We develop a hidden Markov model for electricity spot price dynamics, where the spot price follows an exponential Ornstein-Uhlenbeck process with an added compound Poisson process. This way, the model allows for mean-reversion and possible jumps. All parameters are modulated by a hidden Markov chain in discrete time. They are therefore able to switch between different economic regimes representing the interaction of various hidden factors. Through the application of the reference probability technique, adaptive filters are derived, which, in turn, provide optimal estimates for the state of the Markov chain and related quantities of the observation process. The EM algorithm is applied to find optimal estimates of the model parameters in terms of the recursive filters. We implement this self-calibrating model on a deseasonalized series of daily spot electricity prices from the Nordic exchange Nord Pool. On the basis of one-step-ahead forecasts, we find that the model is able to capture the stylised features of Nord Pool spot prices. This is joint work with Fred Espen Benth (CMA, University of Oslo, Norway) and Rogemar Mamon (University of Western Ontario, London, Canada).
Damir Filipovic - "Non-Monotone Risk Measures and Monotone Hulls":
This paper provides some useful results for convex risk measures. In fact, we consider convex functions on a locally convex vector space E which are monotone with respect to the preference relation implied by some convex cone and invariant with respect to some numeraire ("cash"). As a main result, for any function f, we find the greatest closed convex monotone and cash-invariant function majorized by f. We then apply our results to some well-known risk measures and problems arising in connection with insurance regulation.
Markus Fischer - "Discretisation of Continuous-Time Stochastic Optimal Control Problems with Delay":
A semi-discretisation scheme for a class of infinite-dimensional stochastic optimal control problems is introduced. The system dynamics of the control problems are described by stochastic differential equations with delay or "memory" (SDDEs / SFDEs). Performance is measured in terms of an "evolutional" cost functional over a finite time horizon. The coefficients of an SDDE/SFDE depend not only on the current state, but also on past values or entire segments of the solution trajectory. Control problems of this kind arise, for example, as growth models in economics or biology, or in finance when pricing and hedging weather derivatives. The approximation scheme consists of the construction of a sequence of finite-dimensional discrete-time optimal control problems in two steps. Under quite natural assumptions, we obtain convergence of the scheme as well as a priori bounds on the discretisation error. Important ingredients in the construction are a version of the Principle of Dynamic Programming and a bound on the second moment of the modulus of continuity of Brownian motion. The question of how to numerically solve the resulting finite-dimensional problems and, in particular, how to deal with the "curse of dimensionality" is also addressed. The results to be presented are based on joint work with Giovanna Nappo (University of Rome "La Sapienza").
Masaaki Fukasawa - "Central Limit Theorem for the Realized Volatility Based on a Tick Time Sampling":
A central limit theorem for the realized volatility estimator of the integrated volatility based on a specific random sampling scheme is proved. The estimator is also shown to be robust to market microstructure noise induced by price discreteness and bid-ask spreads.
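For reference, the realized volatility estimator in question is the sum of squared log-price increments over the sampling times $\tau_0 < \tau_1 < \dots$ (here tick times rather than calendar times), which estimates the integrated volatility:

\[
\widehat{RV} \;=\; \sum_{i \ge 1} \big(X_{\tau_i} - X_{\tau_{i-1}}\big)^2 \;\approx\; \int_0^T \sigma_t^2 \, dt ,
\]

and the central limit theorem describes the fluctuations of $\widehat{RV}$ around the integrated volatility as the sampling becomes finer.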
Pavel Grigoriev - "Kusuoka's Formula for Dynamic Risk Measures":
In the talk the properties relevant to law-invariance for time consistent dynamic convex risk measures will be discussed. An extension of Kusuoka's representation theorem for static law-invariant risk measures to the dynamic case will be proposed.
Laszlo Gyorfi - "Growth Optimal Portfolio Selection Strategies with Transaction Costs":
Discrete time infinite horizon growth optimal investment in stock markets with transaction costs is considered. The stock processes are modelled by homogeneous Markov processes. If the distribution of the market process is known, then we exhibit two recursive investment strategies such that, in the long run, the growth rate on trajectories (in the "liminf" sense) is greater than or equal to the growth rate of any other investment strategy with probability 1.
Andreas Hamel - "Set-Valued Risk Measures":
Jouini et al. (Finance & Stochastics 8, 2004) proposed the concept of set-valued coherent risk measures in order to incorporate market frictions like transaction costs, liquidity bounds etc. in the evaluation of the risk of a portfolio consisting of several assets. We extend this concept to general risk measures with values in the set of all subsets of a finite dimensional space, namely set-valued translative and monotone functions. For the general case, we establish primal representation results in terms of acceptance sets. For the convex case, we give a complete duality theory parallel to the scalar case, including a penalty function representation. This theory is based on extensions of Convex Analysis to set-valued functions which are also new, including definitions of Fenchel conjugates for set-valued functions and a corresponding biconjugation theorem. Another natural question is: how shall we select a single risk-canceling element out of a specific value (a set!) of a risk measure in order to cancel the risk of the portfolio under consideration? The answer will be found via a scalarization concept that has an interpretation in terms of prices in one of several possible currencies. A list of examples is given along with "standardized" procedures (primal and dual) for transforming a known scalar risk measure into a set-valued one. In many cases, there is more than one generalization (more or less risk averse) of a scalar risk measure. We shall present set-valued counterparts for the (negative) expectation, VaR and AVaR, the (negative) essential infimum and the entropy measure.
Erika Hausenblas - "Existence, Uniqueness and Regularity of Parabolic SPDEs Driven by Poisson Random Measure":
The topic of the talk is stochastic evolution equations driven by a Poisson random measure, based on my works 1, 2 and 3. First, the Poisson random measure will be introduced. Then stochastic evolution equations (SPDEs for short) driven by a Poisson random measure will be discussed. Here, existence and uniqueness results will be presented, and integrability properties and the càdlàg property will be outlined. We will close with some typical examples of such SPDEs.
Martin Hillebrand - "Dynamic Loss Modeling for Heterogeneous Credit Portfolio":
Extant models for portfolio losses with heterogeneous default rates and heterogeneous exposure sizes have an important shortcoming, namely that they require computationally expensive Monte Carlo simulations. CreditRisk+ is a notable exception that allows for computation of the loss distribution analytically. With moderate restrictions on dependency modelling, it uses generating functions to compute the loss probabilities quickly and accurately. However, this advantage is overshadowed by the fact that it is a static single period model. This is a major drawback when working with portfolio exposures having different maturities and when pricing instruments where the term structure of default rates matters. The framework proposed here incorporates time-varying default rates and volatilities that may differ across names as well. Equally important is the fact that this framework does not require exposure banding (as CreditRisk+ does). The evolution of the loss distribution with time can be modelled using Cox-Ingersoll-Ross processes as latent macroeconomic processes driving the dynamic default intensities. The characteristic function of the credit portfolio loss can be obtained explicitly. Using the Fast Fourier Transform it can be inverted to obtain the portfolio loss distribution in a numerically stable manner. It may be possible to extend the setup we propose to incorporate stochastic LGD and EAD; this is the focus of ongoing work. (Joint talk with Ashay Kadam)
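As an illustration of the Fourier-inversion step mentioned above, here is a minimal sketch (not the authors' model): the loss is assumed to live on an integer grid of exposure units, the characteristic function is taken to be that of a simple compound Poisson loss with an assumed intensity and severity distribution, and the loss distribution is recovered with an FFT.

```python
# Minimal sketch: recover a loss distribution from its characteristic function
# via FFT, for a hypothetical compound Poisson loss on an integer grid of
# exposure units (the dynamic CIR-driven model of the talk is not reproduced).
import numpy as np

N = 1024                       # size of the loss grid 0, 1, ..., N-1
lam = 5.0                      # Poisson intensity of loss events (assumed)
sev = np.zeros(N)              # severity distribution on the grid (assumed)
sev[1], sev[2], sev[5] = 0.5, 0.3, 0.2

# characteristic function of the severity on the Fourier grid u_j = 2*pi*j/N
phi_sev = np.fft.fft(sev).conj()
# compound Poisson characteristic function of the total loss
phi_loss = np.exp(lam * (phi_sev - 1.0))

pmf = np.real(np.fft.fft(phi_loss)) / N     # invert: P(L = k), k = 0..N-1
tail = pmf[::-1].cumsum()[::-1]             # P(L >= k)

print(pmf[:8].round(4), tail[20].round(4))
```

The same inversion step applies whenever the loss characteristic function is available in closed form, which is what makes the FFT route numerically stable compared with simulation.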
Xinzheng Huang - "Adaptive Integration for Multi-Factor Portfolio Credit Loss Model":
We propose adaptive integration algorithms for the calculation of the tail probability in multi-factor credit portfolio loss models. We first employ the Genz-Malik rule, a deterministic multiple integration rule suitable for portfolio credit models with fewer than 10 factors. We then turn to adaptive Monte Carlo integration, which simply replaces the deterministic integration rule by pseudo-random numbers. The latter not only can handle higher-dimensional models but also is able to provide reliable probabilistic error bounds. Both algorithms are asymptotically convergent and consistently outperform the plain Monte Carlo method.
Bogdan Iftimie - "Asymptotic Behaviour of Piece-Wise Continuous Solutions of S.D.E.":
Stochastic differential equations with jumps are considered and the analysis is focussed on describing the weak limit set provided a Lyapunov exponent is used and the continuous component is asymptotically stable in mean square.
Jens Jackwerth - "Are Options on Index Futures Profitable for Risk Averse Investors?":
American call and put options on the S&P 500 index futures that violate the stochastic dominance bounds of Constantinides and Perrakis (2007) are identified as potentially profitable investment opportunities. In out-of-sample tests over 1983-2006, trading strategies that exploit these violations are shown to increase the expected utility of any risk averse investor, net of transaction costs and bid-ask spreads.
Ying Jiao - "Dynamical Modelling of Successive Defaults":
We propose a new approach, based on the density process of the F-conditional survival probabilities, where F represents some background filtration, to study successive defaults in a dynamical way. We emphasize the "after-default" case and the necessity to model the density process a(t,u) for any positive t,u, particularly for u<t. We also explain the relationship between the density process and the classical notion of the intensity process. In this framework, we are able to calculate all G-conditional expectations, notably on the after-default event, where G is the global market filtration which contains the default information and is strictly larger than F. Furthermore, the framework can be extended naturally to successive defaults in a systematic and recursive way. The difficulty related to filtrations with jumps can be overcome, and the problem is reduced to modelling a joint density process with respect to the filtration F, which we can suppose to possess "good" regularity properties. The contagious default phenomenon can be explained intrinsically from the conditional dependence among credits on the filtration F (similar ideas appeared in Schoenbucher and Schubert (2001)). Finally, we apply this approach to two main types of credit portfolio derivatives, kth-to-default basket swaps and CDOs. In fact, for both products, it suffices to study ordered defaults. (Joint work with N. El Karoui and M. Jeanblanc)
Ashay Kadam - "Dynamic Loss Modeling for Heterogeneous Credit Portfolio":
Extant models for portfolio losses with heterogeneous default rates and heterogeneous exposure sizes have an important shortcoming, namely that they require computationally expensive Monte Carlo simulations. CreditRisk+ is a notable exception that allows for computation of the loss distribution analytically. With moderate restrictions on dependency modelling, it uses generating functions to compute the loss probabilities quickly and accurately. However, this advantage is overshadowed by the fact that it is a static single period model. This is a major drawback when working with portfolio exposures having different maturities and when pricing instruments where the term structure of default rates matters. The framework proposed here incorporates time-varying default rates and volatilities that may differ across names as well. Equally important is the fact that this framework does not require exposure banding (as CreditRisk+ does). The evolution of the loss distribution with time can be modelled using Cox-Ingersoll-Ross processes as latent macroeconomic processes driving the dynamic default intensities. The characteristic function of the credit portfolio loss can be obtained explicitly. Using the Fast Fourier Transform it can be inverted to obtain the portfolio loss distribution in a numerically stable manner. It may be possible to extend the setup we propose to incorporate stochastic LGD and EAD; this is the focus of ongoing work. (Joint talk with Martin Hillebrand)
Linus Kaisajuntti - "An N-Dimensional Markov-Functional Model":
This paper develops an n-dimensional LIBOR Markov-functional interest rate model using in effect the same techniques as for lower-dimensional Markov-functional models, but under a slightly different setup. This means formulating the model using forward induction under the Spot measure instead of backward induction under the Terminal measure, and using the Monte Carlo method instead of more efficient numerical integration and lattice methods. However, despite the use of the Monte Carlo method, it turns out that the proposed n-dimensional Markov-functional model is significantly more efficient than its LIBOR market model counterpart and is very well suited for certain types of path-dependent derivatives. Moreover, the n-dimensional Markov-functional model provides a powerful framework for analysing Markov-functional models. Complementing Bennet & Kennedy (2005), who show that one-factor LIBOR market models and one-dimensional Markov-functional models are very similar, we perform tests comparing the n-dimensional versions over a variety of market conditions. The tests confirm major similarities and we argue that the intuition gained from the LIBOR market model SDE will always be applicable, irrespective of dimension.
Stefanie Kammer - "Credit Spread Volatility Under a First Passage Time Model":
A credit default swap (CDS) offers protection against default of a reference entity. For this protection, the buyer regularly pays an insurance fee, the credit spread, but only as long as no default has happened. In case of default before contract maturity, the protection seller pays the agreed claim amount to the protection buyer. By now CDSs are the most liquid credit-risky market instruments. Considering CDS contracts with various times to maturity leads to a whole credit spread term structure. Credit spread curves vary stochastically over time in level and shape. A model that reflects credit spread dynamics is crucial for pricing any derivatives on credit spreads (such as credit baskets, credit spread options and CDOs), especially when considering longer maturities. Up to now credit spread models are adapted to the market's credit spread term structure only, but do not integrate credit spread volatility. For example, the deterministic time change model by Overbeck & Schmidt [1] can perfectly fit the credit spread curve, but contradicts empirical behavior of credit spread volatility. Our concern is not a perfect fit of today's market curve, but a model that can be adapted to both the credit spread term structure and the credit spread volatility. Within a structural approach we consider the general class of stochastic time change models where Brownian motion is time changed by an absolutely continuous process. We derive an analytical first passage time distribution (FPTD) for the one-dimensional model and also for the two-dimensional model under additional asset correlation, by applying a result of Zhou [2]. For the multi-dimensional model a FPTD is obtained under a simpler dependence structure arising from the same time change. Using our analytical FPTD we can derive credit spread dynamics via Itô's rule. Our joint and multivariate default probabilities can be applied for pricing credit derivatives that depend on several names. FPTDs are also important in other applications such as barrier option pricing. We provide an example of a specific time change model and show how it can be calibrated to the credit spread term structure and the credit spread volatility.
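For orientation, the classical building block underlying such first passage time models is the distribution of the first hitting time of a constant level $b > 0$ by a drifted Brownian motion $X_t = \mu t + W_t$ (the time-changed and multivariate extensions discussed in the talk generalize this baseline):

\[
\mathbb{P}\big(\tau_b \le t\big)
\;=\;
\Phi\!\left(\frac{\mu t - b}{\sqrt{t}}\right)
\;+\;
e^{2\mu b}\,\Phi\!\left(\frac{-\mu t - b}{\sqrt{t}}\right),
\qquad
\tau_b = \inf\{t \ge 0 : X_t \ge b\},
\]

where $\Phi$ denotes the standard normal distribution function.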
Ioannis Karatzas - "Stochastic Portfolio Theory: A Survey":
We shall present an overview of Stochastic Portfolio Theory, a rich framework for analyzing portfolio behavior and equity market structure. This theory was developed in the book by Fernholz (Springer, 2002) and the articles Fernholz (JME, 1999), Fernholz, Karatzas and Kardaras (Finance & Stochastics, 2005), Fernholz and Karatzas (Annals of Finance, 2005). It is descriptive as opposed to normative, is consistent with observable characteristics of actual portfolios, and provides an important theoretical tool for practical applications.
As a theoretical tool, this framework provides fresh insights into questions of market equilibrium and arbitrage, and can be used to construct portfolios with controlled behavior. As a practical tool, it has been applied to the analysis and optimization of portfolio performance and has been the basis of very successful investment strategies at the institutional portfolio management firm INTECH for close to 20 years.
Topics to be covered include: Growth Rates, Functionally-Generated Portfolios, Diversity and Intrinsic Volatility, Relative Arbitrage, Growth Optimal Portfolios, Rank-based Equity Market Structure (download the full paper [PDF])
Rüdiger Kiesel - "Pricing Forward Contracts in Power Markets by the Certainty Equivalence Principle: Explaining the Sign of the Market Risk Premium":
In this paper we provide a framework that explains how the market risk premium, defined as the difference between forward prices and spot forecasts, depends on the risk preferences of market players. In commodities markets this premium is an important indicator of the behaviour of buyers and sellers and of their views on the market across short-term and long-term horizons. We show that under certain assumptions it is possible to derive explicit solutions that link levels of risk aversion and market power with market prices of risk and the market risk premium. The paper is available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=941117
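In symbols, with $F(t,T)$ the forward price at time $t$ for delivery at $T$ and $\mathbb{E}^{\mathbb{P}}_t[S_T]$ the spot forecast under the real-world measure, the market risk premium studied in the paper is

\[
\pi(t,T) \;=\; F(t,T) \;-\; \mathbb{E}^{\mathbb{P}}_t\big[S_T\big],
\]

and the paper explains how the sign of $\pi(t,T)$ is determined by the risk preferences and market power of buyers and sellers.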
Rolf Klaas - "A Structural Multi Issuer Credit Risk Model Based on Square Root Processes":
In this first passage time approach the ability to pay process of an obligor is composed of two independent basic affine processes without jump component. This is exemplified for the squared Bessel processes. The default barrier is defined to be equal to zero. In order to introduce dependence between different obligors, similar to the Basel II one factor model, one process is common to all obligors belonging to the same industry sector, whereas the second process represents the idiosyncratic component. The common process is a stopped process, where the associated stopping time is the first hitting time of zero. Since the default time is the first time the credit quality process hits zero, a default can only happen after the common process has reached zero. The default probabilities, joint default probabilities and default correlations are compared to other multivariate credit risk models like Gaussian copula models. Extensions to a multi-factor model are discussed.
Irene Klein - "Market Free Lunch and Large Financial Markets":
Frittelli (2004) introduced a notion of market free lunch depending on the preferences of investors. I will show the relation to the classical no free lunch condition of Kreps (1981) using the theory of Orlicz spaces. Moreover I will define an asymptotic version of no market free lunch on a large financial market which turns out to be equivalent to no asymptotic free lunch. Further, one can show directly that no asymptotic market free lunch is equivalent to the existence of an equivalent (local-/sigma-) martingale measure for the large financial market.
Claudia Klüppelberg - "The Continuous-Time GARCH Model":
We introduce a continuous-time GARCH [COGARCH(1,1)] model which, driven by a single Lévy noise process, exhibits the same second order properties as the discrete-time GARCH(1,1) model. Moreover, the COGARCH(1,1) model has heavy tails and clusters in the extremes. The second order structure of the COGARCH(1,1) model allows for some estimation procedure based on the ARMA(1,1) autocorrelation structure of the model and other moments. The model can be fitted to high-frequency data, and the statistical analysis also provides an estimator for the volatility.
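For reference, the COGARCH(1,1) dynamics driven by a single Lévy process $L$ are usually written as follows (parameters $\beta, \eta, \varphi > 0$; quoted here for orientation in the standard form of the COGARCH literature):

\[
dG_t \;=\; \sigma_{t-}\, dL_t,
\qquad
d\sigma_t^2 \;=\; \big(\beta - \eta\, \sigma_{t-}^2\big)\, dt \;+\; \varphi\, \sigma_{t-}^2\, d[L,L]^{(d)}_t ,
\]

where $[L,L]^{(d)}$ is the discrete (jump) part of the quadratic variation of $L$, so that the volatility jumps exactly when the driving Lévy process does.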
Przemyslaw Klusik - "Optimal Strategy for An Investor with Access to a Stream of Extra Information":
We consider financial market models with two agents on different information levels: a regular agent whose information is contained in the natural filtration of the price process, and an insider who additionally has access to a stream of extra information. By means of Malliavin calculus we construct the strategy maximizing the expected utility of the insider. The mathematical results obtained are illustrated by computer simulations.
Florian Kramer - "Risk and Valuation of Mortality Contingent Catastrophe Bonds":
Catastrophe Mortality Bonds are a recent capital market innovation providing insurers and reinsurers with the possibility to transfer catastrophe mortality risk off their balance sheets to capital markets. We introduce a time-continuous model for analyzing and pricing catastrophe mortality contingent claims based on stochastic modeling of the force of mortality. The model consists of two components: a baseline component governed by a diffusion reflecting the "regular" fluctuations of mortality over time, and a catastrophe component modeled by a non-Gaussian Ornstein-Uhlenbeck process representing catastrophic events. Historical and risk-adjusted parameterizations of the proposed model based on three different calibration procedures are derived. The resulting loss profiles and prices are compared to loss profiles provided by the issuers and to market prices, respectively. We find that the profiles are subject to great uncertainties and should hence be considered with care by investors and rating agencies. (Joint work with Daniel Bauer)
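In symbols, the two-component structure described above can be sketched as (notation introduced here purely for illustration):

\[
\mu_t \;=\; \mu^{\mathrm{base}}_t \;+\; Y_t,
\qquad
dY_t \;=\; -\lambda\, Y_t\, dt \;+\; dJ_t ,
\]

where $\mu^{\mathrm{base}}$ is the diffusion-driven baseline force of mortality and $Y$ is a non-Gaussian Ornstein-Uhlenbeck process driven by a pure-jump subordinator $J$, whose jumps represent catastrophic mortality events that subsequently decay at rate $\lambda$.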
Dmitry Kramkov - "A Model for a Large Investor Trading At Market Indifference Prices":
We present an equilibrium-based model for a large economic agent who trades with the market makers at their utility indifference prices. We perform an asymptotic analysis of this model for the case of a "small" investor and derive as a result a liquidity correction to the prices of non-traded derivatives. The presentation is based on a joint project with Peter Bank.
Christoph Kühn - "Illiquid Financial Markets and Nonlinear Stochastic Integrals":
A large investor is somebody whose trades move market prices significantly. Put differently, he is faced with an illiquid financial market. Whereas in the standard liquid market model trading gains are modelled by linear stochastic integrals, in the more general illiquid case nonlinear stochastic integrals are required. We discuss the construction of nonlinear stochastic Itô-integrals w.r.t. a family of semimartingales which depend on a spatial parameter. In particular, we are interested in the case where the dependence on the spatial parameter may be discontinuous. We investigate under which conditions a nonlinear integral can be approximated by nonlinear integrals with piecewise constant integrands. This takes us beyond the case where any integral can be approximated by integrals with integrands taking only finitely many values. Furthermore, we discuss the large investor's utility maximization problem and compare its solution with the optimal strategy for small investors.
Alexander Kulikov - "Multidimensional Coherent and Convex Risk Measures":
First the notion of coherent risk measure was introduced in the landmark
paper [1] by Artzner, Delbaen, Eber and Heath. Since those papers, the
theory of coherent risk measures has been evolving rapidly. But only one-dimensional risk measures are under consideration, i.e. They measure risk
of one-dimensional random variables, which from the financial point of view
are the prices of portfolios in base currency. This approach is valid if we
have such currency. However, it is not valid, for example, when we describe
the portfolio consisting of some currencies, because there is no "canonical"
currency in such a case. In this case it is more natural to use multidimensional
approach given by Kabanov in [3]. The notion of multidimensional coherent
risk measure was introduced in [2] by Jouini, Meddeb, Touzi. Their approach
aims to take into account transactional costs while exchanging one currency
to another. But in their model transactional costs are not random. So they
do not take into account risk connected with changing of currency exchange
rates that is one of the most important risks nowadays.
Here we introduce the notion of multidimensional coherent risk measure
which takes into account this type of risks. This approach is similar to the
approach considered in the paper [2], but the matrix of currency exchange
rates is random. Besides the task of risk measurement, the task of the allocation of risk between some parts of portfolio is also very important. This
problem is closely connected with the problem of risk contribution. Also
we give the solution of these problems in terms of multidimensional coherent
risk measures. The solution is determined via the notion of multidimensional
extreme element.
References:
[1] P. Artzner, F. Delbaen, J.-M. Eber, D. Heath. Thinking coherently.
Risk, 10 (1997), No. 11, p. 68-71.
[2] E. Jouini, M. Meddeb, N. Touzi. Vector-valued coherent risk measures.;
Finance and Stochastics, 8 (2004), p. 531-552.;
[3] Yu.M. Kabanov. Hedging and liquidation under transaction costs in currency markets. Finance and Stochastics, 3 (1999), No. 2, p. 237-248.
Damien Lamberton - "Optimal Stopping Problems with Irregular Payoff Functions":
We consider an optimal stopping problem with finite horizon for a general one-dimensional diffusion, without any regularity assumption on the payoff function. We prove that the value function is continuous and can be characterized as the unique solution of the variational inequality in a weak sense. This talk is based on joint work with M. Zervos.
Kasper Larsen - "Continuity of Utility-Maximization with Respect to Preferences":
This paper provides an easily verifiable regularity condition under which the investor's utility maximizer depends continuously on the description of her preferences in a general incomplete financial setting. Specifically, we extend the setting of Jouini and Napp (2004) to
1) noise generated by a general continuous semi-martingale and
2) the case where the market price of risk is allowed to be a general adapted process satisfying a mild integrability condition. This extension allows us to obtain positive results for both the mean-reversion model of Kim & Omberg (1996) and the stochastic volatility model of Heston (1993). Finally, we provide an example set in Samuelson's complete financial model illustrating that without imposing additional regularity, the continuity property of the investor's optimizer can fail.
Peter Laurence - "Hedging and Pricing of Generalized Spread Options and the Market Implied Comonotonicity Gap":
In two papers, joint with Tai-Ho Wang, we provide optimal superreplicating strategies for generalized spread options. These are basket options with weights that can be either positive or negative. We also provide optimal subreplicating strategies. These lead to a new trading strategy, based on the difference between the market implied co- or antimonotonic price and the traded price.
Semyon Malamud - "A Unified Approach to Market Incompleteness":
Until now, the principal obstacle to "getting one's hands on" the equilibria for incomplete markets was a "concrete" solution to the utility maximization problem for a single agent. We have discovered a general method for constructing explicit solutions to a wide variety of optimization problems that are associated with a general class of incomplete markets. Effectively, our class characterizes the macroeconomic and finance models of incomplete markets that satisfy the Keynes axiom of decreasing marginal propensity to consume. It includes all classic incomplete market models. Our construction is the first new ingredient that makes it possible to extract detailed economic information from the equilibria of our models. We directly apply our method to disprove several well known macroeconomic conjectures. Our method is also perfectly suited for utility indifference pricing. We explicitly calculate the derivatives of the utility indifference price, analyze its asymptotic behavior for large/small payoffs and establish sharp global estimates for the price.
Gabriel Maresch - POSTER "Optimality and Monotonicity in the Monge-Kantorovich Optimal Transportation Problem":
It is known that for cost functions which are either lower semi-continuous and finite or continuous and possibly infinite, transport plans which are concentrated on a c-cyclically monotone set are automatically optimal. We present related results which hold in a more general context, namely for Borel measurable cost functions. Concepts similar to c-cyclical monotonicity, such as robust optimality and strong c-monotonicity, are explored further and given an economic interpretation. (Joint work with M. Beiglboeck, M. Goldstern and W. Schachermayer, TU Vienna)
Jan Maruhn - "Robustifying Static Hedges for Barrier Options Against Dynamics of the Volatility Surface":
Since static hedge portfolios for barrier options consist of several standard options, these portfolios are strongly exposed to movements of the volatility surface over time. In this talk we present a new static super-replication approach which leads to robust portfolios guaranteeing the hedge performance for an infinite number of future volatility surface scenarios. As it turns out, the hedge can be computed by solving a suitable semi-infinite optimization problem. After proving existence, convergence and duality results, we apply the method to real world data and obtain market-typical super-hedges with surprisingly low cost.
Koichi Matsumoto - "Mean-Variance Hedging in an Illiquid Market":
We study the hedging problem of a contingent claim in a discrete time model, where the contingent claim is hedged by one illiquid risky asset. In our model, the investor cannot always trade the illiquid asset, though he can always observe or estimate its price. In other words, the trade times are not only discrete but also random. Our model is a kind of random trade time model as studied by Rogers and Zane (2002) and Matsumoto (2003, 2006). In this setting perfect hedging is difficult, and we therefore measure the hedging error by a quadratic criterion. An outline of our study is as follows. First we fix the initial conditions; we show the existence of the optimal hedging strategy and give it as a recursive formula. Secondly we consider the optimization of the initial conditions. In the illiquid market it is not easy to change the portfolio, and hence the initial conditions are important. We express the optimal initial conditions simply, using a signed measure. Finally we consider a one-period binomial model by way of example.
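In symbols, the quadratic criterion mentioned above is of mean-variance hedging type (notation introduced here for illustration):

\[
\min_{c,\,\vartheta}\; \mathbb{E}\Big[\big(H - V_T(c,\vartheta)\big)^{2}\Big],
\]

where $H$ is the contingent claim and $V_T(c,\vartheta)$ is the terminal wealth generated from initial capital $c$ by a strategy $\vartheta$ that is only allowed to rebalance at the random trade times.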
Philipp Mayer - "Stable Calibration Methods for Financial Market Models of Local Lévy Type":
In the last decade many research activities were undertaken to find stable methods for calibrating financial market models to an observed option price surface. In particular the calibration of the Dupire model attracted much attention, as it is able to fit the marginals of any Itô process. The problem of fitting the local volatility function to the option surface is ill-posed and hence some effort was made to regularize this problem (e.g. using Tikhonov regularization as in Crepey [3] or Egger & Engl [4]). As the Dupire model is known to perform poorly when it is used for the pricing of path-dependent options, in recent years more general models with a similar degree of variability were introduced in the literature (for instance the local Lévy model proposed by Carr et al. [2] or a local volatility model with jumps as considered by Andersen & Andreasen in [1]). One advantage of such models is, for example, that skewed log-returns can also be introduced via the jump term (instead of solely by means of the local volatility function). However, the calibration of these models requires, amongst other things, the identification of generalized “local volatility” and so-called “local speed” functions. In this talk we present a non-parametric stable calibration method based on Tikhonov regularization for such generalized Lévy market models. While the original calibration problem is more ill-posed than the Dupire calibration problem, we are able to prove stability and convergence of the regularized problem, and in some cases convergence rates can be derived under the common assumption of an abstract source condition. Finally we underpin the theoretical results by numerical illustrations.
Huseyin Merdan - "Asset Price Dynamics with Heterogeneous Groups":
This talk presents a study of the price dynamics of an asset under various conditions by using a system of ordinary differential equations. One of these conditions involves the introduction of new information that is interpreted differently by two groups. Another concerns the price change due to a change in the number of shares. We examine the steady state under these conditions to determine the changes in the price due to these phenomena. Numerical studies are also presented to illustrate the transition between the regimes. The differential equations naturally incorporate the effects due to the finiteness of assets (rather than assuming unbounded arbitrage) in addition to investment strategies that are based on either price momentum (trend) or valuation considerations.
Amal Merhi - "Irreversible Capacity Expansion with Proportional and Fixed Costs":
We consider the problem of determining the optimal capacity expansion strategy that a firm operating within a random economic environment should adopt. We model market uncertainty by means of a geometric Brownian motion. The objective is to maximise a performance criterion that involves a general running payoff function and associates a fixed and a proportional cost with each capacity increase. The resulting optimisation problem takes the form of a two-dimensional impulse control problem that we explicitly solve.
Yoshio Miyahara - "Option Pricing Based on the Geometric Stable Processes and the Minimal Entropy Martingale Measures":
Geometric stable processes have attracted attention as models for underlying asset price processes with strong fat-tail properties, and have been studied by many researchers. In this paper we study the GSP (geometric stable process) & MEMM (Minimal Entropy Martingale Measure) pricing models. We see that by adopting the MEMM as the martingale measure we can construct option pricing models based on stable processes in general form, and that these models reproduce both the volatility smile and the volatility skew. Next we apply this model to currency options and carry out an empirical analysis, finding that the model fits the market prices of currency options very well.
Johannes Muhle-Karbe - "Portfolio Optimization Under Transaction Costs":
We consider the problem of maximizing expected logarithmic utility from consumption in a Black-Scholes market with proportional transaction costs. A solution to this problem has been obtained by Davis and Norman (1990) using methods from stochastic control theory. Similar arguments are also used in most of the other work that aims for numerically tractable solutions in this field. We present a different approach here. By using a "shadow price process" instead of the original one, we show how this problem can be reformulated as an optimal consumption problem in a frictionless market. (Joint work with Jan Kallsen)
Agatha Murgoci - "Vulnerable Options and Good Deal Bounds - Structural Model":
We price vulnerable options - i.e. options where the counterparty may default. The main reason for counterparty risk is the fact that these options are traded over-the-counter (OTC). According to the BIS, the OTC equity-linked option gross market value in the first half of 2006 was USD 6.8 tln. While previous literature models vulnerable options in complete markets, we notice that this is a case of market incompleteness. We streamline earlier results in complete markets and, in order to price in incomplete markets, we employ the technique of good deal bounds as developed by Björk-Slinko (2005). We model default in a structural framework and obtain closed-form solutions for the pricing bounds. We also extend the results for European calls to other options with homogeneously linear payoffs, such as exchange options. The price bounds obtained are much tighter than the no-arbitrage bounds.
Marek Musiela - "Implied Preferences and Bespoke Portfolios":
We consider an investment problem with a performance criterion which combines the investor's preferences with market-related inputs. Consequently, the optimal portfolio generates a wealth process which contains implicit information about the preferences. In this paper we show how to learn about these preferences by analysing the properties of the optimal wealth process. For example, we show, under the assumption of a deterministic market price of risk, that the specification of the mean of the optimal wealth process determines the investor's preferences and implicitly his bespoke portfolio. (Joint work with Thaleia Zariphopoulou).
Christina Niethammer - "On Q-Optimal Signed Martingale Measures in Exponential Lévy Models":
We give a sufficient condition to identify the q-optimal signed martingale measures in exponential Lévy models. As a consequence we find that the q-optimal signed martingale measures can be equivalent only if the tails for upward jumps are extraordinarily light. Moreover, we derive convergence of the q-optimal signed martingale measures to the minimal entropy martingale measure as q approaches one. Finally, some implications for portfolio optimization are discussed.
Jan Obloj - "Completing the Market Using Options: Necessary and Sufficient Conditions":
We consider the question of market completeness when we can use for hedging the underlying together with some (liquidly traded) options. We follow the general approach of Romano and Touzi (1997) and Davis (2004). We assume a general multidimensional diffusion model (in particular a stochastic volatility model). We do not specify the dynamics of options prices exogenously but assume we work under the pricing measure so that assets' prices are given as discounted conditional expectations of their payoffs. We are interested in the following question: can we achieve any contingent claim as a final value of a trading strategy in the underlying and the given set of liquidly traded options? We give a necessary and sufficient condition for this to hold which generalizes upon sufficient conditions given in the previous works. We then extend this to jump-diffusion models. (Joint work with Mark Davis.)
Bernt Øksendal - "Optimal Portfolio for an Insider in a Strategic Market Equilibrium":
We study a market model which is a generalization of the insider equilibrium models of Kyle and Back. We use filtering theory, anticipative stochastic calculus (forward integrals) and stochastic control to find the optimal portfolio for an insider in a market where the action of the insider influences the price. The presentation is based on joint work with Knut Aase and Terje Bjuland.
György Ottucsák - "Principal Component and Constantly Rebalanced Portfolio":
A class of portfolio selection methods, the so-called log-optimal Constantly Rebalanced Portfolio (CRP), is considered in a discrete-time model of sequential investments, which means that at the beginning of each trading period the capital of the investor is distributed among the assets according to a fixed portfolio vector. Besides the empirical analysis of the well-known best CRP (BCRP) (obtained by assuming perfect knowledge of future stock prices) on 44-year-long New York Stock Exchange (benchmark) data sets, two other CRPs are introduced and empirically analyzed: the causal CRP (CCRP), which recalculates its portfolio at the beginning of each trading period, and a variant of the log-optimal BCRP which has smaller computational complexity and is related to the Markowitz CRP, where the expected return is maximized for a given level of risk. To compute these strategies, feasible algorithms based on quadratic programming and gradient descent are given. Some materials related to the log-optimal phenomena and to the benchmark data sets are available at http://www.szit.bme.hu/~oti/portfolio.
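As a small illustration of the CRP idea (a hedged sketch, not the authors' algorithms or NYSE data), the code below evaluates the terminal wealth of a fixed-mix portfolio on synthetic price relatives and locates the best constant mix for two assets by a simple grid search.

import numpy as np

rng = np.random.default_rng(0)
T = 1000
# Hypothetical price relatives (gross one-period returns) for two assets.
x = np.column_stack([np.exp(rng.normal(0.0002, 0.01, T)),
                     np.exp(rng.normal(0.0004, 0.02, T))])

def crp_log_wealth(b, x):
    """Log terminal wealth of a constantly rebalanced portfolio with weight vector b."""
    return np.sum(np.log(x @ b))

# Grid search over the simplex for the best CRP in hindsight (BCRP) with two assets.
grid = np.linspace(0.0, 1.0, 101)
log_w = [crp_log_wealth(np.array([w, 1.0 - w]), x) for w in grid]
w_star = grid[int(np.argmax(log_w))]
print("best constant weight on asset 1:", w_star, "terminal wealth:", np.exp(max(log_w)))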
Ludger Overbeck - "Risk Measures for Structured Credit Products":
In the paper we first present the basic mathematical modeling features of Collateralized Debt Obligations. Here we concentrate on synthetic products which are based on Credit Default Swaps. Then the standard "Greeks" are examined. Different from most standard option products, the Gamma is not very informative. The non-linearity is usually measured in terms of gap- and jump-to-default risk. We present some analytic recursion methods to calculate these figures. The main focus of the paper, however, is the portfolio-dependent risk measures, like spread and risk contributions. These measures are nowadays well known and easy to understand in linear portfolios. Portfolios of CDOs exhibit a non-linear structure, but we will show how to overcome these difficulties, in particular in the context of spread contributions. As a final topic we will show that spectral risk measures are more appropriate for the risk analysis of portfolios of CDOs than measures analyzing only a specific part of the loss distribution, like Value-at-Risk and Expected Shortfall. In particular, the role of the risk aversion weight function becomes transparent.
Natalie Packham - "Modelling Credit-Spread Dynamics in a Hitting-Time Model":
Standard hazard rate models for credit default times capture jumps to default, but in general, we expect a credit to deteriorate over time before it defaults. This should be reflected in a credit model, as, for instance, the valuation of some credit instruments depends on jumps of the underlying credit spread (an example is the Leveraged Credit-Linked Note). Furthermore, the valuation of options on credit derivatives requires the specification of the underlying credit derivative's dynamics through time. We present a tractable hitting-time model that is based on a Wiener process with a stochastic time change. The time change is a Lévy-driven Ornstein-Uhlenbeck process, which reflects the observation that credit spreads move up sharply and then tend back gradually over time. We present an efficient technique for computing default probabilities and conditional default probabilities numerically. The calibration to market-given data is investigated with a particular focus on the economic interpretation of the parameters involved (Joint work with Lutz Schlögl, Quantitative Credit Research, Lehman Brothers).
Mikko Pakkanen - POSTER "A Functional Limit Theorem for a Marked Point Process Model of Asset Price Fluctuations":
We propose an agent-based financial market model in continuous time, in which actions of individual agents are modeled using marked point processes. The price formation mechanism in the model is built on the microeconomic equilibrium framework of Föllmer and Schweizer (1993), however taking into account the asynchronous nature of trading driven by the marked point processes. In a special case of this model, with the market consisting of fundamentalists and noise traders, we show that price fluctuations converge in distribution to a diffusion process as the number of agents in the market tends to infinity.
Jan Palczewski - "On the Wealth Dynamics of Self-financing Portfolios under Endogenous Prices":
In this paper we study market selection and survival of self-financing trading strategies in a continuous-time market model with endogenous prices. This model builds on the common continuous-time model in mathematical finance. The prices, however, are derived from the demand and supply of traders, whose decisions are influenced by an exogenous dividend process. Our approach promotes an evolutionary point of view that abstracts from utility functions. It borrows from the common knowledge in economic theory that market pressures eventually select those traders who are better adapted to the prevailing conditions. The main result is on the survival of trading strategies. We show that there is a single surviving trading strategy among all constant strategies. Traders following this strategy eventually accumulate all the wealth of the market. Our findings are surprising in view of earlier studies of related models in discrete time, as our model lacks exponential stability and cannot be treated by the well-developed theory of Stochastic Dynamical Systems. This paper is a joint work with Jesper Lund Pedersen (Copenhagen) and Klaus Reiner Schenk-Hoppé (Leeds).
Antonis Papapantoleon - "On the Duality Principle in Option Pricing: Semimartingales and Lévy Processes":
The duality principle states that the calculation of the price of a call option in a model with price process S=e^H (w.r.t. a measure P) is equivalent to the calculation of the price of a put option in a dual model S'=e^H' (w.r.t. a dual measure P'). We develop the appropriate mathematical tools for the study of the duality principle in a general semimartingale setting. We consider both uni- and multi-dimensional semimartingale models, thus covering the case of options on several assets as well. A number of more sophisticated duality results are derived for a broad spectrum of exotic options. The duality principle demonstrates its full strength for these options as, in several cases, it allows one to reduce a problem involving joint distributions to a univariate problem. Particular cases which are studied are models driven by Lévy processes. Time permitting, we will also sketch valuation methods for exotic options in Lévy models. The talk is based on joint work with Ernst Eberlein and Albert N. Shiryaev.
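A minimal numerical illustration of the duality principle in the simplest (Black-Scholes) special case, where it reduces to the classical put-call duality: the call price with spot S0, strike K and rates (r, q) equals the put price with spot K, strike S0 and the roles of r and q interchanged. This sketch only checks that elementary identity; it is not the general semimartingale machinery of the talk.

import math
from statistics import NormalDist

N = NormalDist().cdf

def bs_price(S, K, r, q, sigma, T, call=True):
    """Black-Scholes price with continuous dividend yield q."""
    d1 = (math.log(S / K) + (r - q + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    if call:
        return S * math.exp(-q * T) * N(d1) - K * math.exp(-r * T) * N(d2)
    return K * math.exp(-r * T) * N(-d2) - S * math.exp(-q * T) * N(-d1)

S0, K, r, q, sigma, T = 100.0, 95.0, 0.03, 0.01, 0.25, 1.0
call = bs_price(S0, K, r, q, sigma, T, call=True)
dual_put = bs_price(K, S0, q, r, sigma, T, call=False)   # spot/strike and r/q swapped
print(call, dual_put)                                    # the two numbers agree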
Jostein Paulsen - "Optimal Dividend Payments and Reinvestments of Diffusion Processes with Both Fixed and Proportional Costs":
Assets are assumed to follow a diffusion process subject to some conditions. The owners can pay dividends at their discretion, but whenever assets reach zero they have to reinvest money so that assets never go negative. With each dividend payment there is a fixed and a proportional cost, and likewise with reinvestments. The goal is to maximize the expected value of the discounted net cash flow, i.e. dividends paid minus reinvestments. It is shown that there can be two different solutions depending on the model parameters and the costs.
1. Whenever assets reach a barrier they are reduced by a fixed amount through a dividend payment, and whenever they reach 0 they are increased by a fixed amount by a reinvestment.
2. There is no optimal policy, but the value function is approximated by policies of the form described in Item 1 for increasing barriers. We provide criteria to decide whether an optimal solution exists, and when not, show how to calculate the value function. It is discussed how the problem can be solved numerically and numerical examples are given.
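The value of a policy of the type described in Item 1 can be estimated by simulation. The sketch below is a hedged illustration with hypothetical parameters, not the authors' solution: a barrier rule is applied to an arithmetic Brownian motion, a lump dividend is paid when assets hit the barrier, a lump reinvestment is made when they hit zero, and the net cash flows after fixed and proportional costs are discounted.

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, r = 0.5, 1.0, 0.05            # hypothetical drift, volatility and discount rate
barrier, div, reinv = 3.0, 1.5, 1.0      # pay 'div' at the barrier, inject 'reinv' at zero
k_d, c_d = 0.1, 0.05                     # fixed and proportional dividend costs
k_r, c_r = 0.2, 0.10                     # fixed and proportional reinvestment costs
dt, T, n_paths = 0.01, 20.0, 500

values = []
for _ in range(n_paths):
    x, t, pv = 1.0, 0.0, 0.0
    while t < T:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= barrier:                 # lump-sum dividend, net of fixed and proportional costs
            x -= div
            pv += np.exp(-r * t) * ((1 - c_d) * div - k_d)
        elif x <= 0.0:                   # forced reinvestment so that assets never stay negative
            x += reinv
            pv -= np.exp(-r * t) * ((1 + c_r) * reinv + k_r)
    values.append(pv)

print("estimated value of the barrier/lump-sum policy:", np.mean(values))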
Teemu Pennanen - "Pricing and Hedging in Convex Markets":
We study pricing and hedging of contingent claims in financial markets where trading costs are given by convex cost functions and portfolios are constrained by convex sets. The model does not assume the existence of a cash account. In addition to classical frictionless markets and markets with transaction costs or bid-ask spreads, our framework covers markets with nonlinear illiquidity effects for large instantaneous trades.
Irina Penner - "Dynamic Convex Risk Measures: Time Consistency, Prudence and Sustainability":
We study various properties of a dynamic convex risk measure for bounded random variables. Our main issue is to investigate possible interdependence of conditional risk assessments at different times and the manifestation of these time consistency properties in dynamics of penalty functions and risk processes. We begin by focusing on the strong notion of time consistency and we characterize this property in terms of penalty functions and a joint supermartingale property of the risk measure and its penalty function. This part of the talk is based on a joint work with Hans Föllmer. In the second part of the talk we introduce and characterize a weaker notion of time consistency that we call prudence. Prudent risk measures induce risk processes that can be upheld without any additional risk. We call such processes sustainable, and we give an equivalent characterization of sustainability in terms of a combined supermartingale property of a process and one-step penalty functions. This supermartingale property allows us to characterize the strongly time consistent risk measure which arises from any dynamic risk measure by recursive construction as the smallest process that is sustainable and covers the final loss.
Georg Pflug - "Pricing of Swing Options and Stochastic Games":
Credit portfolios, such as Collateralized Debt Obligations (CDOs), consist of credits that are heterogeneous both with respect to their ratings and the involved industry sectors. Estimates for the transition probabilities for different rating classes are well known and documented. We develop a coupled Markov chain model, which uses the transition probability matrix as the marginal law plus event correlation coefficients within and between industry sectors and between rating classes to find the joint law of migration of all components of the portfolio, even for large portfolios. We avoid using firm value correlations as proxies for event correlations. The empirical part of the study is based on a data set containing 10413 time series of ratings of individual firms from 30 OECD countries over a period of ten years. In this period, 639 defaults were observed. We show how the number of defaults in large portfolios may be approximated by mixtures of multinomial variables, which in turn can be approximated by mixtures of normal variables. We show how the percentiles of the number of defaults depend on the inter-sectoral and intra-sectoral event correlations.
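The approximation step at the end can be illustrated in a stripped-down one-factor setting (this is only a hedged sketch with hypothetical parameters, not the coupled Markov chain model of the talk): conditionally on a common factor, defaults are binomial, and the mixture of the corresponding normal approximations reproduces the fat-tailed portfolio loss distribution.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n, p_bar, rho = 1000, 0.02, 0.2          # hypothetical portfolio size, mean PD, factor loading

# One-factor model: conditional default probability given the common factor Z.
Z = rng.standard_normal(100_000)
pZ = norm.cdf((norm.ppf(p_bar) - np.sqrt(rho) * Z) / np.sqrt(1 - rho))

# Exact mixture of binomials vs. mixture of normal approximations.
defaults_binom = rng.binomial(n, pZ)
defaults_normal = rng.normal(n * pZ, np.sqrt(n * pZ * (1 - pZ)))

for q in (0.95, 0.99, 0.999):
    print(q, np.quantile(defaults_binom, q), np.quantile(defaults_normal, q))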
Petra Posedel - POSTER "Asymptotic Analysis for a Simple Explicit Estimator in Barndorff-Nielsen and Shephard Stochastic Volatility Model":
We provide a simple explicit estimator for discretely observed Barndorff-Nielsen and Shephard models, prove consistency and asymptotic normality based on the single assumption that all moments of the stationary distribution of the variance process are finite, and give explicit expressions for the asymptotic covariance matrix. We develop in detail the martingale estimating function approach for a bivariate model that is not a diffusion but admits jumps. We do not use ergodicity arguments. We assume that both logarithmic returns and instantaneous variance are observed on a discrete grid of fixed width, and that the observation horizon tends to infinity. This analysis is a starting point and benchmark for further developments concerning optimal martingale estimating functions, and for theoretical and empirical investigations that replace the (actually unobserved) variance process with a substitute, such as the number or volume of trades or implied variance from option data. This is joint work with Friedrich Hubalek, Technische Universität Vienna.
Wolfgang Putschögl - "Optimal Investment Under Dynamic Risk Constraints and Partial Information":
We consider an investor who wants to maximize expected utility of terminal wealth. A typical model for stock prices is provided by a stochastic differential equation with non-constant coefficients. If the drift process is, e.g., independent of the driving Brownian motion, this leads to a model with partial information under the realistic assumption that only the prices can be observed. For special models the corresponding strategies can be computed, but due to the non-constant drift the position in the stock varies between extreme long and short positions, making these strategies very risky when trading on a daily basis. Motivated by Cuoco et al. (2002) we impose a (different) class of risk constraints on the strategy, computed on a short horizon, and then find the optimal policy in this class. This leads to much more stable strategies. We provide an example where the drift process is modeled as a continuous-time Markov chain with finitely many states. The risk constraints not only limit the risk caused by time discretization, they also reduce the influence of certain parameters which may be difficult to estimate. We provide a detailed sensitivity analysis for the parameters involved in the strategy and show how they affect the strategies in the constrained and unconstrained case. The results are applied to historical stock prices.
Teppo Rakkolainen - "Optimal Dividend Control in Presence of Downside Risk":
We analyze the determination of a value maximizing dividend policy for a broad class of cash flow processes modeled as spectrally negative jump diffusions. We extend previous results based on continuous diffusion models and characterize the value of the optimal dividend policy explicitly. Utilizing this result, we also characterize explicitly the values as well as the optimal dividend thresholds for a class of associated optimal stopping and sequential impulse control problems. Our results indicate that both the value as well as the marginal value of the optimal policy are increasing functions of policy flexibility in the discontinuous setting as well.
Jérôme Reboulleau - "Pricing Shipping Derivatives Through the Lévy Market Model":
Historically, freight rates have been more volatile than other commodities. Prices for shipping contracts often vary by more than 60% in the course of a single year, and the various actors in this industry are demanding new tools in order to hedge their risk. As a response to this demand, shipping derivatives were successfully introduced in 2001, leading to a market of 50bn USD in 2006 (source: HSBC Shipping Services). Due to this large volatility, only models with jumps, such as Lévy processes, are realistic. The work by Bakshi & Madan (2000) provides a basis to price such derivatives either by way of Maximum Likelihood (MLE) or Minimum Mean Square Error (MMSE) directly using the characteristic function. After an introduction to the shipping market, real-time applications of this technology will be presented, such as vessel revenue valuation and shipping derivatives trading strategies.
Chris Rogers - "The Cost of Illiquidity and Its Effects on Hedging":
Illiquidity is an important effect in the markets, yet it is hard to come up with a good definition which not only has some economic explanation but also retains a reasonable degree of tractability. In this paper, we propose a simple model based on consideration of the limit order book which results in a modification of the usual Black-Scholes dynamics of portfolio wealth. Working with a suitably simple objective, we are able to find a quite direct solution to the hedging problem that requires only the numerical solution of three BS-style PDEs. (Joint work with Surbjeet Singh)
Silvia Romagnoli - "The Dependence Structure of Running Maxima and Minima: Results and Option Pricing Applications":
We provide general results for the dependence structure of running maxima (minima) of sets of variables. Using copula functions, we derive recursive formulas for running minima and maxima. These formulas enable us to use a "bootstrap" technique to estimate the difference between the pricing kernels of European options and barrier options on a grid of dates. We also show that the dependence formulas for running maxima (minima) are completely defined from the copula function representing dependence among levels at the terminal date. The result is useful to provide pricing applications for derivatives and structured products based on multivariate running maxima (minima). Altiplanos with pay-offs determined on a discretely monitored barrier are evaluated using the dependence structure of the corresponding European products. The difference in price is simply the volume of this copula between the coordinates of the European prices and those of the running maxima (minima). (Joint work with Umberto Cherubini.)
Birgit Rudloff - "Hedging in Incomplete Markets with Convex Risk Measures":
In incomplete financial markets not every contingent claim can be replicated by a self-financing strategy. Starting with an amount of money smaller than the superhedging price of the claim, we want to find a strategy that minimizes the risk of the shortfall. We use a convex risk measure. This problem can be split into a static optimization problem and a representation problem. We show that the optimal strategy consists in superhedging a modified claim whose payoff is the product of the solution of the static problem and the original payoff. To solve the static problem we apply convex duality methods. We provide necessary and sufficient optimality conditions.
Wolfgang Runggaldier - "Contagious Default: Application of Methods of Statistical Mechanics in Finance":
Default of a firm is in general contagious (infectious). Taking contagion into account is therefore important for an institution holding a large credit portfolio. We approach the study of contagion by using interacting particle methods. In particular, we study limit distributions when the number of firms goes to infinity as well as their approximations when the number of firms is finite but large. This allows us to explain various phenomena like default clustering and, in general, it allows us to view a credit crisis as a microeconomic phenomenon driven by endogenous financial indicators. Finally, we apply the results to large portfolio losses. (Based on joint work with P. Dai Pra, E. Sartori, M. Tolotti).
Sotirios Sabanis - "A Note on the Q-Optimal Martingale Measure":
An important and challenging problem in mathematical finance is how to choose a pricing measure in an incomplete market, i.e. how to find a probability measure under which expected payoffs are calculated and fair option prices are derived under some notion of optimality. In an incomplete market, the choice of the equivalent martingale measure (EMM) for the underlying price process is not unique. Over the last twenty years, many authors have proposed different preference-based criteria in order to choose a `suitable' pricing measure from the class of EMMs. Two of the most popular choices are the minimal entropy EMM, see for example Frittelli (2000), and the variance optimal EMM, see Delbaen & Schachermayer (1996). Recently, Hobson (2004) proposed a unifying framework, the q-optimal measure, for a wide range of EMM choices that includes the two aforementioned measures. The notion of q-optimality is linked to the unique EMM with minimal q-moment (if q > 1) or minimal relative entropy (if q=1). Hobson's (2004) approach to identifying the q-optimal measure (through a so-called fundamental equation) suggests a relaxation of an essential condition appearing in Delbaen & Schachermayer (1996). This condition states that for the case q=2, the Radon-Nikodym process, whose last element is the density of the candidate measure, is a uniformly integrable martingale with respect to any EMM with a bounded second moment. Hobson (2004) claims that it suffices to show that the above is true only with respect to the candidate measure itself and extrapolates to the case q>1. Cerny & Kallsen (2006), however, presented a counterexample (for q=2) which demonstrates that the above relaxation does not hold in general. The author will present the general form of the q-optimal measure following the approach of Delbaen & Schachermayer (1994) and prove its existence under mild conditions. Moreover, in light of the counterexample in Cerny & Kallsen (2006) concerning Hobson's (2004) approach, necessary and sufficient conditions will be presented in order to determine when a candidate measure is the q-optimal measure.
Jörn Sass - "The Numeraire Portfolio Under Transaction Costs":
We study the existence of a numeraire portfolio for a discrete time financial market with proportional transaction costs. In an incomplete market without frictions, consistent prices for derivative securities can be obtained by taking the expectation of the claim with respect to a certain probability measure under which the discounted asset prices become martingales. The numeraire portfolio allows one to replace this change of measure by a change of numeraire. For models with transaction costs, the concept of a martingale measure and thus the concept of a numeraire portfolio have to be modified. Without transaction costs a well-known approach is to find the growth optimal portfolio (but the numeraire portfolio might not exist). With some modifications and under reasonable conditions the same approach turns out to work for our model. (Joint work with Manfred Schael, Bonn)
Peter Schaller - "Consistent Incorporation of Statistical Uncertainties Into Quantile Estimates":
The talk gives an introduction to the subject, which arose in practice from the problem of model risk in the context of Value at Risk estimates for financial portfolios and led to an elegant theoretical solution; it will also provide results not yet published in the literature.
Christian Schmidt - "Outperforming Benchmarks in Fixed-Income Markets":
We consider a fixed-income market where the market dynamics are driven by a state process and where the market price of risk is allowed to be stochastic. In this market, a benchmark process is defined by means of a trading strategy in fixed-income securities. The investor is interested in maximising expected utility of terminal wealth relative to this benchmark.
We propose to decompose the trading strategy into two parts: the benchmark strategy, which provides the baseline, and an active strategy, which considers deviations relative to the benchmark and which leads to the overall optimal strategy for the optimization problem. This decomposition allows for a nice interpretation and even simplifies computation in some cases.
The optimal portfolio strategy is derived from a Hamilton-Jacobi-Bellman equation. For an affine market model, we provide analytical results (up to ODEs) and analyse the influence of the benchmark on the optimal strategy.
We conclude with a discussion of possible extensions towards regime-switching models.
Michael Schmutz - POSTER "Zonoid Options":
Following Koshevoy and Mosler, a probability measure with finite absolute first moment can be uniquely represented by its lift zonoid, a representative of convex bodies, given by taking the Aumann expectation of a certain random line segment. A nonempty convex body is uniquely determined by its support function. It turns out that, up to interest, the arbitrage-free values of European call and put options are given by the support function of the lift zonoid of the equivalent martingale measure at certain points. This yields various geometrical interpretations of the behavior of these instruments. The values of other derivatives can be interpreted as mixed volumes of the above lift zonoid and certain related convex bodies. This should lead to a more intuitive and deeper understanding of some aspects of derivative theory and pave the way for potential later developments of concrete applications.
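The link between option values and the lift zonoid can be checked numerically in a toy example. The support function of the lift zonoid of a law at u = (u0, u1) is E[(u0 + u1*X)^+]; evaluated at (-K, 1) under a martingale measure it gives the undiscounted call value. The sketch below is a hedged illustration under an assumed lognormal terminal price and compares this with the Black-Scholes formula.

import math
import numpy as np
from statistics import NormalDist

S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.3, 1.0
rng = np.random.default_rng(3)

# Terminal price under the (lognormal) martingale measure.
X = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * rng.standard_normal(1_000_000))

def lift_zonoid_support(u0, u1, X):
    """Support function of the lift zonoid of the law of X at (u0, u1): E[(u0 + u1*X)^+]."""
    return np.mean(np.maximum(u0 + u1 * X, 0.0))

call_from_zonoid = math.exp(-r * T) * lift_zonoid_support(-K, 1.0, X)

d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
call_bs = S0 * NormalDist().cdf(d1) - K * math.exp(-r * T) * NormalDist().cdf(d2)
print(call_from_zonoid, call_bs)   # should agree up to Monte Carlo error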
Ilse Schoeman - "Modeling of the Bank’s Profitability via a Lévy Process-Driven Model and a Black-Scholes Model":
We model the profitability of banks in a stochastic manner by means of a Lévy process-driven model (Heston's model) and a Black-Scholes model (Euler-Maruyama method). In this regard, we highlight two measures of bank profitability, viz., the return on assets (ROA; a measure of the operational efficiency of the bank) and the return on equity (ROE; a measure of the owners' return on their investment). Banks manage the amount of capital they hold to prevent bank failure and to meet the capital requirements set by the regulatory authorities. However, they do not want to hold too much capital because doing so lowers the returns to equity holders. In order to accomplish this, we derive stochastic differential equations that provide information about the dynamics of the value processes of net profit after tax, equity capital and total assets. We also provide appropriate numerical examples and simulations of the ROE and the ROA.
Torsten Schöneborn - "Dynamic Optimal Execution Strategies and Predatory Trading":
We consider a risk-averse investor holding a large asset position in an illiquid market. When selling this position, the investor faces a trade-off between high costs of quick liquidation and high uncertainty of slow liquidation. Most previous research on optimal liquidation strategies focused on static strategies and mean-variance risk-averse investors. We generalize these investigations to include dynamic liquidation strategies and investors seeking von Neumann-Morgenstern utility maximization. In this setting, we find that investors with constant absolute risk aversion will pursue static strategies, while investors with increasing or decreasing absolute risk aversion will adjust their trading speed to exogenous changes in market prices. Furthermore, we extend the analysis to the case where competing market participants are aware of the investor’s trading intentions. We show that the liquidity characteristics and the number of competitors in the market determine the optimal strategy for the competitors: they either provide liquidity to the seller, or they prey on him by simultaneous selling. If they provide liquidity, it might be sensible for the seller to pre-announce a trading intention ("sunshine trading"). (Joint work with Alexander Schied)
Antje Schulz - "Optimal Execution Strategies in Limit Order Books with a General Shape Function":
Following Obizhaeva and Wang (2005), we consider optimal execution strategies for block market orders placed in a limit order book (LOB). In this note, we allow for a general shape of the LOB defined via a given density function. In this setting, there are two possibilities for modeling the resilience of the LOB after a large market order: the exponential recovery of the number of limit orders, i.e., of the volume of the LOB, or the exponential recovery of the bid-ask spread. We consider both situations and, in each case, derive optimal execution strategies in discrete time. (Joint work with Alexander Schied and Aurélien Alfonsi)
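For intuition, the special case of a block-shaped LOB (constant density q) can be written in a few lines: a buy market order of size x eats into the book at cost x*(S0 + D) + x^2/(2q), pushes the extra spread D up by x/q, and D then decays exponentially at the resilience rate. The sketch below is a hedged illustration with hypothetical parameters, ignoring permanent impact; it compares splitting a buy order equally over N dates with trading it in one block.

import numpy as np

S0, q, rho = 100.0, 5000.0, 2.0          # initial price, LOB density, resilience rate (hypothetical)
X, N, tau = 100_000.0, 10, 0.1           # total shares to buy, number of trades, time between trades

def execution_cost(sizes, tau):
    """Cost of a sequence of buy market orders in a block-shaped LOB with exponential resilience."""
    D, cost = 0.0, 0.0
    for x in sizes:
        cost += x * (S0 + D) + x**2 / (2.0 * q)   # walk up the book from the current extra spread D
        D += x / q                                 # the order shifts the ask by x/q ...
        D *= np.exp(-rho * tau)                    # ... and the book refills before the next trade
    return cost

print("one block:   ", execution_cost([X], tau))
print("equal split: ", execution_cost([X / N] * N, tau))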
Christoph Schwab - "Numerical Derivative Pricing in Non-BS Markets":
We report on deterministic solution methods for Kolmogoroff equations.
Admissible processes are strong Markov processes, possibly nonstationary, of jump-diffusion and pure jump type, including in particular Lévy and additive processes. Multivariate models with copula models for the dependence in the marginals' jump structure are allowed.
Our approach is based on stabilized Galerkin discretization of the process' infinitesimal generator, resp. its Dirichlet form, in a wavelet basis. The methods allow us to analyze contracts of European, American or exotic style, in single or multiple periods and on single underlyings or on baskets, in a unified fashion.
We address the superconvergent extraction of Greeks and model calibration, validation and verification. Numerical analysis in the domains of Dirichlet forms of the price processes is briefly addressed. Examples include American and exotic contracts on Lévy copula dependence models, single or multiscale stochastic volatility models of BNS and coGARCH type.
Joint work of the CMQF group in the Seminar for Applied Mathematics, ETH Zurich.
Pauline Sculli - "Contagion in Affine Default Processes":
We present a new framework for the construction of contagion in reduced-form models of credit risk, originating from piecewise deterministic Markov process theory, which allows the credit dynamics of a large number of firms to be looped together in a mathematically tractable way. Furthermore, rather than working in the classical “single-default” framework, we model credit event arrival processes, which are often of greater contractual interest. We let the number of credit events occurring for each firm be a Poisson counting process with Lévy intensity dynamics characterised by two classes of jumps, the origins of which can be self-infecting or contagious. Self-infectious shocks arise in reaction to a firm’s own credit events, whereas contagious jump shocks arise in reaction to counterparty credit events. Alongside, we can also allow for background jump shocks that arrive but are not, ex-post, drivers of credit events. We present an exponential affine martingale which facilitates the analytical construction of survival probabilities via the probability generating function and the construction also of intensity moments, which we find as functions of the Laplace transforms of contagion distributions.
Martin Schweizer - "Modelling Option Prices":
In this talk, our goal is to construct joint models for underlyings and options written on these. More precisely, we want to specify a volatility structure for both assets and options and to construct from that an arbitrage-free model. It turns out that this is surprisingly tricky and the feasibility of the construction (i.e., the proof of existence) hinges quite a bit on the choice of a suitable parametrization. We shall provide positive results in some classes of examples and also mention open problems. This is joint work with Johannes Wissel (ETH Zurich).
Carlo Sgarra - POSTER "A Finite Element Framework for Option Pricing with the Bates Model":
We present a finite element approach for option pricing in the framework of a well-known stochastic volatility model with jumps, the Bates model. In this model the asset log-returns are assumed to follow a jump-diffusion model where the jump component consists of a Lévy process of compound Poisson type, while the volatility behavior is described by a stochastic differential equation of CIR type, with a mean-reverting drift term and a diffusion component correlated with that of the log-returns. As in all Lévy models, the option pricing problem can be formulated in terms of an integro-differential equation: for the Bates model the unknown f(S, V, t) (the option price) of the pricing equation depends on three independent variables, the differential operator part turns out to be of parabolic kind, and the nonlocal integral operator is calculated with respect to the Lévy measure of the jumps. In this paper we will present the variational formulation of the problem together with a detailed discussion of the boundary conditions and a suitable choice of the finite-dimensional space onto which the variational problem is projected. Results for the European call option will be obtained as a benchmark, together with a calibration procedure. Some preliminary results for American option pricing will be discussed.
Jan Sindelar - "Adaptive Control Applied to Financial Market Data":
Our research aim is to plan an optimal decision strategy for trading commodity futures markets. At a given time, we have to decide to buy or sell a commodity contract or to stay out of the market. The decision is made using dynamic programming with many different quantities - previous price maxima and minima, variance, commitment-of-traders information or our own engineered quantities taken from trading experience. As the loss function we take the negative profit measured in money, where the probability density functions (PDFs) are estimated using Bayesian learning. For computational solvability, we need to implement a series of approximations: predictive PDFs are computed using parametric models from an exponential family, giving us easy-to-adapt systems. We use point estimates to overcome the curse of dimensionality and we try to lower the number of dimensions (principal component analysis etc.). We account for the dynamic evolution of the PDFs by forgetting older data. Trading costs (slippage and commission) are taken into account. The theory is supported by a series of experiments indicating our ability to construct a profitable trading machine. The research is conducted in cooperation with industry (Colosseum corporation).
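One of the approximations mentioned above, exponential forgetting in Bayesian learning, can be sketched for a toy Gaussian model (a hedged illustration, not the authors' trading system): the sufficient statistics of a normal-mean posterior are discounted by a forgetting factor before each update, so older observations gradually lose weight.

import numpy as np

rng = np.random.default_rng(4)
lam = 0.98                      # forgetting factor (hypothetical)
s0, s1 = 1.0, 0.0               # prior "effective sample size" and sum of observations

# Returns whose mean shifts halfway through; forgetting lets the estimate adapt.
returns = np.concatenate([rng.normal(0.001, 0.01, 500), rng.normal(-0.002, 0.01, 500)])

estimates = []
for r in returns:
    s0, s1 = lam * s0 + 1.0, lam * s1 + r   # discount old statistics, add the new observation
    estimates.append(s1 / s0)                # posterior mean of the drift under a flat prior

print("final drift estimate:", estimates[-1])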
Irina Slinko - "Approximation of Good Deal Bound Solutions":
The paper shows how to find approximate “good deal” bounds for European claims in incomplete markets. We consider incomplete markets where the incompleteness is caused by the presence of jumps or a non-traded factor. “Good deal” bound solutions were first introduced by [5], who suggested ruling out not only prices which create arbitrage opportunities but also price processes with “too high” Sharpe ratios. Imposing a uniform bound B on the Sharpe ratios of all derivatives and portfolios in the market, they find the highest and lowest prices subject to the imposed constraints. The theory was extended by [2] to models where the incompleteness is caused by jumps in the underlying asset’s price process. The bounds are shown to be the solutions of appropriate stochastic optimal control problems. In the general case, good deal bounds cannot be computed explicitly, which leads us to use numerical finite-difference methods. The procedure would require even more computational time if we wanted to compute solutions for several values of the bound B. Thus, to simplify the numerical procedure, we find a linear approximation of the good deal bound price, writing a Taylor expansion of the upper (lower) good deal bound price around the price given by the minimal martingale measure (MMM). We expand the good deal prices in a new variable y, which is defined as a square root function of the good deal bound B and some parameters of the model. The MMM provides a canonical benchmark for pricing any derivative; it has a simpler structure than the good deal bound prices and is much easier to compute. In order to compute the approximated bounds we find the PDEs satisfied by the MMM price and by the sensitivity of the option prices with respect to the new parameter y (evaluated at the MMM solution). We show that the linear approximation works extremely well for small deviations of the bound value from the bound value which corresponds to the MMM solution.
Mete Soner - "Second Order BSDE's: New Results on Existence":
The theory of BSDEs has been well developed and has had many applications in several fields as well as in finance. In joint work with Cheredito, Touzi and Victoir, we have extended the theory to include "second order" terms. In this talk, I will outline the general theory. Then, I will describe a new weak approach to the question of existence and some results on numerical solutions.
Ghulam Sorwar - "Valuation of Two-Factor Interest Rate Contingent Claims Using Green’s Theorem":
Over the years a number of two-factor interest rate models have been proposed that have formed the basis for the valuation of interest rate contingent claims. The valuation equation often takes the form of a partial differential equation that is solved using the finite difference approach. In the case of two-factor models this has resulted in the discretisation of two second-order partial derivatives, leading to boundary errors, as well as numerous first-order derivatives. In this paper we demonstrate that, using Green's theorem, second-order derivatives can be reduced to first-order derivatives that can be easily discretised; consequently two-factor partial differential equations are easier to discretise than one-factor partial differential equations. We illustrate our approach by applying it to value contingent claims based on the two-factor CIR model. We provide numerical examples which illustrate that our approach shows excellent agreement with analytical prices and the popular Crank-Nicolson method.
Tommi Sottinen - "Local Continuity of Stopping Times and Arbitrage":
In a recent work [Bender, C., Sottinen, T. and Valkeila, E. (2006) No-arbitrage pricing beyond semimartingales. WIAS Preprint No. 1110] we considered non-semimartingale pricing models that have non-trivial quadratic variation and a certain "small-ball property". It turned out that in these models one cannot do arbitrage with strategies that are continuous in terms of the spot and some other economic factors such as the running minimum and maximum of the stock. Unfortunately, this result does not extend even to simple strategies when stopping times are involved. The reason is obvious: stopping times are typically not continuous in the stock price. In this talk we introduce some rather weak continuity-like properties that allow us to extend the no-arbitrage results of ibid. to strategies that involve stopping. The talk is based on an ongoing joint work with C. Bender, D. Gasbarra, and E. Valkeila.
Thomas Steiner - POSTER "Yield Curve Shapes in Affine One-Factor Models":
We consider a model for interest rates where the short rate is given by a time-homogeneous, one-dimensional affine process in the sense of Duffie, Filipovic and Schachermayer. We show that in such a model yield curves can only be normal, inverse or humped (i.e. endowed with a single local maximum). Each case can be characterized by simple conditions on the present short rate $r_t$. We give conditions under which the short rate process converges to a limit distribution and describe the limit distribution in terms of its cumulant generating function. We apply our results to the Vasicek model, the CIR model, a CIR model with added jumps and a model of Ornstein-Uhlenbeck type. (Joint work with Martin Keller-Ressel)
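The trichotomy is easy to see numerically in the Vasicek special case dr = a(b - r)dt + sigma dW, where the zero-coupon yield is available in closed form. The sketch below (hypothetical parameters, only an illustration of the statement above) computes the yield curve for several initial short rates and classifies each curve as normal, inverse or humped.

import numpy as np

a, b, sigma = 0.8, 0.05, 0.02                        # hypothetical Vasicek parameters

def vasicek_yield(r0, tau):
    """Zero-coupon yield y(tau) in the Vasicek model, from P = A*exp(-B*r0)."""
    B = (1.0 - np.exp(-a * tau)) / a
    lnA = (B - tau) * (a * a * b - 0.5 * sigma**2) / a**2 - sigma**2 * B**2 / (4.0 * a)
    return (-lnA + B * r0) / tau

taus = np.linspace(0.25, 30.0, 400)
for r0 in (0.02, 0.05, 0.09):
    y = vasicek_yield(r0, taus)
    dy = np.diff(y)
    if np.all(dy >= 0):
        shape = "normal"
    elif np.all(dy <= 0):
        shape = "inverse"
    else:
        shape = "humped"
    print(r0, shape)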
Robert Stelzer - "Multivariate Continuous Time Lévy-Driven GARCH Processes":
A multivariate extension of the COGARCH(1,1) process introduced in Klüppelberg, Lindner and Maller [J. Appl. Probab. 41 (2004), 601-622] is presented and shown to be well-defined. The definition generalizes the idea of Brockwell, Chadraa and Lindner [Ann. Appl. Probab. 16(2006), 790-826] for the definition of the univariate COGARCH(p,q) process and is in a natural way related to multivariate discrete time GARCH processes as well as positive-definite Ornstein-Uhlenbeck type processes.
Furthermore, we establish important Markovian properties and sufficient conditions for the existence of a stationary distribution for the volatility process, which lives in the positive semi-definite matrices, by bounding it by a univariate COGARCH(1,1) process in a special norm. Moreover, criteria ensuring the finiteness of moments of both the multivariate COGARCH process as well as its volatility process are given. Under certain assumptions on the moments of the driving Lévy process, explicit expressions for the first and second order moments and (asymptotic) second order stationarity are obtained.
As a necessary prerequisite we study the existence of solutions and some other properties of stochastic differential equations being only defined on a subset of $\mathbb{R}^d$ and satisfying only local Lipschitz conditions.
Finally, we present some illustrative examples and simulations.
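For readers unfamiliar with the univariate case, the following sketch simulates a COGARCH(1,1)-type process driven by a compound Poisson Lévy process under one common parametrization (treated here as an assumption, not as the multivariate construction of the talk): between jumps the squared volatility decays towards beta/eta, and each jump of size dL raises it by phi * V * dL^2.

import numpy as np

rng = np.random.default_rng(5)
beta, eta, phi = 0.04, 0.06, 0.04       # hypothetical COGARCH(1,1) parameters
lam, T = 5.0, 10.0                       # jump intensity and horizon of the driving Lévy process

t, V, G = 0.0, beta / eta, 0.0           # start the variance at its mean-reversion level
path = [(t, G, V)]
while True:
    dt = rng.exponential(1.0 / lam)      # waiting time until the next jump of L
    if t + dt > T:
        break
    t += dt
    V = beta / eta + (V - beta / eta) * np.exp(-eta * dt)   # deterministic decay between jumps
    dL = rng.standard_normal()           # jump size of the compound Poisson driver
    G += np.sqrt(V) * dL                 # increment of the log-price/return process
    V += phi * V * dL**2                 # the squared jump feeds back into the volatility
    path.append((t, G, V))

print("number of jumps:", len(path) - 1, "terminal variance:", V)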
Łukasz Stettner - "Portfolio Selection with Transaction Costs, Decision Lag and Execution Delay":
We consider a portfolio selection problem in which asset prices depend on economic factors and the pair of assets and factors forms a Feller Markov process taking values in a locally compact separable metric space. We make decisions concerning our portfolio at moments separated by a deterministic time lag, and our decisions are executed with delay; the time lag and the execution delay differ. The problem of maximizing the portfolio over a finite time horizon can be transformed into an impulse control problem, which leads to a sequence of stopping problems with discontinuous functionals. Continuity of the solution to the corresponding Bellman equation (solution to the quasi-variational inequality) is shown and nearly optimal strategies are constructed. The result generalizes a paper by B. Bruder and H. Pham and is written jointly with Dr. J. Palczewski. Further generalizations and other aspects of the problem will also be presented.
Kai Tappe - POSTER "Launching a Cannon At Multivariate Lévy Processes":
Lévy copulas have emerged as the generic concept for describing the dependence structure of multidimensional Lévy processes. In this paper we contribute an inverse approach to parsimonious copula modelling. Rather than defining a copula directly, we construct the association between components in an implicit manner by variate conditioning. This pattern makes the simulation of multidimensional Lévy processes by series representation natural. We quantify graphically the effect of a conditional structure definition using simplified sample algorithms for path generation.
Stefan Tappe - "Existence of Lévy Term Structure Models and Finite Dimensional Realizations":
Lévy driven term structure models have become an important subject in the mathematical finance literature. From a financial modelling point of view one would consider the volatility of such a Heath, Jarrow and Morton (HJM) term structure model to be a function of the prevailing forward curve. As we shall see, this makes the forward rate process a so-called mild solution of an infinite dimensional stochastic differential equation in a suitable Hilbert space of forward curves. This is a key observation for our first goal of the talk, namely to establish the existence of a solution for the HJM equation in the Lévy case. For this purpose, we give some existence results for Lévy driven stochastic differential equations with Lipschitz continuous parameters in a general Hilbert space. In order to apply these results to HJM equations, we have to find an appropriate Hilbert space of forward curves. There are several reasons why, in practice, one is interested in HJM models which admit a finite dimensional realization (FDR), that is, the forward rate process proceeds on a finite dimensional submanifold of the Hilbert space of forward curves. So, our second goal is to characterize, in terms of the forward rate volatilities, those HJM models possessing an FDR. Using ideas from differential geometry, this is achieved by translating the FDR problem into a deterministic problem related to the volatility structure.
Gregory Temnov - POSTER "Fourier Transform as an Efficient Methodology for Loss Aggregation":
The problem of risk aggregation, the key point of which is the summation of a random number of random variables, is an important task in the management of practically all kinds of risk. We investigate an efficient approach to this problem in the framework of operational risk management. Our approach is based on the Fast Fourier Transform (FFT) methodology. The basic problem to manage while using the FFT is the reduction of the aliasing error. A natural tool against the aliasing error is to compose the initial probability distribution of a single loss with an exponential function (i.e. to apply an exponential window). Using this approach in application to operational risk data, we investigate its efficiency and compare it to different commonly used techniques. All basic kinds of numerical errors in the algorithm are analyzed and the parameters are adjusted to establish an efficient and precise scheme for the aggregate loss calculation.
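The exponential-window idea can be sketched in a few lines for a compound Poisson aggregate loss (a hedged illustration with hypothetical parameters, not the authors' calibrated scheme): the discretized severity is tilted with exp(-theta*k), the Poisson probability generating function is applied to its FFT, and the result is transformed back and untilted, which suppresses the wrap-around (aliasing) error.

import numpy as np

lam = 50.0                                   # Poisson frequency (hypothetical)
n, h = 2**14, 0.05                           # grid size and discretization step
theta = 20.0 / n                             # tilting parameter of the exponential window

# Discretized severity distribution (here: exponential with mean 1, rounded to the grid).
k = np.arange(n)
f = np.exp(-k * h) - np.exp(-(k + 1) * h)
f /= f.sum()

# Exponential window -> FFT -> Poisson pgf -> inverse FFT -> remove the window.
f_tilted = f * np.exp(-theta * k)
g_tilted = np.fft.ifft(np.exp(lam * (np.fft.fft(f_tilted) - 1.0)))
g = np.real(g_tilted) * np.exp(theta * k)

mean_agg = h * (g * k).sum()                  # roughly lam times the discretized severity mean
print("aggregate mean:", mean_agg, "total mass:", g.sum())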
Stefan Thonhauser - "Optimal Dividend Strategies for a Risk Process Under Force of Interest":
In the classical Cramer-Lundberg risk model the problem of maximizing the expected cumulated discounted dividends is a widely discussed topic. In the most general case within this framework it is proved (Gerber 1969, Azcue and Muler 2005, Schmidli 2006) that the optimal dividend strategy is of the not very intuitive band strategy type. We discuss this maximization problem in a modified setting including a constant force of interest in the risk model. The value function can be identified in the set of viscosity solutions of the associated HJB equation and the optimal dividend strategy in this risk model with interest can be derived. (Joint work with Hansjoerg Albrecher, TU Graz)
Mikhail Urusov - "Stopping of Integral Functionals of Diffusions and 'No-Loss' Free Boundary Formulation":
This talk is based on joint works with L. Rueschendorf and D. Belomestny. We consider optimal stopping of integral functionals of a one-dimensional diffusion, the coefficients of which are allowed to be discontinuous. Therefore the standard formulation of the free boundary problem does not work. We provide an interesting modification of the standard form of the free boundary problem that works well here: loosely speaking, the modified free boundary problem has a solution if and only if the stopping problem has an optimal stopping time, and in this case the solution of the free boundary problem is unique and provides the solution of the stopping problem.
Esko Valkeila - "Approximation of Geometric Fractional Brownian Motion":
We give an approximation to geometric fractional Brownian motion in the sense of weak convergence, in the case when the self-similarity index of the driving fractional Brownian motion is bigger than one half. Our approximation has the advantage over the previous ones that the associated pricing model is free of arbitrage opportunities and complete.
Cathrin van Emmerich - POSTER "A Square Root Process for Modelling Correlation As a Stochastic Process":
Many models require the specification of a correlation coefficient for mapping the dependence between different assets. Historical data show that it is not constant. The market acknowledges this observation by offering products which make correlation risk tradable, for example the correlation swap. For pricing these products it is necessary to develop appropriate models for correlation. One possibility is a mean-reverting process with a square root diffusion function. Similar to the square root in the CIR process, it guarantees that the process stays within the relevant boundaries, which is important in modelling correlation. Analytical results on boundary behaviour, the stationary distribution and moments are presented.
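A minimal simulation sketch of a mean-reverting, boundary-preserving correlation process of the kind described above, assuming (as an illustration only, not necessarily the author's exact specification) the diffusion coefficient alpha*sqrt(1 - rho^2), which vanishes at the boundaries -1 and +1:

import numpy as np

rng = np.random.default_rng(6)
kappa, mu, alpha = 2.0, 0.4, 0.5          # hypothetical mean-reversion speed, level and volatility
dt, n = 1.0 / 252.0, 5 * 252

rho = np.empty(n)
rho[0] = 0.4
for i in range(1, n):
    drift = kappa * (mu - rho[i - 1]) * dt
    diff = alpha * np.sqrt(max(1.0 - rho[i - 1] ** 2, 0.0)) * np.sqrt(dt) * rng.standard_normal()
    rho[i] = np.clip(rho[i - 1] + drift + diff, -0.9999, 0.9999)   # Euler step kept inside (-1, 1)

print("mean:", rho.mean(), "min:", rho.min(), "max:", rho.max())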
Nele Vandaele - "Hedging Unit-Linked Life Insurance Contracts Driven by a Lévy Process":
The aim of this paper is twofold. Firstly, we determine the locally risk minimizing hedging strategy for a pure endowment contract when the risky asset follows a Lévy semimartingale process. In case the risky asset is one-dimensional and continuous, Schweizer found an easy way to determine the locally risk minimizing hedging strategy using the Galtchouk-Kunita-Watanabe decomposition under the minimal martingale measure. Due to the discontinuity of a Lévy process, we had to determine the locally risk-minimizing hedging strategy in a different way.
Secondly, we show how to hedge a surrender option, when the risky asset follows a Lévy martingale process. In this case we get an extra risk term, in addition to the financial risk. In order to hedge this additional risk, we use some concepts from credit risk, like the H-hypothesis and the F-independency.
Michèle Vanmaele - "Comonotonicity Applied in Finance":
In finance very often one has to deal with problems where multivariate random variables are involved, e.g. basket options where the price depends on several underlying securities. Asian options or Asian basket options are other examples of options where the price depends on a weighted sum of non-independent asset prices.
One can construct upper and lower bounds for the prices of such types of European call and put options based on the theory of stochastic orders and of comonotonic risks. Comonotonicity essentially reduces a multivariate problem to a univariate one, leaving the marginal distributions intact.
One can model the dynamics of the underlying asset prices, e.g. use a Black-Scholes model; but it is also possible to derive model-free bounds expressed in terms of option prices on the individual underlying assets observed in the market. Moreover, the comonotonic upper bound can be interpreted as a superreplicating strategy (a schematic form of this bound is given below).
Instead of deriving bounds one can look at approximations. Monte Carlo (MC) simulation, for example, is a technique that provides approximate solutions to a broad range of mathematical problems. A drawback of the method is its high computational cost, especially in a high-dimensional setting. Therefore variance reduction techniques were developed to increase the precision and reduce the computing time. The so-called comonotonic Monte Carlo simulation uses the comonotonic approximation as a control variate to obtain more accurate estimates and hence a less time-consuming simulation.
We will introduce the notion of comonotonicity and discuss the different approaches based on the theory of comonotonicity. The methods will be applied to examples in finance.
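As a schematic illustration of the comonotonic upper bound mentioned above, in generic notation: for a weighted sum $S = \sum_i w_i X_i$ one has, in convex order, $S \le_{\mathrm{cx}} S^c = \sum_i w_i F_{X_i}^{-1}(U)$ with $U$ uniform on $(0,1)$, so that $\mathbb{E}[(S-K)_+] \le \mathbb{E}[(S^c-K)_+] = \sum_i w_i\,\mathbb{E}[(X_i-K_i)_+]$ for a suitable decomposition of the strike, $\sum_i w_i K_i = K$ with $K_i = F_{X_i}^{-1}(F_{S^c}(K))$ in the case of continuous strictly increasing marginals; the right-hand side involves only univariate quantities and, in the model-free version, only market prices of vanilla options on the individual assets.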
Michel Vellekoop - "Optimal Consumption and Investment of Randomly Terminating Income":
We investigate an optimal consumption and investment problem where we receive a certain fixed income stream that is terminated at a random time. Dual methods are used to reduce the problem to a deterministic optimal control problem that can be solved explicitly. We show that the value function for this problem and the corresponding optimal consumption and investment strategies differ considerably from the case where an income stream is certain to continue indefinitely.
Michel Verschuere - "Hedging Under Uncertainty: Applications to Carbon Emissions Markets":
We discuss a model for a market with two tradeable assets where the price of the first asset depends on the price of the second asset and on the value of an additional source of uncertainty that cannot be traded. The source of uncertainty enters the drift of the dynamics defining the price of the second asset. We apply stochastic filtering theory to price the first asset in terms of the second one under the augmented filtration for price and drift component. We illustrate how our model can be applied to markets for EU carbon emission allowances and calculate the value of a digital option on the event that the ETS zone falls short of allowances at the end of a trading phase. (Joint work with Umut Cetin of the London School of Economics)
Tanja Veza - "The Economic Role of Jumps and Recovery Rates in the Market for Corporate Default Risk":
Using an extensive cross section of US corporate CDS panels, this paper offers an economic understanding of their time-series behavior, implied loss given default, as well as the risk premia attached to the risk of sudden jumps in CDS spreads. We take a parametric approach with an affine multi-factor reduced-form model accommodating jumps in both riskless and defaultable factors. Jumps improve the model's capability to capture empirical properties specific to CDS premia. CDS written on obligors in industries with long investment cycles and long-term financing exhibit significantly less frequent jumps. The probability of structural migration to default is considered so low for investment-grade obligors that investors fear distress only through rare, but devastating events. Similarly, investors assign a low probability of structural default to firms with Financials and Utilities sector affiliation. High correlation of default processes with the VIX index indicates a strong relation of corporate CDS premia to equity. Implied LGD is well identified and is comparable to historically realized values. Obligors with substantial tangible assets are expected to recover more in default. A clear-cut distinction in the level of LGD emerges between investment-grade and speculative-grade issuers. Thus, the industry practice of assuming equal LGD across ratings and sectors is not compatible with market data. Using our cross section of implied LGD we provide figures which are, on average, consistent with rating and industry.
Richard Vierthauer - "On Utility Indifference Pricing in Affine Stochastic Volatility Models":
We follow an idea of Mania and Schweizer and consider exponential utility indifference pricing and hedging in linear approximation. In this approximation the problem reduces to solving the pure utility maximization problem under exponential utility and to determining the Galtchouk-Kunita-Watanabe decomposition of the claim under the minimal entropy martingale measure. These objects are obtained semi-explicitly in affine stochastic volatility models with or without jumps. We illustrate our results by a numerical example in the Lévy-driven stochastic volatility model proposed by Barndorff-Nielsen and Shephard. The presentation is based on joint work with Jan Kallsen and Thorsten Rheinländer.
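Schematically, in generic notation: writing $Q^E$ for the minimal entropy martingale measure and $H = \mathbb{E}_{Q^E}[H] + \int_0^T \xi_s\,dS_s + L_T$ for the Galtchouk-Kunita-Watanabe decomposition of the claim under $Q^E$, the exponential utility indifference price with risk aversion $\alpha$ admits, to first order in $\alpha$, an expansion of the form $\pi(\alpha) \approx \mathbb{E}_{Q^E}[H] + \tfrac{\alpha}{2}\,\mathbb{E}_{Q^E}[\langle L\rangle_T]$, with the corresponding hedge given to this order by the integrand $\xi$; the affine structure of the volatility model is what allows these quantities to be computed semi-explicitly.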
Ralf Werner - "Consistency of Robustified Portfolio Optimization Frameworks":
In recent years, several alternatives to the traditional Markowitz portfolio optimization framework have gained increasing popularity. Among those, the most prominent ones are probably Michaud's resampling approach and the robust counterpart ansatz. For clarity, the presentation will focus on the general mathematical framework for Michaud's setup, which includes traditional Markowitz portfolio optimization as a special case. As the main result we will show that this approach is consistent in a statistical sense, i.e. the estimated portfolios converge to the true optimal portfolios if consistent point estimators are used for the input data. As the proof relies on continuity properties of the solutions of parametric convex conic optimization problems, it allows for very general portfolio constraints. These novel findings not only provide a justification for Michaud's approach but also clearly extend existing results for the traditional Markowitz model. Finally, we will sketch analogous results for the robust counterpart ansatz. (Joint work with Katrin Schoettle, TU Muenchen)
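The consistency statement can be sketched, in generic notation, as follows: denote by $x^*(\mu,\Sigma)$ a solution of the portfolio problem $\min_{x \in \mathcal{X}}\ x^\top\Sigma x - \lambda\,\mu^\top x$ over a convex constraint set $\mathcal{X}$; if $(\hat\mu_n,\hat\Sigma_n)$ are consistent estimators of $(\mu,\Sigma)$, then, under suitable regularity conditions, $x^*(\hat\mu_n,\hat\Sigma_n) \to x^*(\mu,\Sigma)$, the key ingredient being the continuity of the solution mapping of the underlying parametric convex conic program.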
Anke Wiese - "Numerical Solution of Stochastic Differential Equations Evolving on Manifolds":
We present numerical schemes for nonlinear stochastic differential equations whose solution evolves on a smooth finite-dimensional manifold. Given a Lie group action that generates transport along the manifold, we pull back the stochastic flow on the manifold to the Lie group via the action and subsequently to the corresponding Lie algebra. We construct an approximation to the stochastic flow in the Lie algebra via closed operations and then push back to the manifold, thus ensuring that our approximation lies on the manifold. We call such schemes stochastic Munthe-Kaas methods, after their deterministic counterparts. We also present stochastic Lie group integration schemes based on Castell-Gaines methods; these become stochastic Lie group integrator schemes if Munthe-Kaas methods are used as the underlying ordinary differential equation integrator. Further, we show that some Castell-Gaines methods are uniformly more accurate than the corresponding stochastic Taylor schemes. Lastly we demonstrate our methods in some examples, including a forward stochastic Riccati system that arises in backward form in linear-quadratic control problems such as the mean-variance hedging problem in incomplete financial markets.
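In rough outline, and in generic notation: for an SDE on a manifold $M$ with a Lie group $G$ acting via $\Lambda: G \times M \to M$, one step of such a scheme takes the form $y_{n+1} = \Lambda(\exp(\hat\sigma_n), y_n)$, where $\hat\sigma_n$ is a Lie-algebra-valued approximation of the pulled-back flow over $[t_n, t_{n+1}]$ built from the Wiener increments (and, for higher order, Lévy areas); since $\exp$ maps into $G$ and $\Lambda$ maps back onto $M$, the numerical solution stays on the manifold by construction.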
Ralf Wunderlich - "Computing Optimal Investment Strategies Under Partial Information and Bounded Shortfall Risk":
We consider a continuous-time financial market and a dynamic portfolio optimization problem in which the expected utility from terminal wealth has to be maximized. The special features of this paper are an additional shortfall constraint on the terminal wealth and a financial market with partial information. The shortfall risk is measured in terms of expected loss. Stock prices are assumed to satisfy a stochastic differential equation with a drift parameter modeled as an unobservable continuous-time, finite-state Markov chain (HMM). Combining martingale and convex duality methods we find the form of the optimal terminal wealth. For the optimal trading strategies explicit formulas are given using Malliavin calculus. Numerical examples illustrate the analytical results. (Joint work with Jörn Sass from RICAM in Linz)
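Schematically, in notation chosen here for illustration, the stock price dynamics are of the form $dS_t = S_t(\mu(Y_t)\,dt + \sigma\,dW_t)$, where $Y$ is an unobservable continuous-time Markov chain with finitely many states, so that the optimization is carried out with respect to the filtration generated by the prices alone, and a shortfall constraint of expected-loss type may be written as $\mathbb{E}[(q - X_T)_+] \le \varepsilon$ for a benchmark level $q$ and a risk bound $\varepsilon$.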
Uwe Wystup - "About the Price of a Guarantee -
A Statistical Evaluation of Returns of long-term Investments":
Retail investors saving for their retirement are currently offered several funds with guarantee features in the German market. The guarantee may cover the invested capital, parts of it, or a minimal annual return. Investment opportunities are either "certificates'' issued by banks or "funds''. Since investors are often afraid of market crashes or an asset melt-down, the demand for products with guarantees is increasing. In this paper we analyze the current market situation, discuss the most widely traded products, identify the classic types of guarantee products -- "discount'', "bonus'', "performance'' and "CPPI'' -- and simulate their returns in a jump diffusion model over a 25-year time horizon. On the one hand, we find guarantee products attractive, as they often guarantee a return only slightly below the risk-free rate, yet they end up with an average return of about half that of the benchmark index. On the other hand, they perform rather poorly compared to some actively managed funds without guarantee, even when taking management fees into account.
Thaleia Zariphopoulou - "Investment Performance Measurement, Risk Tolerance and Optimal Portfolio Choice":
A new approach to measuring the dynamic performance of investment strategies is introduced. To this aim, a family of stochastic processes defined on [0, \infty) and indexed by a wealth argument is used. Optimality is associated with their martingale property along the optimal wealth trajectory. The optimal portfolios are constructed via stochastic feedback controls that are functionally related to differential constraints of fast diffusion type. A multi-asset Ito-type incomplete model is used.
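In loose terms, and in generic notation: a performance process is a family $U(x,t)$, $t \ge 0$, adapted in $t$ and increasing and concave in the wealth argument $x$, such that $U(X^\pi_t,t)$ is a supermartingale for every admissible strategy $\pi$ and a martingale along the optimal wealth $X^{\pi^*}$; the optimal feedback portfolio is then expressed through the local risk tolerance $r(x,t) = -U_x(x,t)/U_{xx}(x,t)$, which is the quantity linked to equations of fast diffusion type.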
Mihail Zervos - "A Model for Reversible Investment Capacity Expansion":
We consider the problem of determining the optimal investment level that a firm should maintain in the presence of random price and/or demand fluctuations. We model market uncertainty by means of a geometric Brownian motion, and we consider general running payoff functions. Our model allows for capacity expansion as well as for capacity reduction, with each of these actions being associated with proportional costs. The resulting optimisation problem takes the form of a singular stochastic control problem that we solve explicitly. We illustrate our results by means of the so-called Cobb-Douglas production function. The problem that we study presents a model whose associated Hamilton-Jacobi-Bellman equation admits a classical solution that conforms with the underlying economic intuition but does not necessarily coincide with the corresponding value function, which may be identically equal to infinity. Thus, our model provides a situation that highlights the need for rigorous mathematical analysis when addressing stochastic optimisation applications in finance and economics, as well as in other fields.
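Schematically, in generic notation: with market uncertainty $dX_t = \mu X_t\,dt + \sigma X_t\,dW_t$ and capacity $Y_t = y + \xi^+_t - \xi^-_t$ controlled by the increasing processes $\xi^+$ (expansion) and $\xi^-$ (reduction), the problem is to maximize $\mathbb{E}[\int_0^\infty e^{-rt}\pi(X_t,Y_t)\,dt - \int_0^\infty e^{-rt}(c_+\,d\xi^+_t + c_-\,d\xi^-_t)]$ over admissible controls, with a running payoff such as the Cobb-Douglas form $\pi(x,y) = x\,y^\beta$, $\beta \in (0,1)$, as an example.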
Jakub Zwierz - "On Insiders Who Can Stop At Honest Times":
We consider a market with two types of agents possessing different levels of information. In addition to the regular agent, there is an insider whose additional knowledge consists of being able to stop at an honest time $\Lambda$. We show, using the multiplicative decomposition of the Azéma supermartingale, that if the martingale part of the price process has the predictable representation property and $\Lambda$ satisfies some mild assumptions, then there is no equivalent local martingale measure for the insider. This extends the results obtained by P. Imkeller to the continuous semimartingale setting and to general honest times.