# Schedule of the Workshop "Stochastic Optimization - Models and Algorithms"

## Monday, May 27

## Tuesday, May 28

## Wednesday, May 29

| Time | Speaker and title |
| --- | --- |
| 09:15 - 09:50 | Alejandro Jofré (University of Chile): Robust stability of economic equilibrium |
| 09:50 - 10:25 | Terry Rockafellar (University of Washington and University of Florida): Economic equilibrium with incomplete financial markets, revisited |
| 10:30 - 11:00 | Coffee break |
| 11:00 - 11:35 | Igor Evstigneev (University of Manchester): Mathematical behavioral finance |
| 11:35 - 12:10 | Sjur Flåm (University of Bergen): Reaching market equilibrium by bilateral barters |
| 12:10 - 12:45 | Georg Pflug (University of Vienna): Time consistency in stochastic optimization |
| 12:45 - 14:45 | Lunch break |
| 14:45 - 15:20 | Bertrand Villeneuve (Université Paris-Dauphine): Commodity storage with durable shocks: a simple Markovian model |
| 15:20 - 15:55 | Ivar Ekeland (Université Paris-Dauphine): Spot prices and future prices for commodities: an equilibrium model |
| 16:00 - 16:30 | Coffee break |
| 16:30 - 17:05 | Roger Wets (University of California-Davis): Modeling and estimating commodity prices: copper prices |

## Abstracts

Michel De Lara: Smart power systems, renewable energies and markets: the optimization challenge

Abstract: Electrical power systems are undergoing a deep and fast transformation, fueled by the penetration of renewable energies, telecommunication technologies, and the expansion of markets. We discuss to what extent optimization is challenged by this transformation. We shed light on the two main new issues in stochastic control, in comparison with deterministic control: risk attitudes and online information. We then present two snapshots highlighting ongoing research in the field of stochastic control applied to energy:

* Decomposition-coordination optimization methods under uncertainty, with an illustration on multiple dams management;

* Risk constraints in optimization, with the example of a dam management under a tourist chance constraint.

Teemu Pennanen: Convex duality in stochastic optimization and finance

Roger Wets: Computing equilibrium in a stochastic environment

Abstract: We describe a method for solving deterministic and stochastic Walras equilibrium models, based on associating with the given problem a bifunction whose maxinf-points turn out to be equilibrium points. The numerical procedure relies on an augmentation of this bifunction. In the dynamic versions of our models we are mostly concerned with models that equip the agents with a mechanism to transfer goods from one time period to the next, possibly simply through savings, but that also allow for the transformation of goods via production. The last section deals with a simple instance in which a financial market is operating.

Julio Backhoff: On portfolio delegation with moral hazard under translation invariance

Joaquin Fontbona: Robust portfolio optimization without model compactness

Mareen Benk: Intertemporal asset liability management with jumps

Abstract: I have developed an intertemporal portfolio choice model with jump risks. It can be applied to pension and life insurance funds, as well as private investors. Following the model of Rudolf and Ziemba (2004), these long-term investors aspire to "maximize the intertemporal expected utility of the surplus", which is defined as "assets net of liabilities". Returns on liabilities are modelled by a typical pure-diffusion process. Returns on assets are assumed to follow a jump-diffusion process with two jump components. More specifically, the first jump component represents a systemic risk, following Das and Uppal (2004), and the second jump component represents an idiosyncratic risk, following Jarrow and Rosenfeld (1984). An investor's optimal portfolio consists of three funds: a market portfolio, a liability-hedging portfolio, and a riskless asset. In contrast to the results of Rudolf and Ziemba (2004), the market portfolio not only hedges diffusion risk but also hedges systemic risk and takes idiosyncratic jump risk into account, so that the investor is additionally protected against both systemic and idiosyncratic jump risk.

Woo Chang Kim: When is the 1/n strategy optimal?

William Ziemba: Response to Paul A. Samuelson letters and papers on the Kelly Capital Growth Investment Strategy

Abstract: The Kelly Capital Growth Investment Strategy (KCGIS) is to maximize the expected utility of final wealth with a logarithmic utility function. This approach dates to Bernoulli's 1738 suggestion of log as the utility function, arguing that marginal utility is proportional to the reciprocal of current wealth. In 1956 Kelly showed that static expected log maximization yields the maximum asymptotic long-run growth. Later, others added further good properties, such as minimizing the time to reach large asymptotic goals, maximizing the median, and being ahead on average after the first period. But there are bad properties as well, such as extremely large bets in short-term favorable investment situations, because the Arrow-Pratt risk aversion index is near zero. Paul Samuelson was a critic of this approach, and here we discuss his various points, sent in letters to Ziemba and in papers reprinted in the recent book MacLean, Thorp and Ziemba (2011). Samuelson's opposition has prevented many finance academics and professionals from using Kelly strategies. For example, Ziemba was asked to explain this to Fidelity Investments, a major Boston investment firm close to and influenced by Samuelson at MIT. I agree that these points of Samuelson are correct and argue that they all make sense, and I caution users of this approach to be careful and to understand the true characteristics of these investments, including ways to lower the investment exposure.
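For readers unfamiliar with the criterion discussed above, a minimal sketch of the Kelly fraction for the textbook binary bet may help; the numbers below are purely illustrative and not taken from the talk.

```python
# Kelly criterion for a binary bet: win probability p, net odds b
# (gain b per unit staked on a win, lose the stake on a loss).
# Maximizing expected log wealth gives the fraction f* = p - (1 - p) / b.

def kelly_fraction(p: float, b: float) -> float:
    """Fraction of wealth to stake that maximizes expected log growth."""
    return p - (1.0 - p) / b

# Example: 60% chance to double the stake (even odds, b = 1):
f = kelly_fraction(0.6, 1.0)  # 0.6 - 0.4 = 0.2, i.e. bet 20% of wealth
```

The abstract's point about "extremely large bets" is visible here: as the edge p grows or the odds improve, f* can become a large share of wealth, which motivates the fractional Kelly strategies discussed elsewhere in the workshop.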

Leonard MacLean: Capital growth with security and drawdown penalties

Abstract: In capital growth under uncertainty, an investor must determine how much capital to invest in riskless and risky instruments at each point in time, with a focus on the trajectory of accumulated capital to a planning horizon. In this paper the traditional capital growth model and modifications to control risk are developed. A mixture model based on Markov transitions between normally distributed market regimes is used for the dynamics of asset prices. Decisions on investment in assets are based on a constrained growth model, where the trajectory of wealth is required to exceed a specified path over time with high probability, and the path violations are penalized using a convex loss function.
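The regime-switching return dynamics described above can be sketched as follows. The two regimes, their parameters, and the transition probabilities are invented for illustration; they are not the paper's calibration.

```python
import random

# Two-regime Markov mixture for asset returns (illustrative sketch):
# each period the market is in a "bull" or "bear" regime, returns are
# normal with regime-dependent mean and volatility, and the regime
# evolves as a Markov chain.

TRANSITION = {"bull": {"bull": 0.9, "bear": 0.1},
              "bear": {"bull": 0.2, "bear": 0.8}}
PARAMS = {"bull": (0.01, 0.03), "bear": (-0.02, 0.06)}  # (mean, stdev)

def simulate_returns(n_periods: int, start: str = "bull", seed: int = 0):
    """Simulate a return path driven by Markov transitions between regimes."""
    rng = random.Random(seed)
    regime, returns = start, []
    for _ in range(n_periods):
        mu, sigma = PARAMS[regime]
        returns.append(rng.gauss(mu, sigma))
        # draw the next regime from the current regime's transition row
        regime = "bull" if rng.random() < TRANSITION[regime]["bull"] else "bear"
    return returns

path = simulate_returns(12)
```

Simulated paths of this kind are the raw material on which a constrained growth model can penalize trajectories that fall below a target wealth path.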

Mark Davis: Fractional Kelly strategies

Abstract: The Kelly criterion and fractional Kelly strategies hold an important place in investment management theory and practice. Both the Kelly criterion and fractional Kelly strategies (e.g., investing a fraction of one's wealth in the Kelly portfolio and the remainder in the risk-free asset) are optimal in the continuous-time setting of the Merton [33] model. However, fractional Kelly strategies are no longer optimal when the basic assumptions of the Merton model, such as the lognormality of asset prices, are removed. In this talk, we present an overview of some recent developments related to Kelly investment strategies in an incomplete market environment where asset prices are not lognormally distributed. We show how the definition of fractional Kelly strategies can be extended to guarantee optimality. The key idea is to make the definition of fractional Kelly strategies coincide with the fund separation theorem related to the problem at hand. In these instances, fractional Kelly investment strategies appear as the natural solution for investors seeking to maximize the terminal power utility of their wealth.

Sébastien Lleo: Does the Bond-Stock Earning Yield Differential Model Predict Equity Market Corrections Better Than High PE Models?

Jan Palczewski: Theoretical and empirical estimates of mean-variance portfolio sensitivity

Abstract: This paper studies properties of an estimator of mean-variance portfolio weights in a market model with multiple risky assets and a riskless asset. Theoretical formulas for the mean square error are derived in the case when asset excess returns are multivariate normally distributed and serially independent. The sensitivity of the portfolio estimator to errors arising from the estimation of the covariance matrix and the mean vector is quantified. It turns out that the relative contribution of the covariance matrix error depends mainly on the Sharpe ratio of the market portfolio and the sampling frequency of the historical data. Theoretical studies are complemented by an investigation of the distribution of the portfolio estimator for empirical datasets. An appropriately crafted bootstrapping method is employed to compute the empirical mean square error.
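The estimator whose sensitivity is analyzed above is, in its simplest form, the plug-in rule w = Σ⁻¹μ with sample estimates substituted for Σ and μ. The two-asset sketch below illustrates that construction; all numbers are invented for illustration and carry no connection to the paper's datasets.

```python
import random

# Plug-in mean-variance weights in the two-asset case: estimate the mean
# vector and covariance matrix from a sample of excess returns, then form
# w_hat = Sigma_hat^{-1} mu_hat (unnormalized weights), written out
# explicitly for a 2x2 covariance matrix.

def plugin_weights(returns):
    """returns: list of (r1, r2) excess-return pairs; yields (w1, w2)."""
    n = len(returns)
    m1 = sum(r[0] for r in returns) / n
    m2 = sum(r[1] for r in returns) / n
    # sample covariance entries (unbiased, divisor n - 1)
    s11 = sum((r[0] - m1) ** 2 for r in returns) / (n - 1)
    s22 = sum((r[1] - m2) ** 2 for r in returns) / (n - 1)
    s12 = sum((r[0] - m1) * (r[1] - m2) for r in returns) / (n - 1)
    det = s11 * s22 - s12 * s12
    # w = Sigma^{-1} mu via the explicit 2x2 inverse
    return ((s22 * m1 - s12 * m2) / det, (s11 * m2 - s12 * m1) / det)

rng = random.Random(7)
sample = [(rng.gauss(0.05, 0.2), rng.gauss(0.03, 0.15)) for _ in range(500)]
w = plugin_weights(sample)  # noisy: a new sample gives different weights
```

Re-running with different seeds shows the estimation noise the paper quantifies: both the mean vector and the covariance estimate perturb the resulting weights.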

Mikhail Zhitlukhin, William Ziemba: A disorder model and its use to determine exit and entry strategies in various financial bubble markets

Abstract: In this paper, the authors apply a continuous-time stochastic process model developed by Shiryaev and Zhitlukhin for optimal stopping of random price processes that appear to be bubbles. By a bubble we mean that the rising price is largely based on the expectation of higher and higher future prices. Futures traders such as George Soros attempt to trade such markets. The idea is to exit near the peak from a starting long position. The model applies equally well on the short side, that is, when to enter and exit a short position. In this paper we test the model in two technology markets: the price of Apple computer stock AAPL at various times in 2009-2012 after the local low of March 6, 2009, and a market where it is known that the generally very successful bubble trader George Soros lost money by shorting the NASDAQ-100 stock index too soon in 2000. The Shiryaev-Zhitlukhin model provides good exit points in both situations that would have been profitable to speculators who employed the model.

Igor Evstigneev: Mathematical behavioral finance

Abstract: The purpose of the talk is to present a new research area, Mathematical Behavioral Finance. Its characteristic feature is the systematic application of behavioral approaches combined with the mathematical modeling of financial markets. The focus of the work is on fundamental questions and problems pertaining to Finance and Financial Economics, especially those related to equilibrium asset pricing and portfolio selection. The models under study reflect the psychology of market participants and go beyond the traditional paradigm of fully rational utility maximization. They do not rely upon restrictive hypotheses such as perfect foresight and avoid using unobservable agents' characteristics such as individual utilities and beliefs. The theory developed may be regarded as a plausible alternative to the classical general equilibrium theory (Walras, Arrow, Debreu, Radner, and others), responding to the challenges of today's economic and financial reality.

Sjur Flåm: Reaching market equilibrium by bilateral barters

Bertrand Villeneuve: Commodity storage with durable shocks: A simple Markovian model

Abstract: We model an economy that alternates randomly between abundance and scarcity episodes. We characterize in detail the structure of the Markovian competitive equilibrium. Accumulation and drainage of stocks are the main focuses. Economically appealing comparative statics results are proved. We also characterize the stationary distribution of states. We extend the model to discuss price stabilization policies, injection and release costs, and limited storage capacity. Overall, the analysis delineates the notion of “flexible economy.”

Ivar Ekeland: Spot prices and future prices for commodities: an equilibrium model

Roger Wets: Modeling and estimating commodity prices: copper prices

Abstract: Many optimization problems consider copper prices as an important input. Prices are highly volatile and no one can predict them, but they can be estimated, and scenario trees can be constructed to handle their stochasticity. In this talk we present a new methodology for modeling and estimating copper prices, based on two novel approaches. The first is to make a distinction between the short and the long term (also known as the transient and the stationary processes), because the evidence shows that in the short term prices are highly volatile around a drift term, whereas in the long term prices are mean-reverting, as microeconomic theory suggests. The second approach is to include market information to complement the historical data in the estimation of the drift term of the short-term model. The use of market prices is important because they contain all the information available in the markets at a given time (stocks, expectations, etc.), so their inclusion should help to capture the real drift term of our transient process. Finally, our model also takes inflation into account, which leads to a multi-dimensional (nonlinear) system for which we can generate explicit solutions.
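The short-term/long-term decomposition described above can be sketched with a simple discrete-time simulation: a latent mean-reverting component pulled toward a long-run level, observed through additional transient noise. All parameter values below are invented for illustration and are not estimates from the talk.

```python
import random

# Illustrative two-scale price model: the latent price mean-reverts toward
# a long-run level p_bar at speed kappa (stationary component), while the
# observed price adds extra short-term noise (transient component).

def simulate_price(p0, p_bar, kappa, sigma_long, sigma_short, n, seed=1):
    """Simulate n steps of a mean-reverting latent price with observation noise."""
    rng = random.Random(seed)
    latent, observed = p0, [p0]
    for _ in range(n):
        pull = kappa * (p_bar - latent)              # mean-reversion toward p_bar
        latent += pull + rng.gauss(0, sigma_long)     # long-term (stationary) dynamics
        observed.append(latent + rng.gauss(0, sigma_short))  # short-term noise on top
    return observed

prices = simulate_price(p0=4.0, p_bar=3.0, kappa=0.1,
                        sigma_long=0.05, sigma_short=0.15, n=24)
```

Paths generated this way (or from a calibrated version of the model) are exactly the kind of input from which scenario trees for stochastic optimization can be built.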