Open Access Publications from the University of California

UC Berkeley Electronic Theses and Dissertations

Dynamic Bayesian learning and optimization in portfolio choice models


We develop two dynamic Bayesian portfolio allocation models that address questions of learning and model uncertainty by taking model-specific shortcomings into account.

In our first model, we formulate a multi-period portfolio choice problem in which the investor is uncertain about the model's parameters, can learn them over time from observed asset returns, and is also concerned about robustness. To address these concerns, we introduce an objective function that can be regarded as a Bayesian version of relative regret. We characterize the optimal portfolio and show that it involves a "tilted" posterior, where the tilting is defined in terms of a family of stochastic benchmarks. We find that this model performs at least as well as a benchmark endowed with the true market parameters, and outperforms it when the market assets share the same trend.
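The learning component described above can be illustrated with a simple conjugate sketch. This is not the thesis's model; it only shows, under hypothetical names and a known return covariance, how a Bayesian investor's beliefs about the mean return vector tighten as returns are observed.

```python
import numpy as np

# Illustrative sketch (not the thesis's model): an investor holds a normal
# prior on the unknown mean return vector mu and updates it after each
# observed return, assuming a known return covariance Sigma
# (conjugate normal-normal update).
def posterior_update(prior_mean, prior_cov, Sigma, observed_return):
    """One-step Bayesian update of beliefs about mu given one return draw."""
    # Posterior precision is the sum of prior and likelihood precisions.
    prior_prec = np.linalg.inv(prior_cov)
    lik_prec = np.linalg.inv(Sigma)
    post_cov = np.linalg.inv(prior_prec + lik_prec)
    # Posterior mean is the precision-weighted average of prior and data.
    post_mean = post_cov @ (prior_prec @ prior_mean + lik_prec @ observed_return)
    return post_mean, post_cov

# Example with two assets (all numbers hypothetical): beliefs move toward
# the observed return and uncertainty shrinks.
mu0 = np.array([0.05, 0.08])   # prior mean returns
P0 = np.eye(2) * 0.04          # prior uncertainty about mu
Sigma = np.eye(2) * 0.02       # return noise covariance
r = np.array([0.10, 0.02])     # one observed return vector
mu1, P1 = posterior_update(mu0, P0, Sigma, r)
```

Iterating this update over successive periods is what lets the investor learn the market parameters from data; the thesis's contribution is to combine such learning with a robustness objective.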

Our second model extends the Black-Litterman portfolio choice model to account for several potential sources of error. We extend Black-Litterman to multiple periods, which lets us pair each expert forecast with the subsequently realized return. By doing so, we can perform inference on the experts and detect whether they are biased for or against specific assets. We can perform similar inference on the market equilibrium distribution, which is typically represented by the capital asset pricing model (CAPM). The resulting model is analytically intractable but can be solved numerically via Gibbs sampling. Controlled tests show that our model performs favorably when Black-Litterman's assumptions about the market equilibrium and expert views are violated, and backtests shed light on the model's ability to account for CAPM's shortcomings.
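The Gibbs-sampling idea can be sketched in a toy version of the inference problem. This is a deliberately simplified stand-in for the thesis's model (all variance parameters assumed known, variable names hypothetical): realized returns are drawn around a market mean m, expert forecasts around m plus an additive bias b, and the sampler alternates between the two normal full conditionals.

```python
import numpy as np

# Toy Gibbs sampler (simplified relative to the thesis): infer an expert's
# additive forecast bias b and the market mean return m from paired data.
# Assumed model:
#   realized returns  r_t ~ N(m, s_r^2)
#   expert forecasts  f_t ~ N(m + b, s_f^2)
# with normal priors m ~ N(0, tau_m^2), b ~ N(0, tau_b^2).
def gibbs_expert_bias(r, f, s_r=0.02, s_f=0.02, tau_m=0.1, tau_b=0.1,
                      n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    T = len(r)
    m, b = 0.0, 0.0
    draws = []
    for _ in range(n_iter):
        # Sample m from its normal full conditional given b:
        # both r_t and (f_t - b) are noisy observations of m.
        prec_m = 1 / tau_m**2 + T / s_r**2 + T / s_f**2
        mean_m = (r.sum() / s_r**2 + (f - b).sum() / s_f**2) / prec_m
        m = rng.normal(mean_m, 1 / np.sqrt(prec_m))
        # Sample b from its normal full conditional given m:
        # (f_t - m) are noisy observations of b.
        prec_b = 1 / tau_b**2 + T / s_f**2
        mean_b = (f - m).sum() / s_f**2 / prec_b
        b = rng.normal(mean_b, 1 / np.sqrt(prec_b))
        draws.append((m, b))
    return np.array(draws)

# Example: simulate an expert who forecasts about 2% above the true mean.
rng = np.random.default_rng(1)
r = rng.normal(0.05, 0.02, size=200)
f = rng.normal(0.07, 0.02, size=200)
draws = gibbs_expert_bias(r, f)
m_hat, b_hat = draws[500:].mean(axis=0)  # posterior means after burn-in
```

The actual model in the dissertation has no such closed-form conditionals throughout, which is why Gibbs sampling (rather than a fully analytical posterior) is needed; this sketch only conveys the alternating-conditional mechanism.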
