Error estimate and convergence analysis of moment-preserving discrete approximations of continuous distributions

Published Web Location

https://doi.org/10.1063/1.4903706
Abstract

The maximum entropy principle is a powerful tool for solving underdetermined inverse problems. This paper considers the problem of discretizing a continuous distribution, which arises in various applied fields. We obtain the approximating distribution by minimizing the Kullback-Leibler information (relative entropy) of the unknown discrete distribution relative to an initial discretization based on a quadrature formula, subject to a set of moment constraints. We study the theoretical error bound and the convergence of this approximation method as the number of discrete points increases. We prove that (i) the theoretical error bound on the approximate expectation of any bounded continuous function is of at most the same order as that of the quadrature formula we start with, and (ii) the approximate discrete distribution converges weakly to the given continuous distribution. Moreover, we present numerical examples that illustrate the advantages of the method, and we apply it to the numerical solution of an optimal portfolio problem.

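To make the construction described in the abstract concrete, the following is a minimal sketch of a moment-preserving discretization obtained by minimizing the Kullback-Leibler divergence to an initial quadrature-based discretization. It assumes a standard normal target, a simple midpoint-rule initial discretization, four preserved moments, and SciPy's SLSQP solver; these choices are illustrative assumptions, not the quadrature formula or algorithm used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Illustrative setup (assumptions, not the paper's construction):
# discretize a standard normal on n points while preserving its first K moments.
n, K = 21, 4
x = np.linspace(-4.0, 4.0, n)      # discrete support points
h = x[1] - x[0]
q = norm.pdf(x) * h                # initial weights from a midpoint-rule quadrature
q /= q.sum()

# Exact moments of the target distribution (standard normal): 0, 1, 0, 3, ...
target_moments = np.array([norm.moment(k) for k in range(1, K + 1)])

# Kullback-Leibler divergence of the unknown weights p relative to the initial weights q.
def kl(p):
    return np.sum(p * np.log(p / q))

# Constraints: weights sum to one and reproduce the target moments.
constraints = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
for k in range(1, K + 1):
    constraints.append({"type": "eq",
                        "fun": lambda p, k=k: np.dot(p, x**k) - target_moments[k - 1]})

res = minimize(kl, q, bounds=[(1e-12, 1.0)] * n, constraints=constraints)
p = res.x
print("moments of discrete approximation:", [np.dot(p, x**k) for k in range(1, K + 1)])
```

With these weights, the expectation of a bounded continuous function f under the continuous distribution can be approximated by the weighted sum `np.dot(p, f(x))`; the paper's results concern the error and weak convergence of exactly this kind of approximation as the number of points grows.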