Open Access Publications from the University of California

Quantum and Fisher information from the Husimi and related distributions

  • Author(s): Slater, Paul B
  • et al.

The two principal and immediate influences upon the undertaking of this study, which we seek to interrelate here, are the papers of Zyczkowski and Slomczynski [J. Phys. A 34, 6689 (2001)] and of Petz and Sudar [J. Math. Phys. 37, 2262 (1996)]. In the former work, a metric (specifically, the Monge metric) over generalized Husimi distributions was employed to define a distance between two arbitrary density matrices. In the Petz-Sudar work (completing a program of Chentsov), the quantum analog of the classically unique Fisher information (monotone) metric of a probability simplex was extended to define an uncountable infinitude of Riemannian (also monotone) metrics on the set of positive definite density matrices. We pose here the questions of what the specific, unique Fisher information metric is for the (classically defined) Husimi distributions, and how it relates to the infinitude of (quantum) metrics of Petz and Sudar over the density matrices. We find a highly proximate (small relative entropy) relationship between the probability distribution (the quantum Jeffreys prior) that yields quantum universal data compression and the one which (following Clarke and Barron) gives its classical counterpart. We also investigate the Fisher information metrics corresponding to the escort Husimi, positive-P, and certain Gaussian probability distributions, as well as, in some sense, the discrete Wigner pseudoprobability. The comparative noninformativity of prior probability distributions, recently studied by Srednicki [Phys. Rev. A 71, 052107 (2005)], formed by normalizing the volume elements of the various information metrics, is also discussed in our context. (c) 2006 American Institute of Physics.
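For orientation, the two standard definitions the abstract builds on can be written as follows; the notation here is generic textbook notation, not necessarily that of the paper:

```latex
% Classical Fisher information metric on a parametrized family p(x|theta):
g_{ij}(\theta) \;=\; \int p(x\mid\theta)\,
  \frac{\partial \ln p(x\mid\theta)}{\partial \theta_i}\,
  \frac{\partial \ln p(x\mid\theta)}{\partial \theta_j}\, \mathrm{d}x,
\qquad
% Husimi distribution of a density matrix rho over coherent states |alpha>:
Q_{\rho}(\alpha) \;=\; \frac{1}{\pi}\,\langle \alpha \mid \rho \mid \alpha \rangle .
```

Applying the first formula to the family of Husimi distributions $Q_{\rho(\theta)}$, with $\theta$ parametrizing the density matrices, yields the classical Fisher information metric for the Husimi distributions whose relation to the Petz-Sudar family of quantum monotone metrics the paper investigates.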

