Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding
- Author(s): Gibson, Jerry D.; Mahadevan, Preethi; et al.
Published Web Location: https://doi.org/10.3390/e19090496
We provide a new derivation of the log likelihood spectral distance measure for signal processing using the logarithm of the ratio of entropy rate powers. Using this interpretation, we show that the log likelihood ratio is equivalent to the difference of two differential entropies, and further that it can be written as the difference of two mutual informations. These latter two expressions extend the analysis of signals via the log likelihood ratio beyond spectral matching to the study of their statistical quantities of differential entropy and mutual information. Examples from speech coding are presented to illustrate the utility of these new results, which make the log likelihood ratio of interest in applications beyond spectral matching for speech.
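The log likelihood ratio the abstract refers to can be illustrated numerically. The sketch below, which is our own illustration and not code from the paper, uses the standard Itakura-style form d = ln(a_ref^T R a_ref / a_test^T R a_test), where R is the Toeplitz autocorrelation matrix of the test frame and a_ref, a_test are LPC coefficient vectors; for a Gaussian AR source, the two quadratic forms are the minimum prediction error energies that play the role of the entropy rate powers whose log ratio the paper analyzes. All function names here are hypothetical helpers.

```python
import numpy as np

def autocorr(x, order):
    """Biased autocorrelation estimates r[0..order] of a frame x."""
    n = len(x)
    return np.array([x[:n - k] @ x[k:] / n for k in range(order + 1)])

def levinson(r, order):
    """Levinson-Durbin recursion.
    Returns the LPC vector a = [1, a1, ..., ap] minimizing the residual
    energy a^T R a, plus that minimum prediction error (the entropy-rate-
    power surrogate for a Gaussian AR source)."""
    a = np.array([1.0])
    err = r[0]
    for i in range(1, order + 1):
        # Reflection coefficient from the current prediction residual.
        acc = r[i] + a[1:] @ r[i - 1:0:-1]
        k = -acc / err
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        err *= 1.0 - k * k
    return a, err

def llr_distance(a_ref, a_test, r_test):
    """Log likelihood ratio: log of the ratio of residual energies
    when each LPC model inverse-filters the test frame (Itakura form)."""
    p = len(a_test)
    R = np.array([[r_test[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.log((a_ref @ R @ a_ref) / (a_test @ R @ a_test))
```

Because the Levinson solution minimizes a^T R a over all vectors with leading coefficient 1, the distance is zero when the reference model equals the frame's own LPC model and strictly positive for any other same-order model, matching its use as a spectral distortion measure in speech coding.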