Abstract—Minimum cross-entropy estimation extends maximum likelihood estimation for multinomial probabilities. Given a probability distribution {r_i} (i = 1, …, k), we show in this paper that the monotonic minimum cross-entropy estimates {p_i} (i = 1, …, k) of the distribution are each given by the simple average of the given distribution values over a block of consecutive indices. The results extend to monotonic estimation for multivariate outcomes under generalized cross-entropy. These estimates are the exact solution of the corresponding constrained optimization and coincide with the monotonic least-squares estimates. A non-parametric algorithm for the exact solution is proposed. The algorithm is compared to the "pool adjacent violators" algorithm for the isotonic regression problem in the least-squares case. Applications to monotonic estimation of migration matrices and of risk scales for multivariate outcomes are discussed.
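For context on the "pool adjacent violators" (PAV) algorithm mentioned in the abstract, the following is a minimal illustrative sketch in the least-squares setting: violating adjacent blocks are repeatedly merged, so each fitted value ends up being the (weighted) average of the inputs over a block of consecutive indices, matching the structure of the estimates described above. The function name `pava` and its interface are illustrative, not taken from the paper.

```python
def pava(r, w=None):
    """Non-decreasing isotonic regression by pool adjacent violators.

    Minimizes sum_i w_i * (p_i - r_i)^2 subject to p_1 <= ... <= p_k.
    Each fitted p_i is the weighted average of r over a block of
    consecutive indices, as described in the abstract.
    """
    if w is None:
        w = [1.0] * len(r)
    # Each block is [weighted mean, total weight, number of points pooled].
    blocks = []
    for ri, wi in zip(r, w):
        blocks.append([float(ri), float(wi), 1])
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    # Expand each pooled block back to one fitted value per input point.
    p = []
    for mean, _, count in blocks:
        p.extend([mean] * count)
    return p
```

For example, `pava([3.0, 1.0, 2.0])` pools all three points into one block with mean 2.0, while `pava([1.0, 3.0, 2.0])` pools only the last two points at their average 2.5.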
Index Terms—Maximum likelihood, cross-entropy, least squares, isotonic regression, constrained optimization, multivariate risk scales.
Bill Huajian Yang is with Royal Bank of Canada, Canada (e-mail: hy02@yahoo.ca).
Cite: Bill Huajian Yang, "Monotonic Estimation for Probability Distribution and Multivariate Risk Scales by Constrained Minimum Generalized Cross-Entropy," International Journal of Machine Learning and Computing vol. 9, no. 4, pp. 506-512, 2019.
Copyright © 2019 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.