Divergence minimization under prior inequality constraints

I. Csiszár, G. Tusnády, M. Ispány, E. Verdes, G. Michaletzky, T. Rudas

Research output: Contribution to journal › Article

1 Citation (Scopus)


Motivated by problems in robust statistics, we first give a simple proof of the following: given a probability measure P and positive measures μ ≤ ν, the γ-divergence from P of probability measures Q satisfying μ ≤ Q or μ ≤ Q ≤ ν is minimized by an explicitly determined Q* not depending on (the convex function) γ. Next, we address γ-divergence minimization under the above inequality constraints and additional moment constraints.
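The γ-independence of the minimizer can be illustrated numerically on a finite alphabet. The sketch below is an assumption-laden reconstruction, not the paper's derivation: for the lower-bound constraint μ ≤ Q alone, a KKT argument for strictly convex γ suggests a minimizer of the clipped form Q*ᵢ = max(μᵢ, t·pᵢ), with t chosen by bisection so that Q* sums to 1. The code checks that this single Q* beats random feasible distributions for two different convex functions γ (Kullback-Leibler and chi-square).

```python
import numpy as np

def minimizing_q(p, mu, iters=200):
    """Candidate minimizer for: min over Q >= mu of the gamma-divergence from P.

    Hypothetical reconstruction (assumption, not taken from the paper):
    Q*_i = max(mu_i, t * p_i), with t found by bisection so sum(Q*) = 1.
    Requires sum(mu) <= 1 (feasibility) and p > 0.
    """
    lo, hi = 0.0, 1.0  # at t = 1, sum(max(mu, p)) >= sum(p) = 1
    for _ in range(iters):
        t = 0.5 * (lo + hi)
        if np.maximum(mu, t * p).sum() > 1.0:
            hi = t
        else:
            lo = t
    return np.maximum(mu, 0.5 * (lo + hi) * p)

def divergence(q, p, gamma):
    """gamma-divergence D(Q, P) = sum_i p_i * gamma(q_i / p_i)."""
    return float(np.sum(p * gamma(q / p)))

rng = np.random.default_rng(0)
p = rng.random(6)
p /= p.sum()
mu = 0.5 * rng.random(6) * p          # pointwise lower bound, sum(mu) < 1
q_star = minimizing_q(p, mu)

kl   = lambda r: r * np.log(r)        # gamma for Kullback-Leibler divergence
chi2 = lambda r: (r - 1.0) ** 2       # gamma for chi-square divergence

# The same Q* should minimize both divergences over feasible Q >= mu.
for _ in range(20):
    q = mu + (1.0 - mu.sum()) * rng.dirichlet(np.ones(6))  # random feasible Q
    assert divergence(q_star, p, kl) <= divergence(q, p, kl) + 1e-9
    assert divergence(q_star, p, chi2) <= divergence(q, p, chi2) + 1e-9
```

The clipped form makes the γ-independence visible: wherever the bound is inactive, Q* is simply proportional to P, and the convexity of γ enters only through the monotonicity of its derivative, not its specific shape.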

Original language: English
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
Publication status: Published - Jan 1 2001


ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics
