On ϕ-Divergence and Its Applications

The Shannon entropy and the associated Kullback-Leibler divergence (relative entropy) between probability measures are fundamental from an applications standpoint and arise naturally from statistical concepts. Initially confined to information theory and statistics in the work of Shannon [38] and Kullback and Leibler [31] in the early 1950s, the concept of entropy came to be used in optimization models for various problems in engineering and management science. Early work on entropy optimization problems over linear constraint sets (equality or inequality) was carried out by Charnes and Cooper [16] via convex programming techniques. Many other useful applications to a diversity of problems, such as traffic engineering, game theory, information theory, and marketing, were developed by Charnes et al. (see, e.g., [15,17-19] and the references therein). For further general information and applications, we refer the reader to Frieden [23] and Kay and Marple [30] for engineering problems, and to Lev and Theil [32] and, more recently, Theil and Fiebig [39] for economic and finance models.
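As a brief illustration of the divergence just mentioned, the following sketch computes the Kullback-Leibler divergence D(p || q) = Σ p_i log(p_i / q_i) between two finite discrete distributions; the function name and the usual conventions (0 · log(0/q) = 0, and D = ∞ when some q_i = 0 with p_i > 0) are our own choices, not part of the original text.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Conventions: terms with p_i = 0 contribute 0; the divergence is
    infinite if q_i = 0 while p_i > 0.
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue            # 0 * log(0 / q) = 0 by convention
        if qi == 0.0:
            return math.inf     # p is not absolutely continuous w.r.t. q
        d += pi * math.log(pi / qi)
    return d

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # strictly positive, since p != q
print(kl_divergence(p, p))  # 0.0: the divergence vanishes iff p = q
```

Note that D(p || q) ≥ 0 with equality exactly when p = q, but it is not symmetric in its arguments, so it is not a metric.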