Advanced Generalizations of Convex Function Inequalities: Implications for High-Order Divergence and Entropy Estimation in Information Theory
DOI: https://doi.org/10.37256/cm.6420255733

Keywords: Jensen's inequality, Lah-Ribaric inequality, Lidstone identity, information theory, Zipf-Mandelbrot law

Abstract
Inequalities involving convex functions have many applications in analysis, and in recent years they have been used to estimate many of the entropies and divergences that arise in information theory. In this paper, an inequality built from two inequalities, the Jensen inequality and the Lah-Ribaric inequality, is considered, and its non-negative differences are studied. Two identities, the Abel-Gontscharoff identity and the Montgomery identity, are applied, one at a time, to these non-negative differences to construct new identities. These identities are then used to generalize the inequality to higher-order convex functions. Furthermore, as applications in information theory, the generalized results are used to estimate the Csiszár divergence, the Shannon entropy, the Kullback-Leibler divergence, and the Zipf-Mandelbrot law.
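For orientation, the standard forms of the objects named in the abstract are recalled below in a minimal sketch. The notation (weights p_i, points x_i, distributions p and q, Zipf-Mandelbrot parameter t) is generic and not necessarily the paper's; the paper's precise assumptions may differ.

% Jensen and Lah-Ribaric inequalities for a convex function f on [a,b],
% with weights p_i >= 0, P_n = \sum_i p_i > 0, and points x_i in [a,b].
\[
  f\!\left(\frac{1}{P_n}\sum_{i=1}^{n} p_i x_i\right)
  \;\le\; \frac{1}{P_n}\sum_{i=1}^{n} p_i f(x_i)
  \;\le\; \frac{b-\bar{x}}{b-a}\,f(a) + \frac{\bar{x}-a}{b-a}\,f(b),
  \qquad \bar{x} = \frac{1}{P_n}\sum_{i=1}^{n} p_i x_i .
\]
% The left inequality is Jensen's and the right one is Lah-Ribaric's; the
% non-negative differences between the middle term and the two outer terms
% are the kind of quantities the paper generalizes.
%
% Standard information-theoretic quantities mentioned in the abstract, for
% discrete probability distributions p = (p_i), q = (q_i) and convex f with f(1) = 0:
\[
  D_f(\mathbf{p}\,\|\,\mathbf{q}) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad
  H(\mathbf{p}) = -\sum_i p_i \log p_i, \qquad
  D_{\mathrm{KL}}(\mathbf{p}\,\|\,\mathbf{q}) = \sum_i p_i \log\frac{p_i}{q_i}.
\]
% Zipf-Mandelbrot law with N ranks, shift t >= 0, and exponent s > 0:
\[
  f(i; N, t, s) = \frac{(i+t)^{-s}}{\sum_{k=1}^{N} (k+t)^{-s}}, \qquad i = 1, \dots, N .
\]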
License
Copyright (c) 2025 Muhammad Shahid Anwar, et al.

This work is licensed under a Creative Commons Attribution 4.0 International License.