Novel Generalized Additive Information Measure with Coding Theorems and Its Applicability
Abstract
In this article, a novel additive information measure and a new average code-word length are introduced, along with noiseless coding theorems for discrete channels. The measures introduced in this study generalize several well-established measures in information theory. The findings are validated on empirical data through the implementation of Huffman and Shannon-Fano coding schemes. Furthermore, the key properties of the proposed information measure, including its behavior with respect to uncertainty and biasedness, are investigated. Specifically, it is shown that as the measure increases, it captures greater uncertainty and reflects reduced biasedness in the distribution, and vice versa. Finally, the monotonicity of the proposed measure is examined using a real-life data set.
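As context for the coding-theorem validation described above, the classical special case that such generalized measures extend can be sketched as follows: the noiseless coding theorem guarantees that the Huffman average code-word length L of a discrete distribution satisfies H ≤ L < H + 1, where H is the Shannon entropy. This sketch uses only the standard Shannon/Huffman setting; it does not implement the article's proposed measure, and all function names and the sample distribution are illustrative assumptions.

```python
import heapq
import math

def huffman_lengths(probs):
    """Return Huffman code-word lengths for a probability distribution.

    Each heap entry is (probability, tiebreaker, symbol-index list); merging
    two entries increments the code length of every symbol they contain.
    """
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)  # tiebreaker so tuples never compare lists
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

def shannon_entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative distribution (not the article's empirical data).
probs = [0.4, 0.2, 0.2, 0.1, 0.1]
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
H = shannon_entropy(probs)
print(H, L)  # noiseless coding theorem: H <= L < H + 1
```

A generalized measure with its matching average code-word length, as proposed in the article, would be checked against its coding theorem in the same way: compute the code lengths from a scheme such as Huffman or Shannon-Fano and verify the stated bounds.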
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.