Bounds On Triangular Discrimination, Harmonic Mean and Symmetric Chi-square Divergences

Research paper by Inder Jeet Taneja

Indexed on: 11 May '05
Published on: 11 May '05
Published in: Mathematics - Probability



Abstract

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler relative information and Jeffreys' J-divergence. Measures such as the Bhattacharyya distance, Hellinger discrimination, chi-square divergence, triangular discrimination, and harmonic mean divergence are also well known in the statistics literature. In this paper we obtain bounds on triangular discrimination and symmetric chi-square divergence in terms of the relative information of type s, using Csiszár's f-divergence. A relationship between triangular discrimination and harmonic mean divergence is also given.
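For orientation, the following LaTeX sketch records the definitions of the measures named above as they are usually written in the divergence-measure literature; the paper's own notation and normalizations may differ slightly, so treat these as assumed conventions rather than the paper's exact statements.

% Assumed standard conventions, for probability distributions
% P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); the paper's own
% normalizations may differ.
\[
\Delta(P,Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^2}{p_i+q_i}
\qquad\text{(triangular discrimination)}
\]
\[
\Psi(P,Q)=\chi^2(P,Q)+\chi^2(Q,P)
         =\sum_{i=1}^{n}\frac{(p_i-q_i)^2(p_i+q_i)}{p_i\,q_i}
\qquad\text{(symmetric chi-square divergence)}
\]
\[
W(P,Q)=\sum_{i=1}^{n}\frac{2\,p_i q_i}{p_i+q_i}
\qquad\text{(harmonic mean divergence)}
\]
\[
\Phi_s(P,Q)=\frac{1}{s(s-1)}\left[\sum_{i=1}^{n}p_i^{\,s}q_i^{\,1-s}-1\right],
\quad s\neq 0,1
\qquad\text{(relative information of type } s\text{)}
\]
\[
C_f(P,Q)=\sum_{i=1}^{n}q_i\,f\!\left(\frac{p_i}{q_i}\right),
\quad f \text{ convex},\ f(1)=0
\qquad\text{(Csiszár's } f\text{-divergence)}
\]

With these conventions, a direct expansion shows the connection between the first and third measures: since \((p_i-q_i)^2=(p_i+q_i)^2-4p_iq_i\), one gets \(\Delta(P,Q)=\sum_i(p_i+q_i)-4\sum_i p_iq_i/(p_i+q_i)=2-2W(P,Q)\), i.e. \(\Delta(P,Q)=2\bigl(1-W(P,Q)\bigr)\), which is the kind of relationship between triangular discrimination and harmonic mean divergence the abstract alludes to.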