A Jensen–Gini measure of divergence with application in parameter estimation
Abstract
In the present paper, we define a new measure of divergence between two probability distribution functions F1 and F2, based on the Jensen inequality and the Gini mean difference. The proposed measure, which we call the Jensen–Gini measure of divergence (JG), is symmetric, and its square root is a metric. We show that JG can be represented as a mixture of the Cramér distance (CD) between the two distributions F1 and F2. A generalization of JG for measuring the overall difference among several probability distributions is also proposed. The JG measure of divergence is then applied to estimating the unknown parameters of a probability distribution. We consider a statistical model F(x; θ), where the parameter θ ∈ Θ is assumed to be unknown. Based on a random sample drawn from the distribution, we consider the JG between the distribution F(x; θ) and the empirical estimator of the distribution, and estimate θ as the value in the parameter space Θ that minimizes this divergence. We call this estimator the minimum Jensen–Gini estimator (MJGE). Several properties of the MJGE are investigated: it is shown to belong to the class of generalized estimating equations, and its asymptotic properties, such as consistency and normality, are explored. Simulation studies are performed to evaluate the performance of the MJGE.
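As a rough illustration of the minimum Jensen–Gini estimation procedure described above, the sketch below estimates the scale of an exponential distribution by numerically minimizing a Jensen-type divergence between the model CDF and the empirical CDF. It assumes the construction JG(F1, F2) = GMD((F1+F2)/2) − [GMD(F1) + GMD(F2)]/2, with GMD(F) = 2∫F(x)(1−F(x))dx the Gini mean difference; the exponential model, the simulated sample, the integration grid, and the use of scipy.optimize.minimize_scalar are illustrative choices, not the paper's implementation.

```python
# A minimal sketch of the minimum Jensen-Gini estimator (MJGE), assuming
# JG(F1, F2) = GMD((F1+F2)/2) - [GMD(F1) + GMD(F2)] / 2, where
# GMD(F) = 2 * integral of F(x)(1 - F(x)) dx is the Gini mean difference.
# The exponential model, sample, grid, and optimizer are illustrative.
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize_scalar
from scipy.stats import expon

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=200)  # data with true scale 2.0

# Grid over which the CDFs are compared (covers the sample's support).
grid = np.linspace(0.0, 2.0 * sample.max(), 2000)

def ecdf(x, data):
    """Empirical CDF of `data` evaluated at the points `x`."""
    return np.searchsorted(np.sort(data), x, side="right") / data.size

def gmd(F, x):
    """Gini mean difference of a CDF F on the grid x: 2 * int F(1 - F)."""
    return 2.0 * trapezoid(F * (1.0 - F), x)

def jg(F1, F2, x):
    """Jensen-Gini divergence of two CDFs evaluated on a common grid."""
    mix = 0.5 * (F1 + F2)
    return gmd(mix, x) - 0.5 * (gmd(F1, x) + gmd(F2, x))

Fn = ecdf(grid, sample)

# MJGE: the scale minimizing JG between the model CDF and the empirical CDF.
res = minimize_scalar(
    lambda scale: jg(expon.cdf(grid, scale=scale), Fn, grid),
    bounds=(0.1, 10.0),
    method="bounded",
)
print(f"MJGE estimate of the scale: {res.x:.3f}")
```

Because F(1 − F) is concave in F, the Gini mean difference of the mixture dominates the average of the individual Gini mean differences, so the objective is nonnegative and vanishes only when the two CDFs coincide on the grid, which is what makes its minimizer a sensible estimator of θ.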