Article type: Research Paper
Authors
Department of Statistics, Payame Noor University, Tehran, Iran.
Purpose: In statistical data analysis and modeling, assessing the similarity or divergence between two probability distributions is of great importance. One of the most widely used metrics for this purpose is the Kullback-Leibler (KL) divergence, which quantifies the informational distance between distributions. This study aims to analyze the KL divergence between two normal distributions with equal variance and to compare the performance of different estimation methods for this measure.
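For reference, the equal-variance case admits a simple closed form. Starting from the general Gaussian KL divergence

D_{\mathrm{KL}}\bigl(N(\mu_1,\sigma_1^2)\,\|\,N(\mu_2,\sigma_2^2)\bigr) = \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1-\mu_2)^2}{2\sigma_2^2} - \frac{1}{2},

setting \sigma_1 = \sigma_2 = \sigma gives

D_{\mathrm{KL}}\bigl(N(\mu_1,\sigma^2)\,\|\,N(\mu_2,\sigma^2)\bigr) = \frac{(\mu_1-\mu_2)^2}{2\sigma^2}.

These are standard identities; the notation is introduced here for illustration and may differ from that used in the paper.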
Methodology: In this study, the exact value of the Kullback–Leibler divergence between two normal distributions with equal variance is first analytically derived, and then three estimation methods (maximum likelihood, Bayesian, and shrinkage) are proposed to estimate this measure. The performance of each estimator is evaluated via Monte Carlo simulations using the Mean Squared Error (MSE) criterion.
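As an illustration of this kind of study design, the sketch below runs a small Monte Carlo comparison of a plug-in maximum likelihood estimator of the equal-variance KL divergence against a simple shrinkage variant that pulls the plug-in estimate toward zero. The sample size, the pooled-variance plug-in, and the shrinkage factor lam are assumptions made for this example only; the paper's Bayesian prior and its actual shrinkage construction are not reproduced here.

# Hypothetical Monte Carlo sketch: compares a plug-in (MLE-based) estimator of the
# KL divergence between N(mu1, sigma^2) and N(mu2, sigma^2) with a simple
# shrinkage variant. Illustrative only; not the paper's exact estimators.
import numpy as np

rng = np.random.default_rng(0)

def kl_true(mu1, mu2, sigma):
    # Closed-form KL divergence for two normals with a common variance.
    return (mu1 - mu2) ** 2 / (2 * sigma ** 2)

def kl_mle(x, y):
    # Plug-in estimator: substitute sample means and a pooled sample variance.
    mu1_hat, mu2_hat = x.mean(), y.mean()
    s2_pooled = (x.var(ddof=1) + y.var(ddof=1)) / 2
    return (mu1_hat - mu2_hat) ** 2 / (2 * s2_pooled)

def kl_shrinkage(x, y, lam=0.9):
    # Illustrative shrinkage: pull the plug-in estimate toward zero by a factor lam.
    return lam * kl_mle(x, y)

mu1, mu2, sigma, n, reps = 0.0, 0.5, 1.0, 30, 5000
true_val = kl_true(mu1, mu2, sigma)
errs_mle, errs_shr = [], []
for _ in range(reps):
    x = rng.normal(mu1, sigma, n)
    y = rng.normal(mu2, sigma, n)
    errs_mle.append((kl_mle(x, y) - true_val) ** 2)
    errs_shr.append((kl_shrinkage(x, y) - true_val) ** 2)

# MSE of each estimator, approximated by averaging squared errors over replications.
print("MSE (plug-in MLE):  ", np.mean(errs_mle))
print("MSE (shrinkage 0.9):", np.mean(errs_shr))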
Findings: The simulation results indicate that the Bayesian estimator outperforms the MLE in terms of estimation accuracy. Furthermore, the shrinkage estimator performs best, achieving the lowest MSE among the three methods. These results suggest that incorporating prior information or penalization techniques can substantially improve estimation quality.
Originality/Value: This study contributes to the literature by providing a detailed comparison of classical and modern estimation techniques for KL divergence in the context of normal distributions with equal variance. The novelty lies in integrating shrinkage methodology and demonstrating its superior performance, which is quantitatively validated through simulations. The findings have practical implications across fields such as machine learning, signal processing, and information theory.