Information Theoretic Models for Dependence Analysis and Missing Data Estimation

In this chapter, the maximum entropy principle is used to define an information theoretic dependence measure that quantifies the amount of dependence among the attributes in a contingency table. The relationship between this information theoretic measure of dependence and the Chi-square statistic is discussed, and a generalization of the measure is also studied. Finally, Yates' method and maximum entropy estimation of missing data in the design of experiments are described and illustrated with practical problems based on empirical data. An algorithm for estimating missing values in a fuzzy matrix is defined and applied to missing data estimation in a contingency table.
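The dependence measure and its relation to the Chi-square statistic can be sketched numerically. The snippet below is a minimal illustration, not the chapter's exact formulation: it assumes the information theoretic dependence measure is the mutual information of the table's joint distribution (the Kullback-Leibler divergence from the independence model built from the marginals), which is a standard choice for maximum-entropy-based dependence in contingency tables.

```python
import numpy as np

def dependence_measures(table):
    """Mutual information (in nats) and the Chi-square statistic
    for a two-way contingency table of observed counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p = table / n                       # joint probabilities
    pr = p.sum(axis=1, keepdims=True)   # row marginals
    pc = p.sum(axis=0, keepdims=True)   # column marginals
    expected = pr * pc                  # maximum entropy (independence) model
    nz = p > 0                          # skip empty cells in the log
    mi = np.sum(p[nz] * np.log(p[nz] / expected[nz]))
    chi2 = n * np.sum((p - expected) ** 2 / expected)
    return mi, chi2

# Example: a 2x2 table with visible dependence between the attributes
mi, chi2 = dependence_measures([[30, 10], [10, 30]])
```

For a table whose cells equal the product of its marginals, both quantities vanish; for weak dependence, 2n times the mutual information (the G-statistic) is close to the Chi-square value, which is the kind of relationship the chapter examines.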

Author(s) Details

Prof. D. S. Hooda
(Former PVC, Kurukshetra University) Honorary Professor in Mathematics at GJU of Science & Technology, Hisar-125001, India.

Dr. Parmil Kumar
Department of Statistics, University of Jammu, Jammu, India.
