Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information

Cited by: 6
Author
Mathy, Fabien [1 ]
Affiliation
[1] Université de Franche-Comté, Besançon, France
DOI: 10.20982/tqmp.06.1.p016
Abstract
In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for a dichotomous classification of objects that can be distinguished by a set of input variables, which are more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the variables most informative about the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information, both based on entropy, to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning improves as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both the strategies and the performance of subjects trying to simplify a learning process.
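As a minimal illustration of the quantities the abstract refers to, the Python sketch below computes the entropy of a binary category variable, its conditional entropy given a feature, and the resulting information gain, which for discrete variables equals the mutual information between feature and category. The toy dataset and the names shape, color, and category are invented for illustration and are not taken from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) in bits of a sequence of category labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG = H(Y) - H(Y | X): the reduction in uncertainty about the
    category Y from observing feature X. For discrete variables this
    equals the mutual information I(X; Y)."""
    n = len(labels)
    conditional = 0.0
    for value in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == value]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# Hypothetical toy task: eight objects with two binary features.
# 'shape' fully determines the category; 'color' is independent of it.
shape    = [0, 0, 1, 1, 0, 1, 0, 1]
color    = [0, 1, 0, 1, 0, 0, 1, 1]
category = [0, 0, 1, 1, 0, 1, 0, 1]

print(entropy(category))                  # 1.0 bit: classes are balanced
print(information_gain(shape, category))  # 1.0: perfectly informative
print(information_gain(color, category))  # 0.0: uninformative
```

In this sketch a learner seeking the shortest rule would retain shape and discard color, since attending to shape removes all uncertainty about the category while color removes none.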
Pages: 16-30 (15 pages)
Related Papers (items 41-50 of 50)
  • [41] Nonrigid mammogram registration using mutual information
    Wirth, MA
    Narhan, J
    Gray, D
    MEDICAL IMAGING 2002: IMAGE PROCESSING, VOL 1-3, 2002, 4684 : 562 - 573
  • [42] Using mutual information for adaptive student assessments
    Liu, CL
    IEEE INTERNATIONAL CONFERENCE ON ADVANCED LEARNING TECHNOLOGIES, PROCEEDINGS, 2004, : 585 - 589
  • [43] Matching point features using mutual information
    Rangarajan, A
    Duncan, JS
    WORKSHOP ON BIOMEDICAL IMAGE ANALYSIS, PROCEEDINGS, 1998, : 172 - 181
  • [44] Exact Test of Independence Using Mutual Information
    Pethel, Shawn D.
    Hahs, Daniel W.
    ENTROPY, 2014, 16 (05): : 2839 - 2849
  • [45] Mutual Information Estimation using LSH Sampling
    Spring, Ryan
    Shrivastava, Anshumali
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 2807 - 2815
  • [46] Automatic Chromosome Pairing Using Mutual Information
    Khmelinskii, Artem
    Ventura, Rodrigo
    Sanches, Joao
    2008 30TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, VOLS 1-8, 2008, : 1918 - 1921
  • [47] Image similarity using mutual information of regions
    Russakoff, DB
    Tomasi, C
    Rohlfing, T
    Maurer, CR
    COMPUTER VISION - ECCV 2004, PT 3, 2004, 3023 : 596 - 607
  • [48] Active object recognition using mutual information
    Trujillo-Romero, F
    Ayala-Ramírez, V
    Marín-Hernández, A
    Devy, M
    MICAI 2004: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2004, 2972 : 672 - 678
  • [49] Using mutual information as a cocitation similarity measure
Zheng, Lukun
    Scientometrics, 2019, 119 : 1695 - 1713
  • [50] Mutual Information Computation and Maximization Using GPU
    Lin, Yuping
    Medioni, Gerard
    2008 IEEE COMPUTER SOCIETY CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, VOLS 1-3, 2008, : 1113 - 1118