Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information

Cited by: 6
Author: Mathy, Fabien [1]
Affiliation: [1] Univ Franche Comte, Besancon, France
DOI: 10.20982/tqmp.06.1.p016
Abstract
In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for performing a dichotomous classification of objects that can be distinguished by a set of input variables, each more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the variables most informative about the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information, both based on entropy, to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning can improve as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both the strategies and the performance of subjects trying to simplify a learning process.
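The quantities the abstract relies on can be made concrete with a small sketch. The following is a minimal illustration, not the paper's own code: it computes Shannon entropy and information gain (which, for discrete variables, equals the mutual information between a feature and the class) on a toy categorization task. The feature names `A`, `B`, `C` and the helper functions are assumptions introduced here for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y), in bits, of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y | X): the expected reduction in class
    entropy from knowing one input feature. For discrete variables
    this equals the mutual information I(Y; X)."""
    n = len(labels)
    groups = {}
    for x, y in zip(feature, labels):
        groups.setdefault(x, []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - cond

# Toy task: three binary input features, one binary class.
# Feature A perfectly predicts the class; B and C carry no information.
A = [0, 0, 1, 1]
B = [0, 1, 0, 1]
C = [0, 0, 0, 0]
Y = [0, 0, 1, 1]

print(information_gain(A, Y))  # 1.0 bit: A alone determines the class
print(information_gain(B, Y))  # 0.0 bits
print(information_gain(C, Y))  # 0.0 bits
```

A learner in the paper's sense would rank the input variables by this quantity and attend only to the most informative ones, which is what allows the category rule to be compressed.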
Pages: 16-30 (15 pages)