Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information

Cited by: 6
Author: Mathy, Fabien [1]
Affiliation: [1] Univ Franche Comte, Besancon, France
DOI: 10.20982/tqmp.06.1.p016
Abstract
In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for performing a dichotomic classification of objects that can be distinguished by a set of input variables which are more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the most informative variables relative to the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information, both based on entropy, to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning can improve as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both the strategies and the performance of subjects trying to simplify a learning process.
Pages: 16-30 (15 pages)
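
The abstract describes how entropy, information gain, and mutual information can be used to quantify how informative each input variable is about a dichotomous category. The sketch below is a minimal illustration of that idea, not the paper's own code: the toy stimulus set, the feature names (shape, color, size), and the helper functions entropy and information_gain are invented for this example, with information gain computed as IG(Y; X) = H(Y) - H(Y | X), i.e. the mutual information I(X; Y).

```python
# Minimal sketch (not the paper's implementation): entropy, information gain,
# and mutual information for a toy dichotomous categorization task.
# The three binary features and the eight example objects are invented here
# purely for illustration.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y) in bits of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X), equal to the mutual information I(X; Y)."""
    n = len(labels)
    h_y_given_x = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        h_y_given_x += (len(subset) / n) * entropy(subset)
    return entropy(labels) - h_y_given_x

# Toy stimulus set: each object is (shape, color, size); the category is
# defined solely by shape, so shape should carry the full 1 bit of
# information about the class while color and size should carry none.
objects = [
    ("square", "red",  "small"), ("square", "red",  "large"),
    ("square", "blue", "small"), ("square", "blue", "large"),
    ("circle", "red",  "small"), ("circle", "red",  "large"),
    ("circle", "blue", "small"), ("circle", "blue", "large"),
]
category = [1, 1, 1, 1, 0, 0, 0, 0]  # dichotomous class labels

print(f"H(category) = {entropy(category):.3f} bits")
for i, name in enumerate(["shape", "color", "size"]):
    values = [obj[i] for obj in objects]
    print(f"IG(category; {name}) = {information_gain(values, category):.3f} bits")
```

Run on this toy set, the shape feature yields 1 bit of information gain about the category while color and size yield 0 bits; in that sense, a learner who attends only to shape has compressed the classification down to its shortest rule.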