Learning concepts when instances never repeat

Cited by: 9
Authors
Homa, Donald [1 ]
Blair, Mark [2 ]
McClure, Samuel M. [1 ]
Medema, John [1 ]
Stone, Gregory [1 ]
Affiliations
[1] Arizona State Univ, Dept Psychol, Tempe, AZ 85287 USA
[2] Simon Fraser Univ, Dept Psychol, Burnaby, BC, Canada
Keywords
EXEMPLAR-BASED ACCOUNTS; DECISION RULES; CATEGORY SIZE; CATEGORIZATION; ABSTRACTION; CLASSIFICATION; RECOGNITION; DISSOCIATIONS; PROTOTYPES; EVOLUTION;
DOI
10.3758/s13421-018-0874-9
Chinese Library Classification
B84 [Psychology]
Discipline classification codes
04; 0402
Abstract
Three experiments explored the learning of categories in which the training instances either repeated in each training block or appeared only once during the entire learning phase, followed by a classification transfer test (Experiment 1) or a recognition transfer test (Experiments 2 and 3). Subjects received training instances from either two (Experiment 2) or three categories (Experiments 1-3) for either 15 or 20 training blocks. The results showed substantial learning in each experiment, with the notable result that learning was not slowed in the non-repeating condition in any of the three experiments. Furthermore, subsequent transfer was marginally better in the non-repeating condition. The recognition results showed that subjects in the repeat condition had substantial memory for the training instances, whereas subjects in the non-repeat condition had no measurable memory for the training instances, as measured either by hit and false-alarm rates or by signal detectability measures. These outcomes are consistent with prototype models of category learning, at least when patterns never repeat in learning, and place severe constraints on exemplar views that posit transfer mechanisms to stored individual traces. A formal model, which incorporates changing similarity relationships during learning, was shown to explain the major results.
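The abstract's recognition analyses rest on standard signal-detection measures (hit and false-alarm rates, d'), and its theoretical contrast is between prototype and exemplar accounts of similarity-based transfer. The sketch below is not taken from the article: the log-linear correction, the GCM-style similarity function, and all counts are illustrative assumptions, included only to make those measures concrete.

```python
import numpy as np
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Recognition sensitivity d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (+0.5 per cell) is one common way to avoid
    infinite z-scores when a rate is exactly 0 or 1; this is an assumption
    here, not necessarily the article's procedure.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def similarity(x, y, c=1.0):
    """Exponentially decaying similarity on city-block distance (GCM-style)."""
    return float(np.exp(-c * np.abs(np.asarray(x, float) - np.asarray(y, float)).sum()))

def exemplar_evidence(probe, training_items):
    """Exemplar account: summed similarity of a probe to every stored training instance."""
    return sum(similarity(probe, item) for item in training_items)

def prototype_evidence(probe, training_items):
    """Prototype account: similarity of a probe to the category centroid only."""
    return similarity(probe, np.mean(np.asarray(training_items, float), axis=0))

# Hypothetical recognition counts (60 old and 60 new test items per group):
print(d_prime(42, 18, 12, 48))   # repeat group: clearly above chance
print(d_prime(31, 29, 30, 30))   # non-repeat group: near zero
```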
Pages: 395-411
Page count: 17
Related papers (50 in total)
  • [1] Exemplar-Model Account of Categorization and Recognition When Training Instances Never Repeat
    Hu, Mingjia
    Nosofsky, Robert M.
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2022, 48 (12) : 1947 - 1969
  • [2] Progress in Modeling Through Distributed Collaboration: Concepts, Tools and Category-Learning Examples
    Wills, Andy J.
    O'Connell, Garret
    Edmunds, Charlotte E. R.
    Inkster, Angus B.
    PSYCHOLOGY OF LEARNING AND MOTIVATION, VOL 66, 2017, 66 : 79 - 115
  • [3] Learning With Incremental Instances and Features
    Gu, Shilin
    Qian, Yuhua
    Hou, Chenping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 9713 - 9727
  • [4] Recognition improvement through the optimisation of learning instances
    Wu, Hao
    Miao, Zhenjiang
    Chen, Jingyue
    Yang, Jie
    Gao, Xing
    IET COMPUTER VISION, 2015, 9 (03) : 419 - 427
  • [5] When are concepts comparable across minds?
    Canessa, Enrique Carlos
    Chaigneau, Sergio E.
    QUALITY & QUANTITY, 2016, 50 (03) : 1367 - 1384
  • [6] Object-Label-Order Effect When Learning From an Inconsistent Source
    Ma, Timmy
    Komarova, Natalia L.
    COGNITIVE SCIENCE, 2019, 43 (08)
  • [7] Learning incommensurate concepts
    Clatterbuck, Hayley
    Gentry, Hunter
    SYNTHESE, 2025, 205 (03)
  • [8] Selecting reliable instances based on evidence theory for transfer learning
    Lv, Ying
    Zhang, Bofeng
    Yue, Xiaodong
    Denoeux, Thierry
    Yue, Shan
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250
  • [9] Lifelong learning with selective attention over seen classes and memorized instances
    Wang, Zhijun
    Wang, Hongxing
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (15) : 8473 - 8484
  • [10] A Multi-view Learning Approach to the Discovery of Deviant Process Instances
    Cuzzocrea, Alfredo
    Folino, Francesco
    Guarascio, Massimo
    Pontieri, Luigi
    ON THE MOVE TO MEANINGFUL INTERNET SYSTEMS: OTM 2015 CONFERENCES, 2015, 9415 : 146 - 165