Unsupervised Hierarchical Temporal Abstraction by Simultaneously Learning Expectations and Representations

Cited by: 0
Authors
Metcalf, Katherine [1 ]
Leake, David [1 ]
Affiliations
[1] Indiana Univ, Comp Sci Dept, Bloomington, IN 47405 USA
Source
PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2019
Keywords
VOTING EXPERTS; PERCEPTION; SEGMENTATION; ALGORITHM
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
This paper presents ENHAnCE, an algorithm that simultaneously learns a predictive model of the input stream and generates representations of the concepts being observed. Following cognitively-inspired models of event segmentation, ENHAnCE uses expectation violations to identify boundaries between temporally extended patterns. It applies its expectation-driven process at multiple levels of temporal granularity to produce a hierarchy of predictive models that enable it to identify concepts at multiple levels of temporal abstraction. Evaluations show that the temporal abstraction hierarchies generated by ENHAnCE closely match hand-coded hierarchies for the test data streams. Given language data streams, ENHAnCE learns a hierarchy of predictive models that capture basic units of both spoken and written language: morphemes, lexemes, phonemes, syllables, and words.
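The expectation-violation segmentation idea described in the abstract can be illustrated with a small sketch. The snippet below is only an illustrative approximation, not the authors' ENHAnCE implementation: the bigram predictor, the fixed surprise threshold, and the names bigram_surprise, segment, and build_hierarchy are assumptions introduced here. It cuts a symbol stream wherever the next symbol is much less predictable than the model expects, then re-applies the same process to the resulting segments to form a hierarchy of coarser temporal units.

```python
# Illustrative sketch only: a bigram predictor plus a surprise threshold stands in
# for ENHAnCE's learned predictive model; these choices are assumptions, not the paper's method.
import math
from collections import defaultdict


def bigram_surprise(stream):
    # Per-position surprise, -log P(x_t | x_{t-1}), under add-one-smoothed bigram counts.
    pair_counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(stream, stream[1:]):
        pair_counts[prev][nxt] += 1
    vocab_size = len(set(stream))
    surprises = [0.0]                      # no expectation for the first symbol
    for prev, nxt in zip(stream, stream[1:]):
        total = sum(pair_counts[prev].values()) + vocab_size
        prob = (pair_counts[prev][nxt] + 1) / total
        surprises.append(-math.log(prob))
    return surprises


def segment(stream, threshold):
    # Cut the stream wherever the observed symbol violates the model's expectation.
    surprises = bigram_surprise(stream)
    segments, current = [], [stream[0]]
    for symbol, surprise in zip(stream[1:], surprises[1:]):
        if surprise > threshold:           # expectation violation -> segment boundary
            segments.append(tuple(current))
            current = []
        current.append(symbol)
    segments.append(tuple(current))
    return segments


def build_hierarchy(stream, threshold, levels):
    # Treat each level's segments as the symbols of the next, coarser level.
    hierarchy, current = [], list(stream)
    for _ in range(levels):
        segments = segment(current, threshold)
        hierarchy.append(segments)
        if len(segments) < 3:              # too little data to segment further
            break
        current = segments
    return hierarchy


if __name__ == "__main__":
    stream = list("thedogranthecatranthedogsat")
    for level, segments in enumerate(build_hierarchy(stream, threshold=1.2, levels=2)):
        print("level", level, ":", segments)
```

Running the example prints character-level segments at level 0 and groups of those segments at level 1, mirroring (in a highly simplified way) how boundaries at one temporal granularity feed the next level of abstraction.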
Pages: 3144-3150
Page count: 7