More Reliable Neighborhood Contrastive Learning for Novel Class Discovery in Sensor-Based Human Activity Recognition

Times Cited: 3
Authors
Zhang, Mingcong [1 ]
Zhu, Tao [1 ]
Nie, Mingxing [1 ]
Liu, Zhenyu [1 ]
Affiliations
[1] Univ South China, Sch Comp Sci, Hengyang 421001, Peoples R China
Keywords
human activity recognition; novel class discovery; neighborhood; contrastive learning; similarity; sensor
DOI
10.3390/s23239529
CLC Number
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
Human Activity Recognition (HAR) systems have made significant progress in recognizing and classifying human activities using data from a variety of sensors. Nevertheless, they struggle to automatically discover novel activity classes within massive amounts of unlabeled sensor data without external supervision. This limits their ability to classify new activities in unlabeled sensor data in real-world deployments where fully supervised settings are not applicable. To address this limitation, this paper studies the Novel Class Discovery (NCD) problem, which aims to classify novel-class activities in unlabeled sensor data by fully exploiting the existing activities in labeled data. To solve this problem, we propose a new end-to-end framework called More Reliable Neighborhood Contrastive Learning (MRNCL), a variant of the Neighborhood Contrastive Learning (NCL) framework commonly used in the visual domain. Compared to NCL, the proposed MRNCL framework is more lightweight and introduces an effective similarity measure that finds more reliable k-nearest neighbors of an unlabeled query sample in the embedding space. These neighbors serve as additional positives in contrastive learning and thereby improve the model. Extensive experiments on three public sensor datasets demonstrate that the proposed model outperforms existing methods on the NCD task in sensor-based HAR, achieving better clustering performance on instances of new activity classes.
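The abstract describes the general idea of neighborhood contrastive learning: the k most similar embeddings of an unlabeled query, retrieved from a memory bank, act as extra positives in a contrastive loss. The sketch below illustrates that idea only; it is not the authors' implementation. The function name, the cosine-similarity choice, the memory-bank setup, and all hyperparameters are assumptions for illustration, and the paper's specific "more reliable" similarity measure is not reproduced here.

```python
# Hypothetical sketch of a neighborhood-contrastive loss (PyTorch), assuming a
# memory bank of unlabeled embeddings and cosine similarity for neighbor search.
import torch
import torch.nn.functional as F


def neighborhood_contrastive_loss(queries, keys, memory_bank, k=5, temperature=0.1):
    """Treat each query's k most similar memory-bank entries as extra positives
    alongside its augmented view (`keys`), in the spirit of NCL.

    queries, keys: (B, D) embeddings of two augmented views of the same batch.
    memory_bank:   (M, D) embeddings of previously seen unlabeled samples.
    """
    q = F.normalize(queries, dim=1)
    k_pos = F.normalize(keys, dim=1)
    bank = F.normalize(memory_bank, dim=1)

    # Cosine similarity between each query and every memory-bank entry.
    sim_bank = q @ bank.t() / temperature                         # (B, M)
    # Similarities of the k nearest (pseudo-positive) neighbors per query.
    nn_sim, _ = sim_bank.topk(k, dim=1)                           # (B, k)

    # Similarity to the query's own augmented view (always a positive).
    pos_sim = (q * k_pos).sum(dim=1, keepdim=True) / temperature  # (B, 1)

    # Denominator over all candidates: augmented view plus the whole bank.
    logits = torch.cat([pos_sim, sim_bank], dim=1)                # (B, 1 + M)
    log_denom = torch.logsumexp(logits, dim=1, keepdim=True)      # (B, 1)

    # Numerators: the augmented view and each retrieved neighbor.
    positives = torch.cat([pos_sim, nn_sim], dim=1)               # (B, 1 + k)
    return -(positives - log_denom).mean()
```

Per the abstract, MRNCL's key change relative to plain NCL is the similarity measure used to select these neighbors; in this sketch, that would correspond to replacing the cosine-similarity scores in `sim_bank` used for the top-k retrieval.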
Pages: 22
Related Papers
50 records
  • [1] Contrastive Self-Supervised Learning for Sensor-Based Human Activity Recognition: A Review
    Chen, Hui
    Gouin-Vallerand, Charles
    Bouchard, Kevin
    Gaboury, Sebastien
    Couture, Melanie
    Bier, Nathalie
    Giroux, Sylvain
    IEEE ACCESS, 2024, 12 : 152511 - 152531
  • [2] Temporal Contrastive Learning for Sensor-Based Human Activity Recognition: A Self-Supervised Approach
    Chen, Xiaobing
    Zhou, Xiangwei
    Sun, Mingxuan
    Wang, Hao
    IEEE SENSORS JOURNAL, 2025, 25 (01) : 1839 - 1850
  • [3] Dynamic Temperature Scaling in Contrastive Self-Supervised Learning for Sensor-Based Human Activity Recognition
    Khaertdinov, Bulat
    Asteriadis, Stylianos
    Ghaleb, Esam
    IEEE TRANSACTIONS ON BIOMETRICS, BEHAVIOR, AND IDENTITY SCIENCE, 2022, 4 (04): 498 - 507
  • [4] Active contrastive coding reducing label effort for sensor-based human activity recognition
    Li, Zhixin
    Liu, Hao
    Huan, Zhan
    Liang, Jiuzhen
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2024, 46 (02) : 3987 - 3999
  • [5] Invariant Feature Learning for Sensor-Based Human Activity Recognition
    Hao, Yujiao
    Zheng, Rong
    Wang, Boyu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2022, 21 (11) : 4013 - 4024
  • [6] Sensor-Based Human Activity Recognition Using Adaptive Class Hierarchy
    Kondo, Kazuma
    Hasegawa, Tatsuhito
    SENSORS, 2021, 21 (22)
  • [7] Deep learning and model personalization in sensor-based human activity recognition
    Ferrari, A.
    Micucci, D.
    Mobilio, M.
    Napoletano, P.
    JOURNAL OF RELIABLE INTELLIGENT ENVIRONMENTS, 2023, 9 (01): 27 - 39
  • [8] Resource-Efficient Continual Learning for Sensor-Based Human Activity Recognition
    Leite, Clayton Frederick Souza
    Xiao, Yu
    ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2022, 21 (06)
  • [9] Sensor-Based Human Activity Recognition with Spatio-Temporal Deep Learning
    Nafea, Ohoud
    Abdul, Wadood
    Muhammad, Ghulam
    Alsulaiman, Mansour
    SENSORS, 2021, 21 (06) : 1 - 20
  • [10] IDMatchHAR: Semi-Supervised Learning for Sensor-Based Human Activity Recognition Using Pretraining
    Takenaka, Koki
    Sakai, Shunsuke
    Hasegawa, Tatsuhito
    IEEE SENSORS LETTERS, 2025, 9 (04)