Bioinspired In-Sensor Multimodal Fusion for Enhanced Spatial and Spatiotemporal Association

Cited: 5
Authors:
Ma, Sijie [1,2,3]
Zhou, Yue [1,2,3]
Wan, Tianqing [1,2,3]
Ren, Qinqi [1,2,3]
Yan, Jianmin [1,2,3]
Fan, Lingwei [1,2,3]
Yuan, Huanmei [4]
Chan, Mansun [4]
Chai, Yang [1,2,3]
Affiliations:
[1] Hong Kong Polytech Univ, Dept Appl Phys, Kowloon, Hong Kong 999077, Peoples R China
[2] Hong Kong Polytech Univ, Joint Res Ctr Microelect, Kowloon, Hong Kong 999077, Peoples R China
[3] Hong Kong Polytech Univ, Joint Res Ctr Microelect, Kowloon, Hong Kong 999077, Peoples R China
[4] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Kowloon, Hong Kong 999077, Peoples R China
Keywords:
in-sensor computing; neuromorphic computing; multimodal integration; two-dimensional (2D) semiconductor; edge computing; floating gate transistor
DOI: 10.1021/acs.nanolett.4c01727
Chinese Library Classification: O6 [Chemistry]
Discipline code: 0703
Abstract:
Multimodal perception captures more precise and comprehensive information than unimodal approaches. However, current sensory systems typically merge multimodal signals at computing terminals after parallel processing and transmission, which risks losing spatial association information and requires time stamps to maintain temporal coherence for time-series data. Here we demonstrate bioinspired in-sensor multimodal fusion, which effectively enhances comprehensive perception and reduces data transfer between sensory terminals and computation units. By adopting floating gate phototransistors with reconfigurable photoresponse plasticity, we realize agile spatial and spatiotemporal fusion under nonvolatile and volatile photoresponse modes. To achieve optimal spatial estimation, we integrate spatial information from visual-tactile signals. For dynamic events, we capture and fuse spatiotemporal information from visual-audio signals in real time, realizing a dance-music synchronization recognition task without a time-stamping process. This in-sensor multimodal fusion approach has the potential to simplify multimodal integration systems, extending the in-sensor computing paradigm.
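The "optimal spatial estimation" from visual-tactile integration described in the abstract follows the classic maximum-likelihood cue-combination principle, in which each modality's estimate is weighted by its reliability (inverse variance). The sketch below is an illustrative model of that principle only, not the paper's device-level implementation; the function name and the example cue values are hypothetical.

```python
def fuse_estimates(mu_v, var_v, mu_t, var_t):
    """Maximum-likelihood fusion of two independent Gaussian cues.

    mu_v, var_v: mean and variance of the visual estimate
    mu_t, var_t: mean and variance of the tactile estimate

    Each cue is weighted by its inverse variance, so the more
    reliable modality dominates; the fused variance is always
    smaller than either input variance.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_t)
    w_t = 1.0 - w_v
    mu_fused = w_v * mu_v + w_t * mu_t
    var_fused = 1.0 / (1.0 / var_v + 1.0 / var_t)
    return mu_fused, var_fused

# Hypothetical cues: visual says 10.0 cm (var 1.0), tactile says 12.0 cm (var 4.0)
mu, var = fuse_estimates(10.0, 1.0, 12.0, 4.0)
print(mu, var)  # 10.4 0.8 — pulled toward the more reliable visual cue
```

Performing this weighting inside the sensor, rather than at a downstream computing terminal, is what lets the approach preserve spatial association without transmitting each modality's raw data stream.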
Pages: 7091-7099 (9 pages)