Real-time data visual monitoring of triboelectric nanogenerators enabled by deep learning

Cited by: 18
Authors
Zhang, Huiya [1 ]
Liu, Tao [1 ]
Zou, Xuelian [1 ]
Zhu, Yunpeng [1 ]
Chi, Mingchao [1 ]
Wu, Di [1 ]
Jiang, Keyang [1 ]
Zhu, Sijia [1 ]
Zhai, Wenxia [1 ]
Wang, Shuangfei [1 ]
Nie, Shuangxi [1 ]
Wang, Zhiwei [1 ]
Affiliations
[1] Guangxi Univ, Coll Light Ind & Food Engn, Key Lab Clean Pulp & Papermaking & Pollut Control, Nanning 530004, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Triboelectric nanogenerator; Deep learning; Self-powered sensing; Real-time monitoring; DATA-ACQUISITION; DECISION-MAKING; RECOGNITION; NETWORKS; INTERNET; SYSTEM; SENSOR; SMART; PERFORMANCE; THINGS;
DOI
10.1016/j.nanoen.2024.110186
Chinese Library Classification
O64 [Physical chemistry (theoretical chemistry), chemical physics];
Discipline codes
070304; 081704;
Abstract
The rapid advancement of smart sensors and logic algorithms has propelled the widespread adoption of the Internet of Things (IoT) and hastened the arrival of the intelligent era. Integrating triboelectric nanogenerator (TENG) sensors with deep learning (DL) combines the unique advantages of TENGs, such as self-powered sensing, high sensitivity, and broad applicability, with DL's robust data-processing capabilities to monitor diverse signals effectively, efficiently, and visually. This combination delivers markedly superior sensing performance and substantial development potential, with extensive applications in domains such as smart homes, healthcare systems, and environmental monitoring. At present, however, the synergistic working principle underlying the integration of these two technologies remains insufficiently elucidated. This review presents a comprehensive overview of cutting-edge DL techniques and related research aimed at enhancing real-time visual monitoring of TENG data. Specifically, it focuses on DL algorithms such as Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and Long Short-Term Memory (LSTM) networks for processing the intricate datasets that TENGs generate. Furthermore, this review outlines the advantages and synergistic mechanisms arising from the integration of DL with TENG sensors, and summarizes their latest applications across fields that require real-time data visual monitoring. Finally, it analyzes the prospects, challenges, and countermeasures associated with the integrated development of TENG and DL, offering a theoretical foundation and practical guidance for future advances in this field.
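To make the data-processing step concrete: CNN-style models extract features from windowed TENG voltage signals by sliding learned filters along the time axis. The following is an illustrative sketch only, not code from the reviewed paper; the synthetic signal, random filter weights, and layer sizes are all assumptions chosen for demonstration. It shows a minimal 1D convolution, ReLU, and max-pooling pipeline in plain NumPy, the core forward pass a CNN would apply to a TENG time series.

```python
import numpy as np

def conv1d(x, kernels, bias):
    """Valid-mode 1D convolution.
    x: signal of shape (T,); kernels: (K, W); bias: (K,).
    Returns a feature map of shape (K, T - W + 1)."""
    K, W = kernels.shape
    T = x.shape[0]
    out = np.empty((K, T - W + 1))
    for k in range(K):
        for t in range(T - W + 1):
            out[k, t] = np.dot(x[t:t + W], kernels[k]) + bias[k]
    return out

def relu(x):
    # Elementwise rectified linear activation.
    return np.maximum(x, 0.0)

def max_pool(x, size):
    """Non-overlapping max pooling along the time axis."""
    K, T = x.shape
    T_out = T // size
    return x[:, :T_out * size].reshape(K, T_out, size).max(axis=2)

rng = np.random.default_rng(0)
# Synthetic stand-in for a windowed TENG voltage trace: a noisy pulse train.
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
kernels = 0.1 * rng.standard_normal((4, 9))  # 4 filters of width 9 (random, not trained)
bias = np.zeros(4)

features = max_pool(relu(conv1d(signal, kernels, bias)), size=4)
print(features.shape)  # (4, 62): 4 feature channels over 62 pooled time steps
```

In a trained model the filter weights would be learned from labeled TENG recordings, and the pooled features fed to a classifier head; RNN/LSTM layers would instead process the window step by step to capture longer temporal dependencies.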
Pages: 23