A fusion framework for vision-based indoor occupancy estimation

Cited by: 16
Authors
Sun, Kailai [1 ]
Liu, Peng [1 ]
Xing, Tian [1 ]
Zhao, Qianchuan [1 ]
Wang, Xinwei [1 ]
Affiliations
[1] Tsinghua Univ, Ctr Intelligent & Networked Syst, Dept Automat, BNRist, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Occupancy estimation; Cameras; Scene knowledge fusion; Heterogeneous fusion; People counting; OFFICE; VIDEOS; SYSTEM;
DOI
10.1016/j.buildenv.2022.109631
Chinese Library Classification
TU [Building Science];
Discipline Code
0813;
Abstract
Building occupancy information is essential for energy saving, comfort improvement, and security management. Existing vision-based indoor occupancy measurement methods have achieved remarkable progress; however, they mainly focus on single-vision setups, i.e., a camera either at the room entrance or in the room interior. These methods struggle to achieve high accuracy in complex indoor environments: for example, they often miss occupants and generate many false positives. To address these issues, we propose a novel fusion framework for occupancy detection and estimation that combines two different perspectives. First, we design a head detection method that incorporates indoor scene knowledge to filter false positives and recover missed detections. Second, we propose a two-vision entrance counting method to refine the predicted results. Finally, we propose a cumulative error clearing strategy, dynamic Bayesian fusion (DBF), which integrates entrance counting with static estimation. Ablation studies and comparisons with existing methods on practical building surveillance videos show that our framework achieves superior performance, with occupancy estimation scores of 99.2%, 98.5%, and 94.9%. The framework clears cumulative errors and stabilizes estimation results, and practical experiments validate its potential for building energy saving and comfort improvement. The code is available at https://github.com/kailaisun/FFO.
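The abstract describes dynamic Bayesian fusion only at a high level. The sketch below is a minimal illustration of the underlying idea, assuming a simple Gaussian fusion rule: an entrance-counting estimate that drifts as miss- and double-counts accumulate is periodically corrected by a noisy but drift-free static interior-camera count. The class name OccupancyBelief, the variance constants, and the update equations are illustrative assumptions, not the authors' DBF implementation from the FFO repository.

```python
# Illustrative sketch (assumptions, not the paper's code): fusing a drifting
# entrance-counting estimate with a static interior-camera head count.

from dataclasses import dataclass

# Assumed noise levels (variances) for the two information sources.
VAR_ENTRANCE_STEP = 0.05   # drift added per counted entry/exit event
VAR_STATIC = 4.0           # noise of a single interior head-count observation


@dataclass
class OccupancyBelief:
    mean: float   # current estimate of the number of occupants
    var: float    # uncertainty of that estimate

    def predict(self, net_entries: int) -> None:
        """Propagate the belief with the entrance counter (entries minus exits)."""
        self.mean += net_entries
        # Each counting event can be missed or double-counted, so drift grows.
        self.var += abs(net_entries) * VAR_ENTRANCE_STEP

    def update(self, static_count: int) -> None:
        """Correct the belief with a static interior-camera head count."""
        k = self.var / (self.var + VAR_STATIC)        # fusion gain
        self.mean += k * (static_count - self.mean)   # pull toward the observation
        self.var *= (1.0 - k)                         # uncertainty shrinks; drift is cleared

    def estimate(self) -> int:
        return max(0, round(self.mean))


if __name__ == "__main__":
    belief = OccupancyBelief(mean=0.0, var=0.0)
    belief.predict(net_entries=3)   # three people counted entering
    belief.update(static_count=2)   # interior camera sees two heads
    print(belief.estimate())
```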
Pages: 13