Frugal Following: Power Thrifty Object Detection and Tracking for Mobile Augmented Reality

Cited by: 64
Authors
Apicharttrisorn, Kittipat [1 ]
Ran, Xukan [1 ]
Chen, Jiasi [1 ]
Krishnamurthy, Srikanth V. [1 ]
Roy-Chowdhury, Amit K. [1 ]
Affiliations
[1] Univ Calif Riverside, Riverside, CA 92521 USA
Source
PROCEEDINGS OF THE 17TH CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS '19) | 2019
Funding
U.S. National Science Foundation;
Keywords
Mobile Augmented Reality; Energy Efficiency; Object Detection and Tracking; Convolutional Neural Network; PERFORMANCE;
DOI
10.1145/3356250.3360044
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Accurate tracking of objects in the real world is highly desirable in Augmented Reality (AR) to aid proper placement of virtual objects in a user's view. Deep neural networks (DNNs) yield high precision in detecting and tracking objects, but they are energy-heavy and can thus be prohibitive for deployment on mobile devices. Towards reducing energy drain while maintaining good object tracking precision, we develop a novel software framework called MARLIN. MARLIN only uses a DNN as needed, to detect new objects or recapture objects that significantly change in appearance. It employs lightweight methods in between DNN executions to track the detected objects with high fidelity. We experiment with several baseline DNN models optimized for mobile devices, and via both offline and live object tracking experiments on two different Android phones (one utilizing a mobile GPU), we show that MARLIN compares favorably in terms of accuracy while saving energy significantly. Specifically, we show that MARLIN reduces the energy consumption by up to 73.3% (compared to an approach that executes the best baseline DNN continuously), and improves accuracy by up to 19x (compared to an approach that infrequently executes the same best baseline DNN). Moreover, while in 75% or more cases, MARLIN incurs at most a 7.36% reduction in location accuracy (using the common IOU metric), in more than 46% of the cases, MARLIN even improves the IOU compared to the continuous, best DNN approach.
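The abstract describes a control loop that invokes an expensive DNN only to detect new objects or to recapture objects whose appearance has changed, and otherwise tracks cheaply between DNN executions. A minimal sketch of such a detect-or-track loop is shown below; the function names `detect` and `track`, the confidence-based trigger, and the threshold value are illustrative assumptions, not MARLIN's actual implementation.

```python
def run_frames(frames, detect, track, confidence_threshold=0.5):
    """Detect-or-track loop: run the expensive detector only on the first
    frame or when the lightweight tracker loses confidence in its boxes.

    detect(frame) -> boxes           # stands in for a costly DNN pass
    track(frame, boxes) -> (boxes, confidence)  # cheap frame-to-frame tracking
    """
    boxes = None
    detections = 0          # count DNN invocations (the energy-heavy step)
    results = []
    for frame in frames:
        if boxes is None:
            boxes = detect(frame)               # initial DNN detection
            detections += 1
        else:
            boxes, conf = track(frame, boxes)   # lightweight tracking
            if conf < confidence_threshold:
                # Object appearance changed significantly: re-run the DNN.
                boxes = detect(frame)
                detections += 1
        results.append(boxes)
    return results, detections
```

With a tracker that stays confident, the DNN runs once for the whole sequence instead of once per frame, which is the source of the energy savings the abstract reports.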
Pages: 96 - 109
Page count: 14