Towards an immersive and safer driving experience using computer vision integrated with encoded vibro-tactile feedback

Cited by: 0
Authors
Mukherjee R. [1 ]
Mahato D.K. [2 ]
Yadav S. [2 ]
Pundir A. [2 ]
Saxena G.J. [2 ]
Affiliations
[1] School of Electrical Engineering and Computer Science, University of Ottawa, Ottawa, ON
[2] Department of Electronics, Maharaja Agrasen College, University of Delhi, New Delhi
Keywords
Human-centred computing; Image detection; Immersive reality; Object tracking; SURF; Vibro-tactile; Vision-to-touch
DOI
10.1504/IJVAS.2020.108406
Abstract
This paper presents an immersive, responsive vehicle-driving system and mechanism for assisted driving technology. Its purpose is to expand the sensory horizon of the driver, and it is motivated by the absence of any comparable system in the real world. The system controls and directs an assembly of electronic devices in real time through an image-acquisition subsystem, an object-recognition and tracking algorithm, and a haptic-modelling subsystem working in tandem with the user. The object-tracking subsystem uses a camera to determine the current position of the vehicle ahead in real time, continuously updating it in a live video feed while identifying and tracking both moving and stationary vehicles. The haptic subsystem, integrated with the tracking subsystem, is programmed to warn the driver of potential threats that moving or stationary vehicles may pose. All subsystems are updated and synchronised with one another in real time, producing seamless, smooth transitions between frames and a precise, immersive driving experience. The high accuracy and robustness of the proposed system make it a versatile component that can be integrated into a variety of applications to enhance a person's perception of reality. Copyright © 2020 Inderscience Enterprises Ltd.
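The abstract describes an encoded vibro-tactile channel that translates the tracked lead vehicle's state into driver warnings, but the record contains no code. As a minimal illustrative sketch of such a vision-to-touch encoding (the function name, thresholds, and weighting are hypothetical, not taken from the paper):

```python
def vibration_intensity(distance_m, closing_speed_mps, max_range_m=50.0):
    """Map the tracked lead vehicle's distance and closing speed to a
    vibration intensity in [0, 1] (0 = motor off, 1 = strongest pulse).

    Hypothetical encoding: intensity grows as the vehicle gets nearer
    and as the closing speed increases.
    """
    if distance_m >= max_range_m:
        return 0.0  # vehicle beyond warning range: no haptic feedback
    proximity = 1.0 - distance_m / max_range_m              # nearer -> larger
    urgency = max(0.0, min(1.0, closing_speed_mps / 10.0))  # faster approach -> larger
    return min(1.0, 0.6 * proximity + 0.4 * urgency)


# Example: a vehicle 12 m ahead, closing at 4 m/s, yields a mid-level warning
level = vibration_intensity(12.0, 4.0)
```

In a real pipeline, `distance_m` and `closing_speed_mps` would come from the tracking subsystem (e.g. per-frame bounding-box estimates), and the returned level would drive the vibro-tactile actuator's duty cycle.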
Pages: 114-130
Page count: 16