An investigation of camera movements and capture techniques on optical flow for real-time rendering and presentation

Cited by: 1
Authors
Modi, Nishant [1 ]
Ramakrishna, M. [2 ]
Affiliations
[1] Manipal Acad Higher Educ, Manipal Inst Technol, Dept Informat & Commun Technol, Manipal, Karnataka, India
[2] Manipal Acad Higher Educ, Manipal Inst Technol, Dept Data Sci & Comp Applicat, Manipal, Karnataka, India
Keywords
Optical flow; Camera motion; Motion tracking; Object detection
DOI
10.1007/s11554-023-01322-7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405
Abstract
New and interesting applications for portable devices include creating and viewing 3D models and 360-degree photographs of real scenes. To present a 3D model or a 360-degree view of a scene, these applications rely on real-time rendering and presentation. This study examines how camera movements and capture techniques affect real-time image processing with optical flow algorithms. Optical flow depends on the relative motion between the camera and the objects in view, which is in turn determined by how the image is recorded and how the camera moves; consequently, optical flow algorithms are sensitive to camera motion. When recording, the camera may pan, move along a straight path, or move randomly. To better replicate real-world scenarios, we captured video datasets in a variety of contexts: each dataset covers different illumination conditions, camera movements, and indoor and outdoor recording sites. To determine the most effective optical flow algorithms for near-real-time applications such as augmented reality and virtual reality, we compare the algorithms on output quality and per-frame processing delay. We also conducted a comparison study to better understand how random camera motion affects real-time video processing. These methods can be applied to a variety of real-world problems, such as object tracking, video segmentation, structure from motion, and gesture tracking.
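The abstract compares optical flow algorithms on output quality and per-frame processing delay. As a minimal illustration of what such a per-frame measurement involves (not the authors' actual pipeline, which is not specified here), the sketch below estimates a single flow vector between two synthetic frames with the classic Lucas–Kanade least-squares fit and times the computation; the window size and synthetic Gaussian blob are illustrative choices.

```python
import time
import numpy as np

def lucas_kanade_flow(prev, curr, y, x, win=7):
    """Estimate the flow vector (vx, vy) at pixel (y, x) by solving the
    Lucas-Kanade normal equations over a win x win window."""
    h = win // 2
    # Spatial gradients of the first frame and the temporal difference.
    Iy, Ix = np.gradient(prev.astype(np.float64))
    It = curr.astype(np.float64) - prev.astype(np.float64)
    # Brightness constancy per pixel: Ix*vx + Iy*vy = -It, stacked as A v = b.
    A = np.stack([Ix[y - h:y + h + 1, x - h:x + h + 1].ravel(),
                  Iy[y - h:y + h + 1, x - h:x + h + 1].ravel()], axis=1)
    b = -It[y - h:y + h + 1, x - h:x + h + 1].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (vx, vy)

# Synthetic frame pair: a smooth blob translated by one pixel in x.
yy, xx = np.mgrid[0:64, 0:64]
prev = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
curr = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50.0)

t0 = time.perf_counter()
vx, vy = lucas_kanade_flow(prev, curr, 32, 32)
dt_ms = (time.perf_counter() - t0) * 1e3
print(f"estimated flow = ({vx:.2f}, {vy:.2f}), delay = {dt_ms:.2f} ms")
```

Real comparisons of this kind typically use dense methods (e.g. Farnebäck's polynomial expansion, cited by the paper) over full video frames; the estimated vector here should be close to (1, 0), matching the one-pixel horizontal shift.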
Pages: 15