[DEMO] User Friendly Calibration and Tracking for Optical Stereo See-Through Augmented Reality

Cited by: 0
Authors
Wientapper, Folker [1 ]
Engelke, Timo [1 ]
Keil, Jens [1 ]
Wuest, Harald [1 ]
Mensik, Johanna [1 ]
Affiliations
[1] Fraunhofer IGD, Darmstadt, Germany
Source
2014 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR) - SCIENCE AND TECHNOLOGY | 2014
Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, augmented, and virtual realities; I.4.1 [Image Processing and Computer Vision]: Digitization and Image Capture - Camera calibration; I.4.8 [Image Processing and Computer Vision]: Scene Analysis - Tracking; G.1.6 [Numerical Analysis]: Optimization - Least squares methods;
DOI
Not available
CLC number
TP3 [Computing Technology, Computer Technology];
Discipline code
0812;
Abstract
Optical see-through head-mounted displays (OST-HMDs) have been a focus of development since the earliest days of Augmented Reality (AR), and the first affordable devices and prototypes are now reaching the market. Beyond common technical problems such as achieving an adequate field of view, reducing weight, and other issues related to the miniaturization of these systems, a crucial requirement for AR is the calibration of such a device with respect to the individual user, so that augmentations are properly aligned. Our demonstrator shows a practical solution to this problem, together with a fully featured example application for a typical maintenance use case, built on a generalized framework for application creation. We describe the technical background and procedure of the calibration, the tracking approach that takes the device's sensors into account, user-experience factors, and the overall implementation procedure. We present our demonstrator on an Epson Moverio BT-200 OST-HMD.
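The abstract does not spell out the calibration math, but the keywords (camera calibration, least-squares methods) suggest a user-dependent projection estimate in the spirit of SPAAM-style OST-HMD calibration. The sketch below is a hypothetical illustration of that idea, not the authors' implementation: from 2D screen points that the user has visually aligned with known 3D points in the tracking frame, it estimates a 3x4 eye-to-display projection by homogeneous least squares (DLT). The function names (estimate_projection, project) and the use of NumPy are assumptions made for illustration only.

# Hypothetical SPAAM-style least-squares sketch (illustrative, not the
# authors' method). Given 3D points in the tracker/world frame and the 2D
# screen positions where the user aligned a crosshair with them, estimate a
# 3x4 projection P mapping world points to display pixels.
import numpy as np

def estimate_projection(world_pts, screen_pts):
    """world_pts: (N, 3) array, screen_pts: (N, 2) array, N >= 6."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, screen_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(A)
    # Homogeneous least squares: the right singular vector belonging to the
    # smallest singular value minimizes ||A p|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

def project(P, world_pt):
    """Map one 3D world point to display pixel coordinates."""
    x = P @ np.append(world_pt, 1.0)
    return x[:2] / x[2]

In such a scheme the per-user alignment samples are typically collected once per eye; a stereo display would yield one projection per eye, which is consistent with the stereo calibration the title refers to.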
Pages: 385 - 386
Page count: 2
Related papers
50 in total
  • [1] User Friendly Calibration for Tracking of Optical Stereo See-Through Head Worn Displays for Augmented Reality
    Bernard, Felix
    Engelke, Timo
    Kuijper, Arjan
    2017 INTERNATIONAL CONFERENCE ON CYBERWORLDS (CW), 2017, : 33 - 40
  • [2] Optical see-through HMD calibration: A stereo method validated with a video see-through system
    Genc, Y
    Sauer, F
    Wenzel, F
    Tuceryan, M
    Navab, N
    IEEE AND ACM INTERNATIONAL SYMPOSIUM ON AUGMENTED REALITY, PROCEEDING, 2000, : 165 - 174
  • [3] Perceived Transparency in Optical See-Through Augmented Reality
    Zhang, Lili
    Murdoch, Michael J.
    2021 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY ADJUNCT PROCEEDINGS (ISMAR-ADJUNCT 2021), 2021, : 115 - 120
  • [4] Brightness matching in optical see-through augmented reality
    Murdoch, Michael J.
    JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A-OPTICS IMAGE SCIENCE AND VISION, 2020, 37 (12) : 1927 - 1936
  • [5] Assessment of Optical See-Through Head Mounted Display Calibration for Interactive Augmented Reality
    Ballestin, Giorgio
    Chessa, Manuela
    Solari, Fabio
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW), 2019, : 4452 - 4460
  • [6] Tracking registration of optical see-through augmented reality based on the Riemannian manifold constraint
    An, Zhe
    Liu, Yang
    OPTICS EXPRESS, 2022, 30 (26): 46418 - 46434
  • [7] Egocentric depth judgments in optical, see-through augmented reality
    Swan, J. Edward, II
    Jones, Adam
    Kolstad, Eric
    Livingston, Mark A.
    Smallman, Harvey S.
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2007, 13 (03) : 429 - 442
  • [8] User Evaluation of See-Through Vision for Mobile Outdoor Augmented Reality
    Avery, Benjamin
    Thomas, Bruce H.
    Piekarski, Wayne
    7TH IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY 2008, PROCEEDINGS, 2008, : 69 - 72
  • [9] ASTOR: An autostereoscopic optical see-through augmented reality system
    Olwal, A
    Lindfors, C
    Gustafsson, J
    Kjellberg, T
    Mattsson, L
    INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, PROCEEDINGS, 2005, : 24 - 27
  • [10] Investigating color appearance in optical see-through augmented reality
    Hassani, Nargess
    Murdoch, Michael J.
    COLOR RESEARCH AND APPLICATION, 2019, 44 (04): 492 - 507