Decoupling of Real and Digital Content in Projection-based Augmented Reality Systems Using Time Multiplexed Image Capture

Cited by: 5
Authors
Soomro, Shoaib R. [1 ]
Ulusoy, Erdem [1 ]
Urey, Hakan [1 ]
Affiliations
[1] Koc Univ, Dept Elect Engn, Opt Microsys Lab, TR-34450 Istanbul, Turkey
Funding
European Research Council
Keywords
DOI
10.2352/J.ImagingSci.Technol.2017.61.1.010406
Chinese Library Classification (CLC)
TB8 [Photographic technology]
Discipline Classification Code
0804
Abstract
Projection-based augmented reality systems overlay digital information directly on real objects while simultaneously using cameras to capture scene information. A common problem with such systems is that the cameras see the projected image to some degree in addition to the real objects. This crosstalk degrades object detection and digital content registration. The authors propose a novel time-sharing technique that decouples the real and digital content in real time without crosstalk. The technique is based on time-sequential operation of a MEMS scanner-based mobile projector and a rolling-shutter image sensor. The MEMS mirror-based projector scans a light beam in a raster pattern, pixel by pixel, completing a full frame over one refresh period, while the rolling-shutter image sensor collects scene light sequentially, row by row. In the proposed technique, the image sensor is synchronized with the scanning MEMS mirror and follows the display scanner with a precise half-period lag, making the displayed content completely invisible to the camera. An experimental setup consisting of a laser pico-projector, an image sensor, and a delay-and-amplifier circuit is developed. The performance of the proposed technique is evaluated by measuring the crosstalk in the captured content and the sensor exposure limit. The results show 0% crosstalk in the captured content for sensor exposures up to 8 ms. A high capture frame rate (up to 45 fps) is achieved by cyclically triggering a 3.2 MP, 60 fps CMOS sensor together with a 60 Hz pico-projector. (C) 2017 Society for Imaging Science and Technology.
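The half-period lag between the scanner and the sensor readout can be illustrated with a short timing sketch. The Python snippet below is not the authors' implementation; the row count, frame rate, exposure value, and all function names are illustrative assumptions. It checks, for each sensor row, whether the row's exposure window overlaps the instant at which the raster scanner writes that same row, which is the condition under which projected content would leak into the capture.

# Minimal timing sketch (illustrative assumptions, not the authors' code):
# a 60 Hz raster-scanned projector writes rows top to bottom over one refresh
# period, while a rolling-shutter sensor exposes rows in the same order.
# Delaying the sensor readout by half a refresh period keeps each row's
# exposure window clear of the moment the projector illuminates that row.

REFRESH_PERIOD_MS = 1000.0 / 60.0   # projector frame period (60 Hz)
NUM_ROWS = 1080                      # assumed number of sensor/projector rows
EXPOSURE_MS = 8.0                    # per-row exposure (paper reports 0% crosstalk up to 8 ms)
LAG_MS = REFRESH_PERIOD_MS / 2.0     # half-period lag between scanner and sensor


def projector_hits_row(row):
    """Time (ms, within one refresh period) at which the scanner writes this row."""
    return REFRESH_PERIOD_MS * row / NUM_ROWS


def row_exposure_window(row):
    """Exposure start/end (ms) for this sensor row, delayed by the half-period lag."""
    start = (LAG_MS + REFRESH_PERIOD_MS * row / NUM_ROWS) % REFRESH_PERIOD_MS
    return start, start + EXPOSURE_MS


def row_has_crosstalk(row):
    """True if the projector writes the row while that same row is being exposed."""
    start, end = row_exposure_window(row)
    hit = projector_hits_row(row)
    # Check the hit in this period and the next, since the window may wrap past the period end.
    return any(start <= t < end for t in (hit, hit + REFRESH_PERIOD_MS))


if __name__ == "__main__":
    dirty = [r for r in range(NUM_ROWS) if row_has_crosstalk(r)]
    print(f"rows with crosstalk: {len(dirty)} of {NUM_ROWS}")

With these assumed values the sketch reports zero rows with crosstalk; raising EXPOSURE_MS toward the half-period (about 8.3 ms at 60 Hz) makes the exposure windows start to overlap the scan and crosstalk reappears, which is consistent with the 8 ms exposure limit reported in the abstract.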
Pages: 6