Roboscan: a combined 2D and 3D vision system for improved speed and flexibility in pick-and-place operation

Cited by: 0
Authors
Paolo Bellandi
Franco Docchio
Giovanna Sansoni
Affiliation
[1] University of Brescia, Laboratory of Optoelectronics
Source
The International Journal of Advanced Manufacturing Technology | 2013, Vol. 69
Keywords
Machine vision; Robotic manipulation; Pattern matching; 2D and 3D calibration; Blob analysis; 3D segmentation;
DOI
Not available
Abstract
We describe Roboscan, a robot cell that combines 2D and 3D vision in a simple device to aid a robot manipulator in performing pick-and-place operations quickly and accurately. The optical head of Roboscan combines the two vision systems: the camera is used stand-alone in the 2D system and combined with a laser slit projector in the 3D system, which operates in triangulation mode. The 2D system, using suitable libraries, provides preliminary 2D information to the 3D system so that point-cloud segmentation and fitting can be performed in a fast, flexible, and robust way. Roboscan is mounted onto an anthropomorphic, six-degree-of-freedom (6-DOF) robot manipulator. The most innovative aspect of the system is the use of robust 2D geometric template matching to classify 3D objects. In this way, we avoid time-consuming 3D point-cloud segmentation and 3D object classification, using 3D data only to estimate the position and orientation of the robot gripper. In addition, we propose a novel approach to template definition in 2D geometric template matching that minimizes the influence of the objects' surface reflectance and colour on the template geometry. We describe the 2D and 3D vision procedures of Roboscan, together with the calibration procedures that have been implemented, and present a set of tests that demonstrate the performance of the system and its effectiveness in a number of pick-and-place operations.
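The abstract states that the 3D system pairs the camera with a laser slit projector operating in triangulation mode. As a rough illustration of the geometry involved (a generic planar laser-triangulation sketch under stated assumptions, not the authors' implementation; the function name, baseline, and angle parameters are illustrative), the range to a point lit by the laser sheet can be recovered from the two viewing angles:

```python
import math

def triangulate(baseline, cam_angle, laser_angle):
    """Planar laser-triangulation sketch.

    Camera at the origin, laser emitter at (baseline, 0); both angles
    are measured from the baseline toward the scene. By the law of
    sines, the range along the camera ray is
        t = baseline * sin(laser_angle) / sin(laser_angle - cam_angle).
    Returns the (x, z) coordinates of the illuminated surface point.
    """
    t = baseline * math.sin(laser_angle) / math.sin(laser_angle - cam_angle)
    return t * math.cos(cam_angle), t * math.sin(cam_angle)

# Reconstruct a known point at (1, 2) with a 3-unit baseline:
phi = math.atan2(2, 1)        # angle at which the camera sees the spot
theta = math.atan2(2, 1 - 3)  # angle of the laser ray from (3, 0) to (1, 2)
x, z = triangulate(3.0, phi, theta)
# x ≈ 1.0, z ≈ 2.0
```

In practice the camera angle is derived from the pixel position of the imaged laser line via the camera model, which is why the 2D and 3D calibration procedures described in the paper are essential to the ranging accuracy.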
Pages: 1873-1886
Number of pages: 13
Related papers
50 records in total
  • [1] Roboscan: a combined 2D and 3D vision system for improved speed and flexibility in pick-and-place operation
    Bellandi, Paolo
    Docchio, Franco
    Sansoni, Giovanna
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2013, 69 (5-8) : 1873 - 1886
  • [2] TINA - A 3D VISION SYSTEM FOR PICK AND PLACE
    PORRILL, J
    POLLARD, SB
    PRIDMORE, TP
    BOWEN, JB
    MAYHEW, JEW
    FRISBY, JP
    IMAGE AND VISION COMPUTING, 1988, 6 (02) : 91 - 99
  • [3] MULTIPROCESSOR 3D VISION SYSTEM FOR PICK AND PLACE
    RYGOL, M
    POLLARD, S
    BROWN, C
    IMAGE AND VISION COMPUTING, 1991, 9 (01) : 33 - 38
  • [4] 3D Vision-guided Pick-and-Place Using Kuka LBR iiwa Robot
    Niu, Hanlin
    Ji, Ze
    Zhu, Zihang
    Yin, Hujun
    Carrasco, Joaquin
    2021 IEEE/SICE INTERNATIONAL SYMPOSIUM ON SYSTEM INTEGRATION (SII), 2021, : 592 - 593
  • [5] 2D vision in a 3D System
    Flinders, M
    INTELLIGENT ROBOTS AND COMPUTER VISION XX: ALGORITHMS, TECHNIQUES, AND ACTIVE VISION, 2001, 4572 : 422 - 426
  • [6] Pick-and-place process sequencing for transformation of rasterized 3D structures
    Pan, Wei
    Chen, Lujie
    Dritsas, Stylianos
    AUTOMATION IN CONSTRUCTION, 2017, 75 : 56 - 64
  • [7] Fabrication of 3D quantum optical devices by pick-and-place forming
    Miyazaki, H
    Sato, T
    NINTH ANNUAL INTERNATIONAL WORKSHOP ON MICRO ELECTRO MECHANICAL SYSTEMS, IEEE PROCEEDINGS: AN INVESTIGATION OF MICRO STRUCTURES, SENSORS, ACTUATORS, MACHINES AND SYSTEMS, 1996, : 318 - 324
  • [8] Conceptual Design of a Pick-and-Place 3D Nanoprinter for Materials Synthesis
    Carlson, Max B.
    Yau, Kayen K.
    Simpson, Robert E.
    Short, Michael P.
    3D PRINTING AND ADDITIVE MANUFACTURING, 2015, 2 (03) : 123 - 130
  • [9] Model-Based 3D Pose Estimation for Pick-and-Place Application
    Liang, Shih-Cheng
    Lin, Huei-Yung
    Chang, Chin-Chen
    PROCEEDINGS OF THE FIFTEENTH IAPR INTERNATIONAL CONFERENCE ON MACHINE VISION APPLICATIONS - MVA2017, 2017, : 412 - 415
  • [10] Pick and Place of Large Object Based on 3D Vision
    Wu, Hsien-Huang
    Xie, Jia-Kun
    PROCEEDINGS OF THE 2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL LIFE AND ROBOTICS (ICAROB2020), 2020, : 143 - 146