Altitude control in honeybees: joint vision-based learning and guidance

Cited: 15
Authors
Portelli, Geoffrey [1 ,2 ]
Serres, Julien R. [1 ]
Ruffier, Franck [1 ]
Affiliations
[1] Aix Marseille Univ, CNRS, ISM, Marseille, France
[2] Univ Cote Azur, CNRS, I3S, Sophia Antipolis, France
Source
SCIENTIFIC REPORTS | 2017, Vol. 7
Keywords
VISUAL CONTROL; FLIGHT SPEED; OPTIC LOBE; NAVIGATION; HEIGHT; MOTION;
DOI
10.1038/s41598-017-09112-5
Chinese Library Classification
O [Mathematical sciences and chemistry]; P [Astronomy and earth sciences]; Q [Biological sciences]; N [Natural sciences (general)];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Studies on insects' visual guidance systems have shed little light on how learning contributes to insects' altitude control. In this study, honeybees were trained to fly along a double-roofed tunnel after entering it near either the ceiling or the floor. The honeybees trained to hug the ceiling therefore encountered a sudden change in the tunnel configuration midway: a "dorsal ditch". These trained honeybees thus met a sudden increase in the distance to the ceiling, corresponding to a sudden, strong change in the visual cues available in their dorsal field of view. They reacted by rising quickly and hugging the new, higher ceiling, keeping a forward speed, distance to the ceiling and dorsal optic flow similar to those observed during the training step; whereas bees trained to follow the floor kept on following the floor regardless of the change in ceiling height. When trained honeybees entered the tunnel via the entry (lower or upper) other than the one used during training, they quickly changed their altitude and hugged the surface they had previously learned to follow. These findings clearly show that trained honeybees control their altitude based on visual cues memorized during training. The memorized visual cues generated by the surfaces followed form a complex optic flow pattern: trained honeybees may attempt to match the visual cues they perceive with this memorized optic flow pattern by controlling their altitude.
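The behavior the abstract describes is often modeled as optic-flow set-point regulation: the dorsal optic flow scales as forward speed over distance to the ceiling (omega = V/d), so restoring a memorized omega after a "dorsal ditch" forces a climb back to the learned clearance. The following is a minimal illustrative sketch of that idea, not the authors' model; the parameter values, the proportional controller, and the `simulate` function are all assumptions chosen for clarity.

```python
def simulate(ceiling, v=1.0, omega_ref=2.0, gain=0.5, dt=0.05, steps=400):
    """Altitude trace of an agent regulating dorsal optic flow.

    ceiling(t): ceiling height over time (hypothetical step profile).
    omega_ref:  memorized dorsal optic flow set-point (v / clearance).
    A simple proportional law climbs when perceived flow is too weak
    and descends when it is too strong (illustrative, not the paper's).
    """
    z = ceiling(0.0) - v / omega_ref      # start at the memorized clearance
    trace = []
    for k in range(steps):
        t = k * dt
        d = max(ceiling(t) - z, 1e-6)     # distance to the ceiling
        omega = v / d                     # dorsal optic flow magnitude
        z += gain * (omega_ref - omega) * dt   # climb to restore omega
        trace.append(z)
    return trace

# "Dorsal ditch": ceiling jumps from 1.0 m to 1.5 m at t = 10 s.
altitudes = simulate(lambda t: 1.0 if t < 10 else 1.5)
```

Before the ditch the agent cruises 0.5 m below the ceiling (omega at set-point); after the jump the perceived flow halves, so the regulator climbs roughly 0.5 m and hugs the new ceiling at the same clearance, mirroring the bees' behavior.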
Pages: 10
Related papers
50 records in total
  • [21] Vision-Based Traffic Conflict Detection Using Trajectory Learning and Prediction
    Sun, Zongyuan
    Chen, Yuren
    Wang, Pin
    Fang, Shouen
    Tang, Boming
    IEEE ACCESS, 2021, 9 : 34558 - 34569
  • [22] VIOLA: Imitation Learning for Vision-Based Manipulation with Object Proposal Priors
    Zhu, Yifeng
    Joshi, Abhishek
    Stone, Peter
    Zhu, Yuke
    CONFERENCE ON ROBOT LEARNING, VOL 205, 2022, 205 : 1199 - 1210
  • [23] Vision-based Mobile Robot Map Building and Environment Fuzzy Learning
    Al Muteb, Khaled
    PROCEEDINGS FIFTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS, MODELLING AND SIMULATION, 2014, : 43 - 48
  • [24] Fast, Intuitive, Vision-Based: Performance Metrics for Visual Registration, Instrument Guidance, and Image Fusion
    Basafa, Ehsan
    Hossbach, Martin
    Stolka, Philipp J.
    CLINICAL IMAGE-BASED PROCEDURES: TRANSLATIONAL RESEARCH IN MEDICAL IMAGING, 2016, 9958 : 9 - 17
  • [25] Computer Vision-Based Guidance Assistance Concept for Plowing Using RGB-D Camera
    Turkoz, Erkin
    Olcay, Ertug
    Oksanen, Timo
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGING SYSTEMS AND TECHNIQUES (IST), 2021,
  • [26] Efficient vision-based navigation
    Hornung, Armin
    Bennewitz, Maren
    Strasdat, Hauke
    AUTONOMOUS ROBOTS, 2010, 29 (02) : 137 - 149
  • [27] Vision-based control for helicopter ship landing with handling qualities constraints
    Quang Huy Truong
    Rakotomamonjy, Thomas
    Taghizad, Armin
    Biannic, Jean-Marc
    IFAC PAPERSONLINE, 2016, 49 (17): : 118 - 123
  • [28] Vision-Based Distributed Formation Control Without an External Positioning System
    Montijano, Eduardo
    Cristofalo, Eric
    Zhou, Dingjiang
    Schwager, Mac
    Sagüés, Carlos
    IEEE TRANSACTIONS ON ROBOTICS, 2016, 32 (02) : 339 - 351
  • [29] Adaptive intelligent vision-based control of a flexible-link manipulator
    Sahu, Umesh Kumar
    Patra, Dipti
    Subudhi, Bidyadhar
    ELECTRICAL ENGINEERING, 2023, 105 (5) : 3263 - 3281
  • [30] Micro object handling under SEM by vision-based automatic control
    Kasaya, T
    Miyazaki, H
    Saito, S
    Sato, T
    MICROROBOTICS AND MICROMANIPULATION, 1998, 3519 : 181 - 192