Gaze-following and joint visual attention in nonhuman animals

Cited by: 42
Author
Itakura, S [1]
Affiliation
[1] Kyoto Univ, Dept Psychol, Grad Sch Letters, Sakyo Ku, Kyoto 6068501, Japan
Keywords
gaze-following; joint visual attention; theory of mind; nonhuman animal;
DOI
10.1111/j.1468-5584.2004.00253.x
Chinese Library Classification
B84 [Psychology]
Subject Classification Codes
04; 0402
Abstract
In this paper, studies of gaze-following and joint visual attention in nonhuman animals are reviewed from the theoretical perspective of Emery (2000). There are many studies of gaze-following and joint visual attention in nonhuman primates, concerning not only adult individuals but also the development of these abilities. Studies to date suggest that both monkeys and apes can follow the gaze of others, but only apes understand the seeing-knowing relationship with regard to conspecifics in competitive situations. There have also been recent reports that domestic animals that interact with humans, such as dogs and horses, can follow human gaze; these animals are considered to have acquired this ability over their long history of selective breeding by humans. However, social gaze parameters need to be clarified across a wider range of species to improve our understanding of how the processing of others' gaze, attention, and mental states evolved.
Pages: 216 - 226
Page count: 11
Related Papers
50 items in total
  • [41] Human ostensive signals do not enhance gaze following in chimpanzees, but do enhance object-oriented attention
    Kano, Fumihiro
    Moore, Richard
    Krupenye, Christopher
    Hirata, Satoshi
    Tomonaga, Masaki
    Call, Josep
    ANIMAL COGNITION, 2018, 21 (05) : 715 - 728
  • [43] Robust Joint Visual Attention for HRI Using a Laser Pointer for Perspective Alignment and Deictic Referring
    Maravall, Dario
    de Lope, Javier
    Pablo Fuentes, Juan
    BIOMEDICAL APPLICATIONS BASED ON NATURAL AND ARTIFICIAL COMPUTING, PT II, 2017, 10338 : 127 - 136
  • [45] How do children learn to follow gaze, share joint attention, imitate their teachers, and use tools during social interactions?
    Grossberg, Stephen
    Vladusich, Tony
    NEURAL NETWORKS, 2010, 23 (8-9) : 940 - 965
  • [46] Patterns of gaze behavior during an eye-tracking measure of joint attention in typically developing children and children with autism spectrum disorder
    Swanson, Meghan R.
    Siller, Michael
    RESEARCH IN AUTISM SPECTRUM DISORDERS, 2013, 7 (09) : 1087 - 1096
  • [47] Look at Grandma! Joint visual attention over video chat during the COVID-19 pandemic
    Myers, Lauren J.
    Strouse, Gabrielle A.
    McClure, Elisabeth R.
    Keller, Krystyna R.
    Neely, Lucinda I.
    Stoto, Isabella
    Vadakattu, Nithya S.
    Kim, Erin D.
    Troseth, Georgene L.
    Barr, Rachel
    Zosh, Jennifer M.
    INFANT BEHAVIOR & DEVELOPMENT, 2024, 75
  • [48] Visual attention patterns during a gaze following task in neurogenetic syndromes associated with unique profiles of autistic traits: Fragile X and Cornelia de Lange syndromes
    Ellis, Katherine
    White, Sarah
    Dziwisz, Malwina
    Agarwal, Paridhi
    Moss, Jo
    CORTEX, 2024, 174 : 110 - 124
  • [49] Leveraging mobile eye-trackers to capture joint visual attention in co-located collaborative learning groups
    Schneider, Bertrand
    Sharma, Kshitij
    Cuendet, Sebastien
    Zufferey, Guillaume
    Dillenbourg, Pierre
    Pea, Roy
    INTERNATIONAL JOURNAL OF COMPUTER-SUPPORTED COLLABORATIVE LEARNING, 2018, 13 (03) : 241 - 261