Predicting Eyes' Fixations in Movie Videos: Visual Saliency Experiments on a New Eye-Tracking Database

Cited by: 0
Authors
Koutras, Petros [1 ]
Katsamanis, Athanasios [1 ]
Maragos, Petros [1 ]
Affiliations
[1] Natl Tech Univ Athens, Sch Elect & Comp Engn, GR-15773 Athens, Greece
Keywords
Eye-tracking Database; Visual Saliency; Spatio-Temporal Visual Frontend; 3D Gabor Filters; Lab Color Space; MODEL; ATTENTION;
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract
In this paper we describe the newly created Eye-Tracking Movie Database (ETMD), an eye-tracking-annotated video database, and present preliminary experimental results on it using our new visual saliency frontend. The database comprises video clips from Hollywood movies annotated with human eye-tracking data; the clips are longer than those in existing databases and contain more complex semantics. Our proposed visual saliency frontend combines low-level features, such as intensity, color, and spatio-temporal energy, with face detection results, and produces a single saliency volume map. The new eye-tracking database can be useful in many applications, while our computational frontend appears promising, giving good results at predicting eye fixations according to several metrics.
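To make the abstract's pipeline concrete, the following is a minimal, hypothetical sketch of a saliency-volume computation over a grayscale video: a static center-surround intensity contrast cue is fused with a temporal energy cue (simple frame differencing stands in for the paper's 3D Gabor filter bank, and the Lab color and face-detection cues are omitted). All function names, the box-blur surround estimate, and the equal-weight fusion are illustrative assumptions, not the authors' actual frontend.

```python
import numpy as np

def box_blur(img, k=9):
    """Separable box blur used as a cheap surround estimate (assumption:
    the real model would use Gaussian or Gabor filtering)."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return out

def saliency_volume(frames):
    """frames: (T, H, W) float array of grayscale video frames.
    Returns a (T, H, W) saliency volume, each frame normalized to [0, 1]."""
    frames = frames.astype(np.float64)
    sal = np.zeros_like(frames)
    for t in range(frames.shape[0]):
        center = frames[t]
        # Static cue: center-surround intensity contrast.
        static = np.abs(center - box_blur(center, k=9))
        # Temporal cue: frame differencing as a crude spatio-temporal energy.
        motion = np.abs(frames[t] - frames[t - 1]) if t > 0 else np.zeros_like(center)
        s = static + motion  # equal-weight fusion (illustrative choice)
        rng = s.max() - s.min()
        sal[t] = (s - s.min()) / rng if rng > 0 else s
    return sal
```

The per-frame normalization makes the resulting volume directly comparable against fixation maps when computing evaluation metrics such as AUC or NSS.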
Pages: 183-194
Page count: 12
Related papers
25 records
  • [1] Visual saliency in captioned digital videos and learning of English collocations: An eye-tracking study
    Choi, Sungmook
    LANGUAGE LEARNING & TECHNOLOGY, 2023, 27 (01): : 28 - 28
  • [2] An elaborate algorithm for automatic processing of eye movement data and identifying fixations in eye-tracking experiments
    Liu, Bo
    Zhao, Qi-Chao
    Ren, Yuan-Yuan
    Wang, Qing-Ju
    Zheng, Xue-Lian
    ADVANCES IN MECHANICAL ENGINEERING, 2018, 10 (05)
  • [3] An interactive eye-tracking system for measuring radiologists' visual fixations in volumetric CT images: Implementation and initial eye-tracking accuracy validation
    Gong, Hao
    Hsieh, Scott S.
    Holmes, David R., III
    Cook, David A.
    Inoue, Akitoshi
    Bartlett, David J.
    Baffour, Francis
    Takahashi, Hiroaki
    Leng, Shuai
    Yu, Lifeng
    McCollough, Cynthia H.
    Fletcher, Joel G.
    MEDICAL PHYSICS, 2021, 48 (11) : 6710 - 6723
  • [4] All eyes on the signal? - Mapping cohesive discourse structures with eye-tracking data of explanation videos
    Thiele, Leandra
    Schmidt-Borcherding, Florian
    Bateman, John A.
    FRONTIERS IN COMMUNICATION, 2024, 9
  • [5] Audio-visual interactive evaluation of the forest landscape based on eye-tracking experiments
    Liu, Yiping
    Hu, Mengjun
    Zhao, Bing
    URBAN FORESTRY & URBAN GREENING, 2019, 46
  • [6] Influencer marketing in the eyes of young adults: An eye-tracking study into visual content strategies on Instagram
    de Vooght, Edward
    De Veirman, Marijke
    TIJDSCHRIFT VOOR COMMUNICATIEWETENSCHAP, 2023, 51 (03): : 236 - 260
  • [7] Eye-tracking as a new method to assess visual function in children with cerebral visual impairment
    Vermaak, M.
    Manders, J.
    van der Geest, J.
    Van der Steen, H.
    Evenhuis, H. M.
    JOURNAL OF INTELLECTUAL DISABILITY RESEARCH, 2008, 52 : 673 - 673
  • [8] Visual time period analysis: a multimedia analytics application for summarizing and analyzing eye-tracking experiments
    Del Fatto, Vincenzo
    Dignos, Anton
    Raimato, Guerriero
    Maccioni, Lorenzo
    Borgianni, Yuri
    Gamper, Johann
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (23) : 32779 - 32804
  • [10] Benchmark three-dimensional eye-tracking dataset for visual saliency prediction on stereoscopic three-dimensional video
    Banitalebi-Dehkordi, Amin
    Nasiopoulos, Eleni
    Pourazad, Mahsa T.
    Nasiopoulos, Panos
    JOURNAL OF ELECTRONIC IMAGING, 2016, 25 (01)