A Dataset of Head and Eye Movements for 360° Videos

Cited by: 140
Authors
David, Erwan J. [1 ]
Gutierrez, Jesus [1 ]
Coutrot, Antoine [1 ]
Da Silva, Matthieu Perreira [1 ]
Le Callet, Patrick [1 ]
Affiliations
[1] Univ Nantes, CNRS, UMR 6004, LS2N, Nantes, France
Source
PROCEEDINGS OF THE 9TH ACM MULTIMEDIA SYSTEMS CONFERENCE (MMSYS'18) | 2018
Keywords
Omnidirectional video; 360 degrees videos; dataset; eye-tracking; saliency; gaze behavior;
DOI
10.1145/3204949.3208139
Chinese Library Classification (CLC)
TP [Automation and computer technology];
Discipline classification code
0812;
Abstract
Research on visual attention in 360° content is crucial to understand how people perceive and interact with this immersive type of content, to develop efficient techniques for processing, encoding, delivering, and rendering it, and to offer a high quality of experience to end users. The availability of public datasets is essential to support and facilitate the research activities of the community. Recently, some studies have analyzed the exploration behavior of people watching 360° videos, and a few datasets have been published. However, most of these works consider only head movements as a proxy for gaze data, despite the importance of eye movements in the exploration of omnidirectional content. This paper therefore presents a novel dataset of 360° videos with associated eye and head movement data, a follow-up to our previous dataset for still images [14]. Head and eye tracking data was obtained from 57 participants during a free-viewing experiment with 19 videos. In addition, guidelines on how to obtain saliency maps and scanpaths from the raw data are provided, along with statistics on exploration behavior, including the impact of the longitudinal starting position when watching omnidirectional videos, which was investigated in this test. The dataset and its associated code are made publicly available to support research on visual attention for 360° content.
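The abstract mentions that the dataset ships with guidelines for turning raw gaze data into saliency maps. As a minimal illustrative sketch only (not the authors' actual pipeline, whose details are in the paper and its code), a common approach is to accumulate fixation positions on an equirectangular grid and spread each with a Gaussian, wrapping horizontally so the map stays continuous across the ±180° longitude seam; the grid size, sigma, and the `saliency_map` helper below are assumptions for illustration:

```python
import numpy as np

def saliency_map(fixations, width=200, height=100, sigma_deg=2.0):
    """Illustrative fixation-based saliency map on an equirectangular grid.

    fixations: iterable of (longitude, latitude) pairs in degrees,
               longitude in [-180, 180), latitude in [-90, 90).
    Returns an array of shape (height, width) normalized to [0, 1].
    """
    sal = np.zeros((height, width))
    # convert the smoothing sigma from degrees to pixels along longitude
    sigma_px = sigma_deg * width / 360.0
    ys = np.arange(height)[:, None]
    xs = np.arange(width)[None, :]
    for lon, lat in fixations:
        cx = (lon + 180.0) / 360.0 * width   # fixation column (pixels)
        cy = (lat + 90.0) / 180.0 * height   # fixation row (pixels)
        # horizontal distance with wrap-around across the +/-180 seam
        dx = np.minimum(np.abs(xs - cx), width - np.abs(xs - cx))
        dy = ys - cy
        sal += np.exp(-(dx**2 + dy**2) / (2.0 * sigma_px**2))
    if sal.max() > 0:
        sal /= sal.max()  # normalize peak to 1
    return sal
```

Note that this sketch ignores the latitudinal distortion of the equirectangular projection (pixels near the poles cover less solid angle); a more faithful map would blur on the sphere or weight rows by cos(latitude).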
Pages: 432-437 (6 pages)
References
24 in total
[1]  
[Anonymous], 2018, SIGNAL PROCESSING IM
[2]  
[Anonymous], 2000, P 2000 S EYE TRACK R, DOI 10.1145/355017.355028
[3]  
[Anonymous], P ACM MMSYS 17
[4]  
[Anonymous], [No title captured]
[5]  
Bylinskii Z., 2016, What do different evaluation metrics tell us about saliency models
[6]  
Corbillon Xavier, De Simone Francesca, Simon Gwendal. 360-Degree Video Head Movement Dataset. PROCEEDINGS OF THE 8TH ACM MULTIMEDIA SYSTEMS CONFERENCE (MMSYS'17), 2017: 199-204
[7]  
Corbillon Xavier, Simon Gwendal, Devlic Alisa, Chakareski Jacob. Viewport-Adaptive Navigable 360-Degree Video Delivery. 2017 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2017
[8]  
Duchowski Andrew, 2002, EUR 2002 SHORT PRES, DOI 10.2312/egs.20021022
[9]  
Hu Bin, Luo Li-ming, Yang Pei, Huang Tai-gui, Zhang Li-ping. Research on Parallel Processing Framework of Power Big Data. 2017 3RD INTERNATIONAL CONFERENCE ON COMPUTATIONAL SYSTEMS AND COMMUNICATIONS (ICCSC 2017), 2017: 1-7
[10]  
International Telecommunication Union, 2008, SUBJ VID QUAL ASS ME