Comparing multispectral image fusion methods for a target detection task

Cited by: 12
Authors
Lanir, Joel
Maltz, Masha
Rotman, Stanley R.
Affiliations
[1] Ben Gurion Univ Negev, Dept Ind Engn & Management, IL-84804 Beer Sheva, Israel
[2] Ben Gurion Univ Negev, Dept Elect & Comp Engn, IL-84804 Beer Sheva, Israel
Keywords
image fusion; multispectral imaging; visual search; eye movements; visual interface
DOI
10.1117/1.2746248
Chinese Library Classification
O43 [Optics]
Discipline classification codes
070207; 0803
Abstract
Image fusion has gained importance with advances in multispectral imaging. We examine four different fusion methods by comparing human observers' target detection performance on the resulting fused images. Three experiments with 89 participants were conducted. In the first experiment, images with multiple targets were presented to the participants, and their hit accuracy and reaction time were measured quantitatively. In the second experiment, we implemented an approach that has not been generally used in the context of image fusion evaluation: the paired-comparison technique, used to qualitatively assess and scale the subjective value of the fusion methods. In the third experiment, participants' eye movements were recorded as they searched for targets. We introduce a novel method to compensate for eye-tracker precision limitations and to enable analysis of eye movement data across different image samples, even for detection tasks with small targets. Results indicated that the false color and principal components fusion methods performed best across all experiments. (c) 2007 Society of Photo-Optical Instrumentation Engineers.
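The abstract names principal components fusion among the best-performing methods and highlights the paired-comparison technique for subjective scaling. As a rough illustration only (neither sketch comes from the paper, and all function names, parameters, and data here are hypothetical), the following Python code shows one common way each idea is realized, assuming NumPy and SciPy are available: projecting co-registered spectral bands onto their first principal component, and converting a paired-comparison win matrix into Thurstone Case V scale values.

import numpy as np
from scipy.stats import norm

def pca_fuse(bands):
    """Fuse co-registered single-channel bands by projecting each pixel's
    spectral vector onto the first principal component (illustrative sketch)."""
    stack = np.stack([np.asarray(b, dtype=float).ravel() for b in bands], axis=1)  # pixels x bands
    stack -= stack.mean(axis=0)                        # center each band
    cov = np.cov(stack, rowvar=False)                  # bands x bands covariance
    _, eigvecs = np.linalg.eigh(cov)                   # eigenvectors, ascending eigenvalue order
    fused = stack @ eigvecs[:, -1]                     # project onto the leading component
    fused = (fused - fused.min()) / (fused.max() - fused.min() + 1e-12)
    return fused.reshape(np.asarray(bands[0]).shape)

def thurstone_case_v(wins):
    """Turn a paired-comparison win matrix into Thurstone Case V scale values.
    wins[i, j] = number of times method i was preferred over method j."""
    wins = np.asarray(wins, dtype=float)
    trials = wins + wins.T                             # comparisons per pair
    p = np.where(trials > 0, wins / np.where(trials > 0, trials, 1), 0.5)
    p = np.clip(p, 0.01, 0.99)                         # keep z-scores finite
    np.fill_diagonal(p, 0.5)
    z = norm.ppf(p)                                    # unit-normal deviates
    return z.mean(axis=1)                              # larger value = more preferred

# Hypothetical usage: two 64x64 bands and a 4-method preference matrix.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    visible, thermal = rng.random((64, 64)), rng.random((64, 64))
    print(pca_fuse([visible, thermal]).shape)          # -> (64, 64)
    wins = np.array([[0, 8, 6, 7],
                     [2, 0, 4, 5],
                     [4, 6, 0, 6],
                     [3, 5, 4, 0]])
    print(thurstone_case_v(wins))                      # relative subjective scale values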
Pages: 8