Improved Neural Network using Integral-RELU based Prevention Activation for Face Detection
Citations: 0
Authors:
Kirana, Kartika Candra [1]; Wibawanto, Slamet [1]; Hidayah, Nur [2]; Cahyono, Gigih Prasetyo [3]; Asfani, Khoirudin [4]
Affiliations:
[1] Univ Negeri Malang, Dept Elect Engn, Malang, Indonesia
[2] Univ Negeri Malang, Dept Guidance & Counseling, Malang, Indonesia
[3] Visionet Data Int, Software Engn, Malang, Indonesia
[4] Univ Negeri Malang, Elect Engn, Malang, Indonesia
Source:
2019 INTERNATIONAL CONFERENCE ON ELECTRICAL, ELECTRONICS AND INFORMATION ENGINEERING (ICEEIE) | 2019
Keywords:
face detection;
convolution neural network;
ELU;
integral;
IMAGE;
DOI:
10.1109/iceeie47180.2019.8981443
CLC Classification:
TM [Electrical Engineering];
TN [Electronic Technology, Communication Technology];
Discipline Codes:
0808;
0809;
Abstract:
Numerous variations of neural networks have improved face detection performance significantly. However, extravagant computation becomes a major problem when the sliding window explores non-face candidates. To prevent useless detection in non-face areas, our proposed method adds a prevention function using an integral representation in the RELU (Rectified Linear Unit) activation function. If the integral of the RELU does not reach the threshold, the convolution is skipped and the window shifts to the neighboring area. Both functions are selected because they are easy to calculate and converge rapidly. Based on the results of trials on 10 data samples, 'CNN + Integral RELU' is faster than 'CNN + RELU' with a speed ratio of 125:285 FPS. Besides, 'CNN + Integral RELU' has fewer face detection redundancies than the Viola-Jones algorithm. These results show that our proposed method is superior to the state-of-the-art algorithms.
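The gating idea in the abstract — skip the convolution whenever the integral of the RELU response over the current window falls below a threshold — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, stride, threshold value, and the approximation of the "integral" as a sum of rectified pixel values are all assumptions.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(x, 0), applied element-wise."""
    return np.maximum(x, 0.0)

def integral_relu_gate(window, threshold):
    """Prevention gate: approximate the integral of the RELU response
    as the sum of rectified values over the window. Windows whose
    integral is below `threshold` are treated as non-face candidates
    and the costly convolution is skipped."""
    return relu(window).sum() >= threshold

def sliding_detection(image, win=8, stride=8, threshold=50.0):
    """Slide a window across `image`; return positions that pass the
    gate (where the full CNN stage would then be run)."""
    passed = []
    h, w = image.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            window = image[y:y + win, x:x + win]
            if integral_relu_gate(window, threshold):
                passed.append((y, x))  # full CNN would run here
            # otherwise: skip convolution, shift to the next window
    return passed
```

Because the gate is a single sum and comparison per window, its cost is negligible next to a convolution, which is the source of the reported speed-up over an ungated CNN + RELU pipeline.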
Pages: 260-263
Number of pages: 4