Discriminative Metric Learning with Deep Forest

Cited by: 11
Authors
Utkin, Lev V. [1]
Ryabinin, Mikhail A. [1]
Affiliation
[1] Peter the Great St. Petersburg Polytechnic University (SPbPU), St. Petersburg, Russia
Funding
Russian Science Foundation
Keywords
Classification; random forest; decision tree; deep learning; metric learning; quadratic programming;
DOI
10.1142/S0218213019500076
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A Discriminative Deep Forest (DisDF), a metric learning algorithm, is proposed in the paper. It is based on the Deep Forest (gcForest) proposed by Zhou and Feng and can be viewed as a modification of gcForest. The fully supervised case is studied, in which the class labels of the individual training examples are known. The main idea underlying the algorithm is to assign weights to the decision trees in a random forest so as to reduce distances between objects from the same class and increase them between objects from different classes. The weights are the training parameters. A specific objective function that combines Euclidean and Manhattan distances and simplifies the optimization problem for training the DisDF is proposed. Numerical experiments illustrate the proposed distance metric algorithm.
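The core idea of the abstract, a forest distance formed as a weighted sum of per-tree distances, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the leaf indices, the 0/1 per-tree distance, and the weight vectors are hypothetical, and the paper learns the weights by solving a quadratic program rather than fixing them by hand.

```python
def tree_distance(leaf_a, leaf_b):
    """Per-tree distance: 0 if both objects fall into the same leaf, else 1.
    (A simple proxy; the paper's construction is more elaborate.)"""
    return 0.0 if leaf_a == leaf_b else 1.0

def weighted_forest_distance(weights, leaves_a, leaves_b):
    """Forest distance = weighted sum of per-tree distances.
    Training would choose the weights to shrink same-class distances
    and enlarge different-class distances."""
    return sum(w * tree_distance(la, lb)
               for w, la, lb in zip(weights, leaves_a, leaves_b))

# Leaf indices of two objects across a hypothetical 4-tree forest:
x_leaves = [3, 1, 7, 2]
y_leaves = [3, 5, 7, 0]

uniform = [0.25] * 4             # untrained: every tree counts equally
trained = [0.1, 0.4, 0.1, 0.4]   # weights emphasizing more discriminative trees

print(weighted_forest_distance(uniform, x_leaves, y_leaves))  # 0.5
print(weighted_forest_distance(trained, x_leaves, y_leaves))  # 0.8
```

If x and y belong to different classes, training would push weight toward the trees that separate them (indices 1 and 3 here), which is why the trained weights yield the larger distance.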
Pages: 19