Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests

Cited by: 5
Authors
Loyal, Joshua Daniel [1 ]
Zhu, Ruoqing [1 ]
Cui, Yifan [2 ]
Zhang, Xin [3 ]
Affiliations
[1] Univ Illinois, Dept Stat, Champaign, IL 61820 USA
[2] Natl Univ Singapore, Dept Stat & Data Sci, Singapore, Singapore
[3] Florida State Univ, Dept Stat, Tallahassee, FL 32306 USA
Keywords
Random forests; Sufficient dimension reduction; Variable importance; Sliced inverse regression
DOI
10.1080/10618600.2022.2069777
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Random forests are one of the most popular machine learning methods due to their accuracy and variable importance assessment. However, random forests only provide variable importance in a global sense. There is an increasing need for such assessments at a local level, motivated by applications in personalized medicine, policy-making, and bioinformatics. We propose a new nonparametric estimator that pairs the flexible random forest kernel with local sufficient dimension reduction to adapt to a regression function's local structure. This allows us to estimate a meaningful directional local variable importance measure at each prediction point. We develop a computationally efficient fitting procedure and provide sufficient conditions for the recovery of the splitting directions. We demonstrate significant accuracy gains of our proposed estimator over competing methods on simulated and real regression problems. Finally, we apply the proposed method to seasonal particulate matter concentration data collected in Beijing, China, which yields meaningful local importance measures. The methods presented here are available in the drforest Python package. Supplementary materials for this article are available online.
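As background for the "sliced inverse regression" keyword above, the sketch below implements classical SIR (Li, 1991) in plain NumPy to estimate the directions through which a response depends on the predictors. This is a minimal illustration of the dimension-reduction idea only; it is not the paper's drforest implementation, and the function name and defaults here are assumptions.

```python
import numpy as np


def sir_directions(X, y, n_slices=10, n_directions=1):
    """Estimate dimension-reduction directions via sliced inverse regression.

    Whitens X, averages the whitened predictors within slices of sorted y,
    and eigendecomposes the weighted covariance of those slice means.
    """
    n, p = X.shape
    # Center and whiten the predictors.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice observations on sorted y and average Z within each slice.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale.
    _, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_directions]
    return B / np.linalg.norm(B, axis=0)


# Toy check: y depends on X only through the direction (1, 1, 0, 0, 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
y = X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=2000)
b = sir_directions(X, y, n_slices=20).ravel()
```

On this toy example the estimated direction `b` aligns (up to sign) with the true direction (1, 1, 0, 0, 0) normalized to unit length; the paper's contribution is to localize this kind of directional information at each prediction point via the random forest kernel.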
Pages: 1104-1113
Page count: 10