Sparse functional linear models via calibrated concave-convex procedure

Cited: 0
Authors
Lee, Young Joo [1]
Jeon, Yongho [1]
Affiliations
[1] Yonsei Univ, Dept Appl Stat, 50 Yonsei Ro, Seoul 03722, South Korea
Funding
National Research Foundation of Singapore
Keywords
Functional regression; Variable selection; High-dimensional regression; CCCP-SCAD; Gene expression data; VARIABLE SELECTION; CELL-CYCLE; GENE-EXPRESSION; ADAPTIVE LASSO; REGRESSION; GCR2
DOI
10.1007/s42952-023-00242-3
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
In this paper, we propose a calibrated ConCave-Convex Procedure (CCCP) for variable selection in high-dimensional functional linear models. The calibrated CCCP approach for the Smoothly Clipped Absolute Deviation (SCAD) penalty is known to produce a consistent solution path with probability converging to one in linear models. We incorporate the SCAD penalty into function-on-scalar regression models and, via a basis expansion, formulate the estimation as a group-penalized problem. We then apply the calibrated CCCP method to solve this nonconvex group-penalized problem. For tuning, we use the Extended Bayesian Information Criterion (EBIC) to ensure selection consistency in high-dimensional settings. In simulation studies, we compare the proposed method with two existing convex-penalized estimators in terms of variable selection consistency and prediction accuracy. Lastly, we apply the method to a yeast gene expression dataset to sparsely estimate the time-varying effects of transcription factors on the regulation of cell-cycle genes.
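The abstract's two building blocks — the SCAD penalty and the CCCP step that linearizes its concave part so each iteration reduces to a weighted L1 (convex) problem — can be sketched as below. This is an illustrative sketch based on the standard SCAD definition (Fan & Li, 2001), not the authors' code; in the paper the penalty is applied to group norms of basis-expansion coefficients rather than to scalar coefficients as shown here, and the function names are my own.

```python
import numpy as np


def scad_penalty(t, lam, a=3.7):
    """SCAD penalty (Fan & Li, 2001), elementwise.

    Linear (lasso-like) near zero, quadratic in between,
    constant beyond a*lam, so large effects are not shrunk.
    """
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,
        np.where(
            t <= a * lam,
            (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
            lam**2 * (a + 1) / 2,
        ),
    )


def scad_deriv(t, lam, a=3.7):
    """Derivative of the SCAD penalty for t >= 0 (zero beyond a*lam)."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))


def cccp_weights(beta_old, lam, a=3.7):
    """One CCCP linearization step (hypothetical helper).

    The concave part of SCAD is replaced by its tangent at the current
    iterate, so each coefficient receives an L1 weight scad_deriv(|beta_old|);
    coefficients already larger than a*lam become unpenalized, which is the
    source of SCAD's reduced bias relative to the lasso.
    """
    return scad_deriv(np.abs(beta_old), lam, a)
```

In the group-penalized setting of the paper, the same weights would be computed from the norm of each predictor's coefficient group, turning every CCCP iteration into a weighted group-lasso problem.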
Pages: 189-207
Page count: 19
Related papers
16 records in total
  • [1] Sparse functional linear models via calibrated concave-convex procedure
    Young Joo Lee
    Yongho Jeon
    Journal of the Korean Statistical Society, 2024, 53 : 189 - 207
  • [2] Sparse graphical models via calibrated concave convex procedure with application to fMRI data
    Son, Sungtaek
    Park, Cheolwoo
    Jeon, Yongho
    JOURNAL OF APPLIED STATISTICS, 2020, 47 (06) : 997 - 1016
  • [3] ConCave-Convex procedure for support vector machines with Huber loss for text classification
    Borah, Parashjyoti
    Gupta, Deepak
    Hazarika, Barenya Bikash
    COMPUTERS & ELECTRICAL ENGINEERING, 2025, 122
  • [4] Variable selection in multivariate linear models for functional data via sparse regularization
    Matsui, Hidetoshi
    Umezu, Yuta
    JAPANESE JOURNAL OF STATISTICS AND DATA SCIENCE, 2020, 3 (02) : 453 - 467
  • [6] NON-CONVEX SPARSE DEVIATION MODELING VIA GENERATIVE MODELS
    Yang, Yaxi
    Wang, Hailin
    Qiu, Haiquan
    Wang, Jianjun
    Wang, Yao
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 2345 - 2349
  • [7] Functional linear regression for functional response via sparse basis selection
    Han, Kyunghee
    Shin, Hyejin
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2015, 44 (03) : 376 - 389
  • [8] SPARSE ESTIMATION OF GENERALIZED LINEAR MODELS (GLM) VIA APPROXIMATED INFORMATION CRITERIA
    Su, Xiaogang
    Fan, Juanjuan
    Levine, Richard A.
    Nunn, Martha E.
    Tsai, Chih-Ling
    STATISTICA SINICA, 2018, 28 (03) : 1561 - 1581
  • [9] Least-Squares Linear Dilation-Erosion Regressor Trained Using a Convex-Concave Procedure
    Oliveira, Angelica Lourenco
    Valle, Marcos Eduardo
    INTELLIGENT SYSTEMS, PT II, 2022, 13654 : 16 - 29
  • [10] The Contextual Lasso: Sparse Linear Models via Deep Neural Networks
    Thompson, Ryan
    Dezfouli, Amir
    Kohn, Robert
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023