Hierarchical disjoint principal component analysis

Cited by: 2
Authors
Cavicchia, Carlo [1 ]
Vichi, Maurizio [2 ]
Zaccaria, Giorgia [2 ]
Institutions
[1] Erasmus Univ, Econometr Inst, Rotterdam, Netherlands
[2] Univ Roma La Sapienza, Dept Stat Sci, Rome, Italy
Keywords
Dimension reduction; Hierarchical models; Parsimonious trees; Reflective models; Formative models; HIGHER-ORDER FACTORS; STATISTICAL VARIABLES; GENERAL INTELLIGENCE; CAUSAL INDICATORS; PERSONALITY; MODEL; COMPOSITE; COMPLEX; PCA;
DOI
10.1007/s10182-022-00458-4
Chinese Library Classification (CLC)
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Dimension reduction by means of Principal Component Analysis (PCA) is often employed to obtain a reduced set of components that preserves the largest possible part of the total variance of the observed variables. Several methodologies have been proposed either to improve the interpretation of PCA results (e.g., via orthogonal or oblique rotations, or shrinkage methods) or to model oblique components or factors with a hierarchical structure, as in Bi-factor and Higher-Order Factor analyses. In this paper, we propose a new methodology, called Hierarchical Disjoint Principal Component Analysis (HierDPCA), which builds a hierarchy of disjoint principal components of maximum variance associated with disjoint groups of observed variables, from Q components up to a unique, general one. HierDPCA also allows the type of relationship between the disjoint principal components of two sequential levels to be chosen, from the lowest level upwards, by testing the component correlation at each level and switching from a reflective to a formative approach when this correlation is not statistically significant. The methodology is formulated in a semi-parametric least-squares framework, and a coordinate descent algorithm is proposed to estimate the model parameters. A simulation study and two real applications illustrate the empirical properties of the proposed methodology.
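The core idea of a hierarchy of disjoint components can be sketched in a few lines of NumPy: extract one principal component per disjoint group of variables, then summarize those group components with a single general component one level up. This is only an illustrative two-level sketch; the variable partition and the eigen-decomposition shortcut are assumptions for demonstration, not the authors' semi-parametric least-squares estimation or coordinate descent algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 100 observations on 6 standardized variables, assumed to
# split into Q = 2 disjoint groups (a hypothetical partition).
X = rng.standard_normal((100, 6))
X = (X - X.mean(axis=0)) / X.std(axis=0)
groups = [[0, 1, 2], [3, 4, 5]]

# Level 1: one principal component per disjoint group, taken as the
# leading eigenvector of that group's covariance submatrix.
scores = []
for g in groups:
    cov = np.cov(X[:, g], rowvar=False)
    _, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    scores.append(X[:, g] @ vecs[:, -1])   # score on the leading eigenvector
C = np.column_stack(scores)                # (100, 2) group-component scores

# Level 2: a single general component summarizing the Q group components,
# mimicking the top of the hierarchy in the reflective case.
_, vecs2 = np.linalg.eigh(np.cov(C, rowvar=False))
general = C @ vecs2[:, -1]                 # (100,) general-component scores
```

In the paper's framework, the correlation between the level-1 components would additionally be tested to decide whether the level-2 component should be modeled reflectively or formatively; the sketch above simply takes the reflective route.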
Pages: 537-574
Page count: 38
Related papers
50 records in total
  • [1] Hierarchical disjoint principal component analysis
    Cavicchia, Carlo
    Vichi, Maurizio
    Zaccaria, Giorgia
    AStA Advances in Statistical Analysis, 2023, 107 : 537 - 574
  • [2] Shedding new light on Hierarchical Principal Component Analysis
    Hanafi, Mohamed
    Kohler, Achim
    Qannari, El Mostafa
    JOURNAL OF CHEMOMETRICS, 2010, 24 (11-12) : 703 - 709
  • [3] Dimension reduction in principal component analysis for trees
    Alfaro, Carlos A.
    Aydin, Burcu
    Valencia, Carlos E.
    Bullitt, Elizabeth
    Ladha, Alim
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 74 : 157 - 179
  • [4] Efficient fair principal component analysis
    Kamani, Mohammad Mahdi
    Haddadpour, Farzin
    Forsati, Rana
    Mahdavi, Mehrdad
    Machine Learning, 2022, 111 : 3671 - 3702
  • [5] Sparse Generalised Principal Component Analysis
    Smallman, Luke
    Artemiou, Andreas
    Morgan, Jennifer
    PATTERN RECOGNITION, 2018, 83 : 443 - 455
  • [6] Bilinear Probabilistic Principal Component Analysis
    Zhao, Jianhua
    Yu, Philip L. H.
    Kwok, James T.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2012, 23 (03) : 492 - 503
  • [7] Efficient fair principal component analysis
    Kamani, Mohammad Mahdi
    Haddadpour, Farzin
    Forsati, Rana
    Mahdavi, Mehrdad
    MACHINE LEARNING, 2022, 111 (10) : 3671 - 3702
  • [8] Hierarchical sparse functional principal component analysis for multistage multivariate profile data
    Wang, Kai
    Tsung, Fugee
    IISE TRANSACTIONS, 2021, 53 (01) : 58 - 73
  • [9] Principal component analysis
    Bro, Rasmus
    Smilde, Age K.
    ANALYTICAL METHODS, 2014, 6 (09) : 2812 - 2831
  • [10] Segmented principal component transform-principal component analysis
    Barros, AS
    Rutledge, DN
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2005, 78 (1-2) : 125 - 137