Greedy low-rank algorithm for spatial connectome regression

Cited by: 5
Authors
Kuerschner, Patrick [1 ]
Dolgov, Sergey [2 ]
Harris, Kameron Decker [3 ]
Benner, Peter [4 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT STADIUS, Leuven, Belgium
[2] Univ Bath, Dept Math Sci, Bath, Avon, England
[3] Univ Washington, Paul G Allen Sch Comp Sci & Engn, Dept Biol, Seattle, WA 98195 USA
[4] Max Planck Inst Dynam Complex Tech Syst, Computat Methods Syst & Control Theory, Magdeburg, Germany
Source
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Matrix equations; Computational neuroscience; Low-rank approximation; Networks; KRYLOV SUBSPACE METHODS; MATRIX; OPTIMIZATION;
DOI
10.1186/s13408-019-0077-0
Chinese Library Classification
Q [Biological Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
Recovering brain connectivity from tract tracing data is an important computational problem in the neurosciences. Mesoscopic connectome reconstruction was previously formulated as a structured matrix regression problem (Harris et al. in Neural Information Processing Systems, 2016), but existing techniques do not scale to the whole-brain setting. The corresponding matrix equation is challenging to solve due to its large scale, ill-conditioning, and a general form that lacks a convergent splitting. We propose a greedy low-rank algorithm for the connectome reconstruction problem in very high dimensions. The algorithm approximates the solution by a sequence of rank-one updates which exploit the sparse and positive definite problem structure. This algorithm was described previously (Kressner and Sirkovic in Numer Lin Alg Appl 22(3):564-583, 2015) but had not been applied to the connectome problem, which poses a number of practical challenges: we design judicious stopping criteria and employ efficient solvers for the three main sub-problems of the algorithm, including an efficient GPU implementation that alleviates the main bottleneck for large datasets. The performance of the method is evaluated on three examples: an artificial "toy" dataset and two whole-cortex instances using data from the Allen Mouse Brain Connectivity Atlas. We find that the method is significantly faster than previous methods and that moderate ranks offer a good approximation. This speedup allows for the estimation of increasingly large-scale connectomes across taxa as such data become available from tracing experiments. The data and code are available online.
Pages: 22
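The greedy rank-one strategy summarized in the abstract can be illustrated on a simplified model problem. The following minimal sketch (not the authors' released code) applies the energy-minimization alternating least-squares idea of Kressner and Sirkovic (2015) to a symmetric positive definite Sylvester equation A X + X B = C, which stands in for the full connectome operator; the function name greedy_rank1_sylvester and the parameters max_rank, als_iters, and tol are illustrative choices, and the observation masks and Laplacian regularization of the actual regression problem are omitted here.

import numpy as np

def greedy_rank1_sylvester(A, B, C, max_rank=20, als_iters=10, tol=1e-8):
    """Approximate the solution X of A X + X B = C by a sum of rank-one terms.

    A (n x n) and B (m x m) are assumed symmetric positive definite, so the
    equation defines an SPD linear operator and each rank-one update can be
    computed by minimizing the induced energy functional with a few
    alternating least-squares (ALS) sweeps.
    """
    n, m = C.shape
    U = np.zeros((n, 0))
    V = np.zeros((m, 0))
    R = C.copy()                          # residual C - (A X + X B) for X = U V^T
    nrm_C = np.linalg.norm(C)
    for _ in range(max_rank):
        if np.linalg.norm(R) <= tol * nrm_C:
            break                         # stopping criterion on the relative residual
        # initialize ALS from the dominant singular pair of the residual
        u_svd, _, vt_svd = np.linalg.svd(R, full_matrices=False)
        u, v = u_svd[:, 0], vt_svd[0, :]
        for _ in range(als_iters):
            # fix v, minimize over u:  ((v.v) A + (v^T B v) I) u = R v
            u = np.linalg.solve((v @ v) * A + (v @ B @ v) * np.eye(n), R @ v)
            # fix u, minimize over v:  ((u.u) B + (u^T A u) I) v = R^T u
            v = np.linalg.solve((u @ u) * B + (u @ A @ u) * np.eye(m), R.T @ u)
        U = np.hstack([U, u[:, None]])
        V = np.hstack([V, v[:, None]])
        R -= A @ np.outer(u, v) + np.outer(u, v) @ B   # update residual
    return U, V                           # X is approximated by U @ V.T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 60, 40
    A = np.diag(np.linspace(1.0, 5.0, n))              # SPD test matrices
    B = np.diag(np.linspace(0.5, 2.0, m))
    X_true = rng.standard_normal((n, 5)) @ rng.standard_normal((5, m))
    C = A @ X_true + X_true @ B
    U, V = greedy_rank1_sylvester(A, B, C, max_rank=15)
    print("relative error:", np.linalg.norm(U @ V.T - X_true) / np.linalg.norm(X_true))

Each greedy step requires only two small shifted linear solves and a residual update, which is why moderate ranks can be computed quickly even for large n and m; in the paper, the analogous sub-problems are handled with efficient solvers, including a GPU implementation for the main bottleneck.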