Fast, memory-efficient low-rank approximation of SimRank

Cited by: 3
Authors
Oseledets I.V. [1 ,2 ]
Ovchinnikov G.V. [1 ,3 ,4 ]
Katrutsa A.M. [1 ,5 ]
Affiliations
[1] Skolkovo Institute of Science and Technology, Novaya St., 100, Skolkovo
[2] Institute of Numerical Mathematics, Russian Academy of Sciences, Gubkina Street, 8, Moscow
[3] Institute for Design Problems in Microelectronics, Russian Academy of Sciences, Prosp. 60-letiya Oktyabrya, 9, Moscow
[4] Innopolis University, 40-42 Profsoyuznaya str, Kazan
[5] Moscow Institute of Physics and Technology, Institutskiy Lane 9, Dolgoprudny, Moscow
Funding
Russian Foundation for Basic Research; Russian Science Foundation
Keywords
Link Analysis; Lyapunov equation; Probabilistic SVD; Similarity Search; SimRank; SVD
DOI
10.1093/comnet/cnw008
Abstract
SimRank is a well-known similarity measure between graph vertices. The measure relies on the graph topology and is built on the intuition that 'two objects are similar if they are related to similar objects'. Formalizing this intuition leads to a system of linear equations so large that, for many real-world graphs, even the solution vector will not fit in RAM. To address this issue, we propose a new method based on a low-rank approximation of the linear system's solution. This approach requires less memory and provides a significant speed-up in computation. Experiments show that the proposed method yields an accurate SimRank approximation. © The authors 2016. Published by Oxford University Press. All rights reserved.
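For context on the memory problem the abstract describes, the sketch below implements the standard SimRank fixed-point iteration (after Jeh & Widom), not the paper's low-rank algorithm. It materializes the full n × n similarity matrix S, which is exactly the O(n²) cost the proposed low-rank approximation avoids. The toy graph and the decay constant c = 0.6 are illustrative assumptions.

```python
import numpy as np

def simrank(adj, c=0.6, iters=20):
    """Plain SimRank fixed-point iteration (after Jeh & Widom).

    Materializes the dense n x n similarity matrix S -- the O(n^2)
    memory footprint that a low-rank representation would avoid.
    """
    n = adj.shape[0]
    indeg = adj.sum(axis=0)
    # Column-normalize the adjacency matrix: W[i, j] = adj[i, j] / indeg(j)
    W = adj / np.where(indeg == 0, 1.0, indeg)
    S = np.eye(n)
    for _ in range(iters):
        S = c * (W.T @ S @ W)
        np.fill_diagonal(S, 1.0)  # s(v, v) = 1 by definition
    return S

# Toy directed graph (illustrative): 0 -> 2, 1 -> 2, 2 -> 0, 2 -> 1.
# Vertices 0 and 1 share the single in-neighbour 2, so s(0, 1) = c.
A = np.zeros((3, 3))
A[0, 2] = A[1, 2] = 1.0
A[2, 0] = A[2, 1] = 1.0
S = simrank(A)
print(S[0, 1])  # 0.6 for c = 0.6
```

Storing S densely for a graph with millions of vertices is infeasible, which motivates keeping S in factored low-rank form as the paper proposes.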
Pages: 111-126 (15 pages)
Related Papers
(50 records)
  • [1] LOW-RANK GRADIENT APPROXIMATION FOR MEMORY-EFFICIENT ON-DEVICE TRAINING OF DEEP NEURAL NETWORK
    Gooneratne, Mary
    Sim, Khe Chai
    Zadrazil, Petr
    Kabel, Andreas
    Beaufays, Francoise
    Motta, Giovanni
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3017 - 3021
  • [2] Fast and Memory Optimal Low-Rank Matrix Approximation
    Yun, Se-Young
    Lelarge, Marc
    Proutiere, Alexandre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [3] A Fast and Efficient Algorithm for Low-rank Approximation of a Matrix
    Nguyen, Nam H.
    Do, Thong T.
    Tran, Trac D.
    STOC'09: PROCEEDINGS OF THE 2009 ACM SYMPOSIUM ON THEORY OF COMPUTING, 2009, : 215 - 224
  • [4] Low-Rank Gradient Descent for Memory-Efficient Training of Deep In-Memory Arrays
    Huang, Siyuan
    Hoskins, Brian D.
    Daniels, Matthew W.
    Stiles, Mark D.
    Adam, Gina C.
    ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2023, 19 (02)
  • [5] TT-TSDF: Memory-Efficient TSDF with Low-Rank Tensor Train Decomposition
    Boyko, Alexey I.
    Matrosov, Mikhail P.
    Oseledets, Ivan V.
    Tsetserukou, Dzmitry
    Ferrer, Gonzalo
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 10116 - 10121
  • [6] Low-rank approximation for fast image acquisition
    Popescu, Dan C.
    Hislop, Greg
    Hellicar, Andrew
    ADVANCED CONCEPTS FOR INTELLIGENT VISION SYSTEMS, PROCEEDINGS, 2007, 4678 : 242 - 253
  • [7] Fast low-rank approximation for covariance matrices
    Belabbas, Mohamed-Ali
    Wolfe, Patrick J.
    2007 2ND IEEE INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING, 2007, : 181 - 184
  • [8] Adaptive sampling and fast low-rank matrix approximation
    Deshpande, Amit
    Vempala, Santosh
    APPROXIMATION, RANDOMIZATION AND COMBINATORIAL OPTIMIZATION: ALGORITHMS AND TECHNIQUES, 2006, 4110 : 292 - 303
  • [9] Fast computation of convolution operations via low-rank approximation
    Hearn, Tristan A.
    Reichel, Lothar
    APPLIED NUMERICAL MATHEMATICS, 2014, 75 : 136 - 153
  • [10] Facto-CNN: Memory-Efficient CNN Training with Low-rank Tensor Factorization and Lossy Tensor Compression
    Lee, Seungtae
    Ko, Jonghwan
    Hong, Seokin
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 222, 2023, 222