Simple and Efficient Parallelization for Probabilistic Temporal Tensor Factorization

Times Cited: 0
Authors
Li, Guangxi [1 ]
Xu, Zenglin [1 ]
Wang, Linnan [1 ]
Ye, Jinmian [1 ]
King, Irwin [2 ]
Lyu, Michael [2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Big Data Res Ctr, Chengdu, Sichuan, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Shatin, Hong Kong, Peoples R China
Source
2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2017
Funding
China Postdoctoral Science Foundation; National High Technology Research and Development Program of China (863 Program);
Keywords
ALGORITHMS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Probabilistic Temporal Tensor Factorization (PTTF) is an effective algorithm for modeling temporal tensor data. It leverages a time constraint to capture the evolving properties of tensor data. Today's exploding datasets demand large-scale PTTF analysis, and a parallel solution is critical to accommodate this trend; however, the parallelization of PTTF remains unexplored. In this paper, we propose a simple yet efficient Parallel Probabilistic Temporal Tensor Factorization, referred to as P2T2F, to provide a scalable PTTF solution. P2T2F differs fundamentally from existing parallel tensor factorizations in that it accounts for both the probabilistic decomposition and the temporal effects of tensor data. It adopts a new tensor data split strategy that subdivides a large tensor into independent sub-tensors whose computation is inherently parallel. We train P2T2F with an efficient stochastic Alternating Direction Method of Multipliers algorithm and show that convergence is guaranteed. Experiments on several real-world tensor datasets demonstrate that P2T2F is a highly effective and efficiently scalable algorithm dedicated to large-scale probabilistic temporal tensor analysis.
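As an illustration of the split strategy described in the abstract, the following minimal Python sketch (not the authors' P2T2F code) partitions the observed entries of a (user, item, time) tensor into blocks with disjoint user and item ranges, so that blocks processed in the same round touch non-overlapping rows of the user and item factor matrices and can be updated in parallel. The plain CP model and SGD updates stand in for PTTF's probabilistic model and the stochastic ADMM solver; all names (e.g. num_blocks, sgd_on_block) are hypothetical assumptions, and the temporal factor T is held fixed during the parallel step.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)
n_users, n_items, n_times, rank, num_blocks = 40, 40, 10, 5, 4

# Synthetic observed entries (user i, item j, time k, value x).
obs = [(int(rng.integers(n_users)), int(rng.integers(n_items)),
        int(rng.integers(n_times)), float(rng.standard_normal()))
       for _ in range(2000)]

# CP factors: prediction for (i, j, k) is sum_d U[i,d] * V[j,d] * T[k,d].
U = 0.1 * rng.standard_normal((n_users, rank))
V = 0.1 * rng.standard_normal((n_items, rank))
T = 0.1 * rng.standard_normal((n_times, rank))

def block_id(idx, n, b):
    # Map an index in [0, n) to one of b contiguous ranges.
    return min(idx * b // n, b - 1)

def sgd_on_block(entries, lr=0.02, lam=0.05):
    # Update the U and V rows touched by one sub-tensor; blocks processed in the
    # same round cover disjoint user and item ranges, so these writes never clash.
    # T is read-only here (it would be refreshed in a separate, serial step).
    for i, j, k, x in entries:
        err = x - float(U[i] @ (V[j] * T[k]))
        gu = err * V[j] * T[k] - lam * U[i]
        gv = err * U[i] * T[k] - lam * V[j]
        U[i] += lr * gu
        V[j] += lr * gv
    return len(entries)

for epoch in range(3):
    for shift in range(num_blocks):
        # One round: keep only the "diagonal" blocks, which are mutually independent.
        blocks = [[] for _ in range(num_blocks)]
        for i, j, k, x in obs:
            bu = block_id(i, n_users, num_blocks)
            bv = block_id(j, n_items, num_blocks)
            if (bu + shift) % num_blocks == bv:
                blocks[bu].append((i, j, k, x))
        with ThreadPoolExecutor(max_workers=num_blocks) as pool:
            list(pool.map(sgd_on_block, blocks))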
Pages: 1-8
Page count: 8