Non-negative Matrix Factorization under Heavy Noise

Cited by: 0
Authors
Bhattacharya, Chiranjib [1 ]
Goyal, Navin [2 ]
Kannan, Ravindran [2 ]
Pani, Jagdeep [1 ]
Affiliations
[1] Indian Inst Sci, Dept Comp Sci & Automat, Bangalore, Karnataka, India
[2] Microsoft Res India, Bangalore, Karnataka, India
Keywords
ALGORITHM;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Noisy Non-negative Matrix Factorization (NMF) is the following problem: given a data matrix A (d × n), find non-negative matrices B (d × k) and C (k × n) such that A = BC + N, where N is a noise matrix. Existing polynomial-time algorithms with proven error guarantees require each column N_{·,j} to have ℓ_1 norm much smaller than ||(BC)_{·,j}||_1, which can be very restrictive. In important applications of NMF such as topic modeling, as well as under theoretical noise models (e.g., Gaussian noise with large sigma), almost every column of N violates this condition. We introduce the heavy noise model, which only requires the average noise over large subsets of columns to be small, and we initiate a study of noisy NMF under this model. We show that our noise model subsumes noise models of theoretical and practical interest (e.g., Gaussian noise of the maximum possible sigma). We then devise an algorithm, TSVDNMF, which, under certain assumptions on B and C, solves the problem under heavy noise. Our error guarantees match those of previous algorithms, and our running time of O((n + d)^2 k) is substantially better than the O(n^3 d) of the previous best. Our assumption on B is weaker than the "separability" assumption made by all previous results, and we provide empirical justification for our assumptions on C. We also give the first proof of identifiability (uniqueness of B) for noisy NMF that is not based on separability and does not rely on hard-to-check geometric conditions. Our algorithm outperforms earlier polynomial-time algorithms in both time and error, particularly in the presence of high noise.
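To make the abstract's central claim concrete, the sketch below (with made-up dimensions, noise level, and parameter choices; it is not the authors' construction) generates A = BC + N with zero-mean Gaussian noise heavy enough that the per-column ℓ_1 condition fails while the noise averaged over a large subset of columns stays small. It then runs a generic truncated-SVD denoising step followed by an off-the-shelf NMF fit as a stand-in pipeline, not a reproduction of TSVDNMF, using scikit-learn's TruncatedSVD and NMF.

```python
# Illustrative sketch only: dimensions, noise scale, and the two-stage
# recovery pipeline below are assumptions, not the paper's TSVDNMF.
import numpy as np
from sklearn.decomposition import NMF, TruncatedSVD

rng = np.random.default_rng(0)
d, n, k = 200, 1000, 10

# Ground-truth non-negative factors B (d x k) and C (k x n).
B = rng.random((d, k))
C = rng.dirichlet(np.ones(k), size=n).T           # each column of C sums to 1
signal = B @ C

# Heavy noise: zero-mean Gaussian whose per-column l1 norm is comparable to
# the signal's, so the classical per-column condition fails.
N = rng.normal(scale=0.5, size=(d, n))
A = signal + N

col_noise = np.abs(N).sum(axis=0)
col_signal = np.abs(signal).sum(axis=0)
print("median per-column ||N||_1 / ||BC||_1 :",
      np.median(col_noise / col_signal))          # typically not << 1

# Noise averaged over a large random subset of columns is much smaller,
# because independent zero-mean noise cancels under averaging.
subset = rng.choice(n, size=n // 4, replace=False)
avg_noise = np.abs(N[:, subset].mean(axis=1)).sum()
avg_signal = np.abs(signal[:, subset].mean(axis=1)).sum()
print("subset-averaged ||N||_1 / ||BC||_1   :", avg_noise / avg_signal)

# Generic two-stage recovery: rank-k truncated SVD to denoise, then NMF on
# the clipped low-rank approximation (NMF requires non-negative input).
svd = TruncatedSVD(n_components=k, random_state=0)
A_lowrank = np.clip(svd.inverse_transform(svd.fit_transform(A)), 0.0, None)
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(A_lowrank)                # estimate of B (up to scaling)
H = model.components_                             # estimate of C
print("relative reconstruction error        :",
      np.linalg.norm(W @ H - signal) / np.linalg.norm(signal))
```

With independent zero-mean noise, averaging m columns shrinks the noise by roughly a factor of sqrt(m), which is why the subset-averaged ratio printed above is small even though the per-column ratio is not.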
Pages: 9