Fast Discrete Distribution Clustering Using Wasserstein Barycenter With Sparse Support

Cited by: 74
Authors
Ye, Jianbo [1 ]
Wu, Panruo [2 ]
Wang, James Z. [1 ]
Li, Jia [3 ]
Affiliations
[1] Penn State Univ, Coll Informat Sci & Technol, University Pk, PA 16802 USA
[2] Univ Calif Riverside, Dept Comp Sci & Engn, Riverside, CA 92521 USA
[3] Penn State Univ, Dept Stat, University Pk, PA 16802 USA
Funding
U.S. National Science Foundation
Keywords
Discrete distribution; K-means; clustering; large-scale learning; parallel computing; ADMM; EARTH MOVERS DISTANCE;
DOI
10.1109/TSP.2017.2659647
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
In a variety of research areas, the weighted bag of vectors and the histogram are widely used descriptors for complex objects. Both can be expressed as discrete distributions. D2-clustering pursues the minimum total within-cluster variation for a set of discrete distributions subject to the Kantorovich-Wasserstein metric. D2-clustering has a severe scalability issue, the bottleneck being the computation of a centroid distribution, called Wasserstein barycenter, that minimizes its sum of squared distances to the cluster members. In this paper, we develop a modified Bregman ADMM approach for computing the approximate discrete Wasserstein barycenter of large clusters. In the case when the support points of the barycenters are unknown and have low cardinality, our method achieves high accuracy empirically at a much reduced computational cost. The strengths and weaknesses of our method and its alternatives are examined through experiments, and we recommend scenarios for their respective usage. Moreover, we develop both serial and parallelized versions of the algorithm. By experimenting with large-scale data, we demonstrate the computational efficiency of the new methods and investigate their convergence properties and numerical stability. The clustering results obtained on several datasets in different domains are highly competitive in comparison with some widely used methods in the corresponding areas.
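The barycenter notion in the abstract can be illustrated in its simplest setting. For one-dimensional discrete distributions, the 2-Wasserstein barycenter is obtained by averaging quantile functions, which for equally weighted point sets of equal size reduces to averaging sorted support points. The sketch below is a pure-Python toy of this one-dimensional special case only; it is not the paper's Bregman-ADMM algorithm for general discrete distributions with sparse support, and the function names are illustrative.

```python
# Toy 1D special case of the Wasserstein barycenter (NOT the paper's
# Bregman-ADMM method). In 1D, the W2 barycenter's quantile function is
# the average of the inputs' quantile functions; for equally weighted
# samples of equal size, this means averaging sorted support points.

def barycenter_1d(samples):
    """W2 barycenter of equally weighted 1D point sets of equal size."""
    sorted_samples = [sorted(s) for s in samples]
    n = len(sorted_samples[0])
    return [sum(s[i] for s in sorted_samples) / len(sorted_samples)
            for i in range(n)]

def w2_squared_1d(a, b):
    """Squared 2-Wasserstein distance between equal-size 1D point sets:
    match points in sorted order (the optimal coupling in 1D)."""
    return sum((x - y) ** 2 for x, y in zip(sorted(a), sorted(b))) / len(a)

center = barycenter_1d([[0, 1, 2], [2, 3, 4]])  # -> [1.0, 2.0, 3.0]
```

In higher dimensions no such closed form exists, which is why the paper resorts to iterative optimization (modified Bregman ADMM) and why choosing a small, unknown support for the barycenter is the computational crux.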
Pages: 2317-2332
Page count: 16
Related Papers
50 records in total
  • [1] Solving Soft Clustering Ensemble via k-Sparse Discrete Wasserstein Barycenter
    Qin, Ruizhe
    Li, Mengying
    Ding, Hu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [2] SCALING UP DISCRETE DISTRIBUTION CLUSTERING USING ADMM
    Ye, Jianbo
    Li, Jia
    2014 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2014: 5267-5271
  • [3] Hybrid Wasserstein distance and fast distribution clustering
    Verdinelli, Isabella
    Wasserman, Larry
    ELECTRONIC JOURNAL OF STATISTICS, 2019, 13 (02): 5088-5119
  • [4] Sparse Pinball Twin Bounded Support Vector Clustering
    Tanveer, M.
    Tabish, M.
    Jangir, Jatin
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2022, 9 (06): 1820-1829
  • [5] Dynamic clustering of interval data using a Wasserstein-based distance
    Irpino, Antonio
    Verde, Rosanna
    PATTERN RECOGNITION LETTERS, 2008, 29 (11): 1648-1658
  • [6] Unsupervised Learning Model of Sparse Filtering Enhanced Using Wasserstein Distance for Intelligent Fault Diagnosis
    Vashishtha, Govind
    Kumar, Rajesh
    JOURNAL OF VIBRATION ENGINEERING & TECHNOLOGIES, 2023, 11 (07): 2985-3002
  • [7] Clustering and feature selection using sparse principal component analysis
    Luss, Ronny
    d'Aspremont, Alexandre
    OPTIMIZATION AND ENGINEERING, 2010, 11 (01): 145-157
  • [8] FAST CLUSTERING WITH CO-CLUSTERING VIA DISCRETE NON-NEGATIVE MATRIX FACTORIZATION FOR IMAGE IDENTIFICATION
    Nie, Feiping
    Pei, Shenfei
    Wang, Rong
    Li, Xuelong
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 2073-2077