A Bayesian Mixture Model of Temporal Point Processes With Determinantal Point Process Prior

Times Cited: 0
Authors
Dong, Yiwei [1 ]
Ye, Shaoxin [1 ]
Han, Qiyu [1 ]
Cao, Yuwen [1 ]
Xu, Hongteng [2 ]
Yang, Hanfang [1 ]
Affiliations
[1] Renmin Univ China, Sch Stat, Beijing 100872, Peoples R China
[2] Renmin Univ China, Gaoling Sch Artificial Intelligence, Beijing 100872, Peoples R China
Keywords
Mixture models; Bayes methods; Training; Stochastic processes; Clustering algorithms; Signal processing algorithms; Recurrent neural networks; Overfitting; Maximum likelihood estimation; Inference algorithms; Bayesian mixture model; temporal point processes; Gibbs sampling; Bayesian neural network; Markov chain Monte Carlo; INFERENCE;
DOI
10.1109/TSP.2025.3575175
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Codes
0808; 0809
Abstract
Asynchronous event sequence clustering aims to group similar event sequences in an unsupervised manner. Mixture models of temporal point processes have been proposed to solve this problem, but they often suffer from overfitting, generating an excessive number of clusters that lack diversity. To overcome these limitations, we propose a Bayesian mixture model of Temporal Point Processes with a Determinantal Point Process prior (TP²DP²), together with an efficient posterior inference algorithm based on conditional Gibbs sampling. Our work provides a flexible learning framework for event sequence clustering, enabling automatic identification of the potential number of clusters and accurate grouping of sequences with similar features. It is applicable to a wide range of parametric temporal point processes, including neural network-based models. Experimental results on both synthetic and real-world data suggest that our framework produces moderately fewer yet more diverse mixture components, and achieves strong results across multiple evaluation metrics.
Pages: 2216-2226
Page count: 11