EDMD: An Entropy based Dissimilarity measure to cluster Mixed-categorical Data

Times Cited: 4
Authors
Kar, Amit Kumar [1 ]
Akhter, Mohammad Maksood [1 ]
Mishra, Amaresh Chandra [2 ]
Mohanty, Sraban Kumar [1 ]
Affiliations
[1] PDPM Indian Inst Informat Technol Design & Mfg, Comp Sci & Engn, Jabalpur 482005, India
[2] PDPM Indian Inst Informat Technol Design & Mfg, Nat Sci, Jabalpur 482005, India
Keywords
Proximity measure; Mixed categorical data; Ordinal attributes; Nominal attributes; Entropy; Dissimilarity measure; Algorithm; Attribute; Distance
DOI
10.1016/j.patcog.2024.110674
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
The effectiveness of clustering techniques is strongly influenced by the proximity measure, irrespective of the type of data, and categorical data is no exception. Most existing proximity measures for categorical data assume that all attributes contribute equally to the distance computation, which is rarely the case. Frequency- or probability-based approaches are, in principle, better equipped to counter this issue by weighting the attributes according to intra-attribute statistical information. However, owing to the qualitative nature of categorical features, the intra-attribute disorder is not captured effectively by the widely used continuum form of entropy, i.e., Shannon or information entropy. If the categorical data also contains ordinal features, the problem is compounded, because existing measures treat all attributes as nominal. To address these issues, we propose a new Entropy-based Dissimilarity measure for Mixed categorical Data (EDMD) composed of both nominal and ordinal attributes. EDMD treats nominal and ordinal attributes separately to capture the intrinsic information carried by each attribute type. We apply Boltzmann's definition of entropy, which is based on counting microstates, to exploit the intra-attribute statistical information of nominal attributes, while preserving the order relationships among ordinal values in the distance formulation. Additionally, the differing statistical significance of attributes in the dissimilarity computation is accounted for through attribute weighting. The proposed measure is free from user-defined or domain-specific parameters and makes no prior assumption about the distribution of the data sets. Experimental results demonstrate the efficacy of EDMD in terms of cluster quality, accuracy, cluster discrimination ability, and execution time on mixed categorical data sets with different characteristics.
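The abstract names the ingredients of EDMD without giving formulas. As an illustration only, the following minimal Python sketch shows how a Boltzmann-style entropy (the logarithm of the number of microstates of an attribute's observed value multiset) can serve as an attribute weight, combined with a simple mismatch distance for nominal attributes and a rank-based, order-preserving distance for ordinal attributes; the function names, normalization, and weighting scheme are assumptions made for illustration, not the authors' exact EDMD formulation.

import math
from collections import Counter

def boltzmann_entropy(values):
    # Boltzmann-style entropy of one attribute: S = ln W, where
    # W = n! / (n_1! * ... * n_k!) counts the microstates of the observed
    # value multiset; dividing by n is an illustrative normalization.
    counts = Counter(values)
    n = len(values)
    log_w = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts.values())
    return log_w / n

def nominal_dissimilarity(x, y, weight):
    # Weighted simple mismatch for a nominal attribute.
    return weight * (0.0 if x == y else 1.0)

def ordinal_dissimilarity(x, y, ordered_levels, weight):
    # Normalized rank difference that preserves the order of ordinal levels.
    ranks = {v: i for i, v in enumerate(ordered_levels)}
    span = max(len(ordered_levels) - 1, 1)
    return weight * abs(ranks[x] - ranks[y]) / span

# Toy data: one nominal column ('colour') and one ordinal column ('size').
colour = ["red", "blue", "red", "green", "blue", "red"]
size = ["small", "large", "medium", "small", "medium", "large"]
size_levels = ["small", "medium", "large"]

# Entropy-derived attribute weights (an assumption of this sketch).
w_colour = boltzmann_entropy(colour)
w_size = boltzmann_entropy(size)

# Dissimilarity between object 0 and object 1 under this sketch.
d = (nominal_dissimilarity(colour[0], colour[1], w_colour)
     + ordinal_dissimilarity(size[0], size[1], size_levels, w_size))
print(round(d, 4))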
Pages: 15