Research on parallel distributed clustering algorithm applied to cutting parameter optimization

Cited: 0
Authors
Xudong Wei
Qingzhen Sun
Xianli Liu
Caixu Yue
Steven Y. Liang
Lihui Wang
Affiliations
[1] Key Laboratory of Advanced Manufacturing and Intelligent Technology, Ministry of Education, Harbin University of Science and Technology
[2] George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology
[3] Department of Production Engineering, KTH Royal Institute of Technology
Source
The International Journal of Advanced Manufacturing Technology | 2022, Vol. 120
Keywords
Big data; Data mining; Distributed clustering; T.K-means algorithm; MapReduce framework; Cutting parameter optimization
DOI
Not available
Abstract
In the big data era, traditional data mining technology cannot meet the requirements of massive data processing against the background of intelligent manufacturing. To address the insufficient computing power and low efficiency of the mining process, this paper proposes an improved K-means clustering algorithm based on the concept of distributed clustering in a cloud computing environment. The improved algorithm (T.K-means) is combined with the MapReduce computing framework of the Hadoop platform to realize parallel computing and thus handle massive data processing tasks. To verify the practical performance of the T.K-means algorithm, machining data from milling Ti-6Al-4V alloy are taken as the mining object. The mapping relationship among cutting parameters, surface roughness, and material removal rate is mined, and optimized values of the cutting parameters are obtained. The results show that the T.K-means algorithm can be used to mine the optimal cutting parameters, so that the best surface roughness is obtained when milling Ti-6Al-4V titanium alloy.
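To illustrate the kind of parallelization described above, the sketch below expresses one K-means iteration as a map phase (assign each cutting-parameter record to its nearest centroid) and a reduce phase (recompute the centroids), which is how a MapReduce job on Hadoop distributes the work across nodes. This is a minimal single-process sketch and not the paper's T.K-means implementation: the record layout (cutting speed, feed per tooth, depth of cut, surface roughness), the toy values, and the plain Euclidean distance are illustrative assumptions, since the abstract does not give the algorithm's details.

```python
import math
from collections import defaultdict

def distance(a, b):
    # Plain Euclidean distance between two records (an assumption; the paper's
    # T.K-means may use a different or weighted measure).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def map_phase(records, centroids):
    # Map step: emit (cluster_id, (record, 1)) for the nearest centroid,
    # as a Hadoop mapper would do for its input split.
    for rec in records:
        cid = min(range(len(centroids)), key=lambda k: distance(rec, centroids[k]))
        yield cid, (rec, 1)

def reduce_phase(mapped):
    # Reduce step: sum the records of each cluster and recompute its centroid,
    # as a Hadoop reducer (or combiner) would.
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for cid, (rec, cnt) in mapped:
        sums[cid] = list(rec) if sums[cid] is None else [s + x for s, x in zip(sums[cid], rec)]
        counts[cid] += cnt
    return {cid: tuple(v / counts[cid] for v in vec) for cid, vec in sums.items()}

# Toy milling records: (cutting speed vc [m/min], feed per tooth fz [mm/tooth],
# axial depth of cut ap [mm], surface roughness Ra [um]); values are illustrative only.
records = [(120, 0.08, 0.5, 0.42), (160, 0.10, 0.8, 0.61),
           (200, 0.12, 1.0, 0.95), (125, 0.09, 0.5, 0.45)]
centroids = [records[0], records[2]]   # naive initial centroids
for _ in range(10):                    # repeat the map/reduce round until roughly stable
    centroids = list(reduce_phase(map_phase(records, centroids)).values())
print(centroids)                       # cluster centres in parameter/roughness space
```

On Hadoop the same two functions would run as mapper and reducer tasks over HDFS splits, with a driver re-submitting the job once per iteration; the single-process loop above only mimics that control flow.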
Pages: 7895 - 7904
Number of pages: 9
Related papers
50 records in total
  • [41] PSCAN: A Parallel Structural Clustering Algorithm for Big Networks in MapReduce
    Zhao, Weizhong
    Martha, VenkataSwamy
    Xu, Xiaowei
    2013 IEEE 27TH INTERNATIONAL CONFERENCE ON ADVANCED INFORMATION NETWORKING AND APPLICATIONS (AINA), 2013, : 862 - 869
  • [42] Parallel Evolutionary Algorithm for EEG Optimization Problems
    Meselhi, Mohamed A.
    Elsayed, Saber M.
    Sarker, Ruhul A.
    Essam, Daryl L.
    2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021), 2021, : 2577 - 2584
  • [43] Parallel Implementation of Density Peaks Clustering Algorithm Based on Spark
    Liu, Rui
    Li, Xiaoge
    Du, Liping
    Zhi, Shuting
    Wei, Mian
    ADVANCES IN INFORMATION AND COMMUNICATION TECHNOLOGY, 2017, 107 : 442 - 447
  • [44] A parallel and scalable CAST-based clustering algorithm on GPU
    Kawuu W. Lin
    Chun-Hung Lin
    Chun-Yuan Hsiao
    Soft Computing, 2014, 18 : 539 - 547
  • [45] Design and Evaluation of a Parallel Execution Framework for the CLEVER Clustering Algorithm
    Chen, Chung Sheng
    Shaikh, Nauful
    Charoenrattanaruk, Panitee
    Eick, Christoph F.
    Rizk, Nouhad
    Gabriel, Edgar
    APPLICATIONS, TOOLS AND TECHNIQUES ON THE ROAD TO EXASCALE COMPUTING, 2012, 22 : 73 - 80
  • [46] A parallel and scalable CAST-based clustering algorithm on GPU
    Lin, Kawuu W.
    Lin, Chun-Hung
    Hsiao, Chun-Yuan
    SOFT COMPUTING, 2014, 18 (03) : 539 - 547
  • [47] Parallel K-Medoids Clustering Algorithm Based on Hadoop
    Jiang, Yaobin
    Zhang, Jiongmin
    2014 5TH IEEE INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING AND SERVICE SCIENCE (ICSESS), 2014, : 649 - 652
  • [48] FPO tree and DP3 algorithm for distributed parallel Frequent Itemsets Mining
    Van Quoc Phuong Huynh
    Kueng, Josef
    EXPERT SYSTEMS WITH APPLICATIONS, 2020, 140
  • [49] Distributed Energy Efficient Clustering Algorithm for Wireless Sensor Networks
    Arunraja, Muruganantham
    Malathi, Veluchamy
    Sakthivel, Erulappan
INFORMACIJE MIDEM-JOURNAL OF MICROELECTRONICS ELECTRONIC COMPONENTS AND MATERIALS, 2015, 45 (03): 180 - 187
  • [50] Two-phase clustering algorithm for complex distributed data
    Gong M.-G.
    Wang S.
    Ma M.
    Cao Y.
    Jiao L.-C.
    Ma W.-P.
Ruan Jian Xue Bao/Journal of Software, 2011, 22 (11): 2760 - 2772