Dynamic Group Learning Distributed Particle Swarm Optimization for Large-Scale Optimization and Its Application in Cloud Workflow Scheduling

Cited by: 175
Authors
Wang, Zi-Jia [1 ]
Zhan, Zhi-Hui [2 ,3 ,4 ]
Yu, Wei-Jie [5 ]
Lin, Ying [6 ]
Zhang, Jie [7 ]
Gu, Tian-Long [8 ]
Zhang, Jun [9 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Peoples R China
[2] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[3] South China Univ Technol, State Key Lab Subtrop Bldg Sci, Guangzhou 510006, Peoples R China
[4] Guangdong Prov Key Lab Computat Intelligence & Cy, Guangzhou 510006, Peoples R China
[5] Sun Yat Sen Univ, Sch Informat Management, Guangzhou 510006, Peoples R China
[6] Sun Yat Sen Univ, Dept Psychol, Guangzhou 510006, Peoples R China
[7] Beijing Univ Chem Technol, Sch Informat Sci & Technol, Beijing 100029, Peoples R China
[8] Guilin Univ Elect Technol, Sch Comp Sci & Engn, Guilin 541004, Peoples R China
[9] Victoria Univ, Melbourne, Vic 8001, Australia
Funding
National Natural Science Foundation of China;
Keywords
Cloud computing; Task analysis; Optimization; Sociology; Statistics; Processor scheduling; Dynamic scheduling; Adaptive renumber strategy (ARS); dynamic group learning distributed particle swarm optimization (DGLDPSO); dynamic group learning strategy; large-scale cloud workflow scheduling; master-slave multigroup distributed; COOPERATIVE COEVOLUTION; ALGORITHM; STRATEGY;
DOI
10.1109/TCYB.2019.2933499
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Cloud workflow scheduling is a significant topic in both commercial and industrial applications. However, the growing scale of workflows has made such scheduling problems increasingly challenging. Many current algorithms deal only with small- or medium-scale problems (e.g., fewer than 1000 tasks) and struggle to provide satisfactory solutions for large-scale problems due to the curse of dimensionality. To this end, this article proposes a dynamic group learning distributed particle swarm optimization (DGLDPSO) for large-scale optimization and extends it to large-scale cloud workflow scheduling. DGLDPSO is efficient for large-scale optimization for the following two reasons. First, the entire population is divided into many groups, and these groups are coevolved using a master-slave multigroup distributed model, forming a distributed PSO (DPSO) that enhances algorithm diversity. Second, a dynamic group learning (DGL) strategy is adopted in DPSO to balance diversity and convergence. When applying DGLDPSO to large-scale cloud workflow scheduling, an adaptive renumber strategy (ARS) is further developed to relate solutions to resource characteristics and to make the search behavior purposeful rather than aimless. Experiments are conducted on a large-scale benchmark function set and on large-scale cloud workflow scheduling instances to investigate the performance of DGLDPSO. The comparison results show that DGLDPSO is better than, or at least comparable to, other state-of-the-art large-scale optimization algorithms and workflow scheduling algorithms.
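The abstract's core idea of grouping the swarm and letting each group learn from its own best member can be illustrated with a minimal sketch. This is not the paper's DGLDPSO (the abstract gives no update equations, distributed topology, or the DGL/ARS details); the random regrouping, the parameter values, and all function names below are assumptions chosen for illustration, shown here on the standard Sphere benchmark.

```python
import random

def sphere(x):
    # Sphere benchmark: f(x) = sum(x_i^2), minimum 0 at the origin.
    return sum(v * v for v in x)

def grouped_pso(f, dim=10, n_particles=40, n_groups=4, iters=200,
                w=0.7, c1=1.5, c2=1.5, bound=5.0, seed=0):
    rng = random.Random(seed)
    # Initialize random positions and zero velocities.
    pos = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # personal best positions
    pbest_val = [f(p) for p in pos]      # personal best fitness values
    for _ in range(iters):
        # Dynamic grouping: randomly repartition the swarm each iteration
        # (a crude stand-in for the paper's dynamic group learning strategy).
        order = list(range(n_particles))
        rng.shuffle(order)
        size = n_particles // n_groups
        for g in range(n_groups):
            group = order[g * size:(g + 1) * size]
            # Each group learns from its own best member rather than a
            # single global best, which preserves diversity across groups.
            gbest = min(group, key=lambda i: pbest_val[i])
            for i in group:
                for d in range(dim):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * rng.random() * (pbest[gbest][d] - pos[i][d]))
                    pos[i][d] = max(-bound, min(bound, pos[i][d] + vel[i][d]))
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest_val[i] = val
                    pbest[i] = pos[i][:]
    best = min(range(n_particles), key=lambda i: pbest_val[i])
    return pbest[best], pbest_val[best]
```

In the actual algorithm the groups would be evaluated on slave nodes and coordinated by a master node; this single-process sketch only shows the group-wise learning structure that the master-slave model distributes.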
Pages: 2715-2729
Page count: 15
Related Papers
50 records in total
  • [41] Spark-based parallel dynamic programming and particle swarm optimization via cloud computing for a large-scale reservoir system
    Ma, Yufei
    Zhong, Ping-an
    Xu, Bin
    Zhu, Feilin
    Lu, Qingwen
    Wang, Han
    JOURNAL OF HYDROLOGY, 2021, 598
  • [42] An improved particle swarm optimization algorithm for task scheduling in cloud computing
    Pirozmand P.
    Jalalinejad H.
    Hosseinabadi A.A.R.
    Mirkamali S.
    Li Y.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (04) : 4313 - 4327
  • [43] Survey of Task Scheduling in Cloud Computing based on Particle Swarm Optimization
    Alkayal, Entisar S.
    Jennings, Nicholas R.
    Abulkhair, Maysoon F.
    2017 INTERNATIONAL CONFERENCE ON ELECTRICAL AND COMPUTING TECHNOLOGIES AND APPLICATIONS (ICECTA), 2017, : 263 - 268
  • [44] Compressed-Encoding Particle Swarm Optimization with Fuzzy Learning for Large-Scale Feature Selection
    Yang, Jia-Quan
    Chen, Chun-Hua
    Li, Jian-Yu
    Liu, Dong
    Li, Tao
    Zhan, Zhi-Hui
    SYMMETRY-BASEL, 2022, 14 (06):
  • [45] An agent-assisted heterogeneous learning swarm optimizer for large-scale optimization
    Sun, Yu
    Cao, Han
    SWARM AND EVOLUTIONARY COMPUTATION, 2024, 89
  • [46] Workflow Scheduling in Cloud Computing Environment Using Cat Swarm Optimization
    Bilgaiyan, Saurabh
    Sagnika, Santwana
    Das, Madhabananda
    SOUVENIR OF THE 2014 IEEE INTERNATIONAL ADVANCE COMPUTING CONFERENCE (IACC), 2014, : 680 - 685
  • [47] An Adaptive Multi-Swarm Competition Particle Swarm Optimizer for Large-Scale Optimization
    Kong, Fanrong
    Jiang, Jianhui
    Huang, Yan
    MATHEMATICS, 2019, 7 (06)
  • [48] CenPSO: A Novel Center-based Particle Swarm Optimization Algorithm for Large-scale Optimization
    Mousavirad, Seyed Jalaleddin
    Rahnamayan, Shahryar
    2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 2066 - 2071
  • [49] Progressive Sampling Surrogate-Assisted Particle Swarm Optimization for Large-Scale Expensive Optimization
    Wang, Hong-Rui
    Chen, Chun-Hua
    Li, Yun
    Zhang, Jun
    Zhan, Zhi-Hui
    PROCEEDINGS OF THE 2022 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'22), 2022, : 40 - 48
  • [50] A Dimension Group-Based Comprehensive Elite Learning Swarm Optimizer for Large-Scale Optimization
    Yang, Qiang
    Zhang, Kai-Xuan
    Gao, Xu-Dong
    Xu, Dong-Dong
    Lu, Zhen-Yu
    Jeon, Sang-Woon
    Zhang, Jun
    MATHEMATICS, 2022, 10 (07)