An investigation of complex fuzzy sets for large-scale learning

Cited by: 13
Authors
Sobhi, Sayedabbas [1 ]
Dick, Scott [1 ]
Affiliations
[1] Univ Alberta, Dept Elect & Comp Engn, 11th Flr DICE Bldg, 9211-116 St, Edmonton, AB T6G 1H9, Canada
Keywords
Complex fuzzy sets; Complex fuzzy logic; Machine learning; Neuro-fuzzy systems; Randomized learning; Time series forecasting; STOCHASTIC CONFIGURATION NETWORKS; FUNCTION APPROXIMATION; SYSTEM; PREDICTION; ALGORITHMS; MODEL;
DOI
10.1016/j.fss.2023.108660
Chinese Library Classification
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Complex fuzzy sets are an extension of type-1 fuzzy sets with complex-valued membership functions. Over the last 20 years, time-series forecasting has emerged as the most important application of complex fuzzy sets, with neuro-fuzzy systems employing them shown to be accurate and compact forecasting models. In the complex fuzzy sets literature, two dominant approaches to designing forecasters can be observed: sinusoidal membership functions versus complex-valued Gaussian membership functions. To date, however, there has never been a systematic investigation that compares the performance of these two membership types (or their combination) within a common architecture. We propose a new neuro-fuzzy architecture using complex fuzzy sets that has been designed for large-scale learning problems. This architecture employs randomized learning to speed up network training. In designing this architecture, we empirically compared sinusoidal complex fuzzy sets and complex Gaussian fuzzy sets. Across multiple variations of the architecture, we find that the complex Gaussian fuzzy sets lead to significantly more accurate forecasts on moderate-to-large time series datasets, while still keeping the overall size of the network compact. © 2023 Elsevier B.V. All rights reserved.
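The two membership-function families compared in the abstract can be sketched in code. The exact parameterizations below (a sinusoid-modulated amplitude and a Gaussian amplitude, each paired with a phase term) are illustrative assumptions in the spirit of the complex fuzzy sets literature, not the paper's precise definitions:

```python
import numpy as np

def sinusoidal_cfs(x, d=0.5, a=1.0, b=0.0, c=0.5, omega=1.0):
    # Illustrative sinusoidal complex fuzzy set: a sinusoidal amplitude
    # (kept in [0, 1] by the choice d=0.5, c=0.5) times a phase term.
    amplitude = d * np.sin(a * x + b) + c
    return amplitude * np.exp(1j * omega * x)

def complex_gaussian_cfs(x, m=0.0, sigma=1.0, lam=1.0):
    # Illustrative complex Gaussian fuzzy set: a Gaussian amplitude
    # centered at m, times a linear phase term.
    amplitude = np.exp(-((x - m) ** 2) / (2.0 * sigma ** 2))
    return amplitude * np.exp(1j * lam * (x - m))

x = np.linspace(-3.0, 3.0, 7)
mu_sin = sinusoidal_cfs(x)      # complex-valued memberships, |mu| in [0, 1]
mu_gauss = complex_gaussian_cfs(x)
```

In both families the membership value is a complex number whose magnitude plays the role of a conventional type-1 membership grade; the phase is the extra degree of freedom that makes these sets useful for modeling periodic structure in time series.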
Pages: 21