More Than Accuracy: A Composite Learning Framework for Interval Type-2 Fuzzy Logic Systems

Cited by: 18
Authors
Beke, Aykut [1 ]
Kumbasar, Tufan [1 ]
Affiliations
[1] Istanbul Tech Univ, Control & Automat Engn Dept, TR-34469 Istanbul, Turkiye
Keywords
Deep learning (DL); interval type-2 fuzzy logic systems (IT2-FLS); parameterization tricks; quantile regression (QR); uncertainty; support-vector regression; neural networks; regularization; optimization; DropRule; models
DOI
10.1109/TFUZZ.2022.3188920
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this article, we propose a novel composite learning framework for interval type-2 (IT2) fuzzy logic systems (FLSs) to train regression models that achieve high accuracy while also representing uncertainty. In this context, we identify three challenges: first, the uncertainty handling capability; second, the construction of the composite loss; and third, a learning algorithm that overcomes the training complexity while respecting the definitions of IT2-FLSs. This article presents a systematic solution to these problems by exploiting the type-reduced set of the IT2-FLS and fusing quantile regression and deep learning (DL) with IT2-FLSs. The uncertainty processing capability of an IT2-FLS depends on the employed center-of-sets calculation method, while its representation capability is determined by the structure of its antecedent and consequent membership functions. Thus, we present various parametric IT2-FLSs and define the learnable parameters of all IT2-FLSs alongside the constraints they must satisfy during training. To construct the loss function, we define a multiobjective loss and then convert it into a constrained composite loss composed of the log-cosh loss for accuracy and a tilted loss for uncertainty representation, which explicitly uses the type-reduced set. We also present a DL approach to train IT2-FLSs via unconstrained optimizers. In this context, we present parameterization tricks that convert the constrained optimization problem of IT2-FLSs into an unconstrained one without violating the definitions of fuzzy sets. Finally, we provide comprehensive comparative results for hyperparameter sensitivity analysis and an inter/intramodel comparison on various benchmark datasets.
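To make the abstract's loss construction and parameterization trick concrete, the following is a minimal PyTorch sketch. It is not the authors' reference implementation: the tensor names (y_crisp for the defuzzified output, y_left/y_right for the endpoints of the type-reduced set), the quantile levels tau_l/tau_r, the weight alpha, and the softplus mapping are all illustrative assumptions.

```python
import math

import torch
import torch.nn.functional as F


def log_cosh_loss(y_true, y_crisp):
    """Accuracy term: log-cosh loss between targets and the crisp (defuzzified) output."""
    e = y_crisp - y_true
    # Numerically stable form of log(cosh(e)) = |e| + softplus(-2|e|) - log(2).
    return torch.mean(torch.abs(e) + F.softplus(-2.0 * torch.abs(e)) - math.log(2.0))


def tilted_loss(y_true, y_q, tau):
    """Tilted (pinball) loss that pushes y_q toward the tau-quantile of the targets."""
    e = y_true - y_q
    return torch.mean(torch.maximum(tau * e, (tau - 1.0) * e))


def composite_loss(y_true, y_crisp, y_left, y_right, tau_l=0.05, tau_r=0.95, alpha=0.5):
    """Weighted sum of the accuracy term and two uncertainty terms evaluated on the
    endpoints [y_left, y_right] of the type-reduced set produced by the IT2-FLS."""
    accuracy = log_cosh_loss(y_true, y_crisp)
    uncertainty = tilted_loss(y_true, y_left, tau_l) + tilted_loss(y_true, y_right, tau_r)
    return alpha * accuracy + (1.0 - alpha) * uncertainty


# Parameterization trick (illustrative): learn an unconstrained tensor and map it through
# softplus so that, e.g., membership-function widths stay strictly positive. A standard
# unconstrained optimizer such as Adam can then be applied directly.
raw_sigma = torch.zeros(10, requires_grad=True)   # unconstrained learnable parameter
sigma = F.softplus(raw_sigma) + 1e-6              # constrained parameter: sigma > 0
```

The weighting between the accuracy and uncertainty terms (alpha above) and the choice of quantile levels are treated here as hyperparameters, in the spirit of the hyperparameter sensitivity analysis mentioned in the abstract.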
Pages: 734-744 (11 pages)