
Fast Training of Adversarial Deep Fuzzy Classifier by Downsizing Fuzzy Rules With Gradient Guided Learning.

Authors :
Gu, Suhang
Vong, Chi Man
Wong, Pak Kin
Wang, Shitong
Source :
IEEE Transactions on Fuzzy Systems; Jun 2022, Vol. 30, Issue 6, p1967-1980, 14p
Publication Year :
2022

Abstract

While our recent deep fuzzy classifier DSA-FC, which stacks adversarial interpretable Takagi–Sugeno–Kang (TSK) fuzzy subclassifiers, shows promising classification performance, its training becomes very slow and even intolerable on large-scale datasets, because all training samples are trained successively with random gradient-based updates at each layer of its stacked structure. To circumvent this bottleneck, a fast training algorithm, FTA, is developed in this study by downsizing fuzzy rules with the proposed gradient guided learning for each subclassifier at each layer of DSA-FC on large-scale datasets. The core of FTA is to ensure fast training of each subclassifier at each layer of DSA-FC: it first generates first-order smooth gradient guided information by means of the proposed top-k fuzzy rules selected from all fuzzy rules in each subclassifier, and then quickly updates the current inputs according to this information; the updated inputs are then taken as the inputs of the subclassifier at the next layer. Our theoretical analysis reveals that the proposed gradient guided learning indeed enhances the generalization capability of a deep fuzzy classifier with or without adversarial attacks on outputs. Experimental results on large-scale datasets demonstrate that FTA indeed trains the deep fuzzy classifier DSA-FC quickly with enhanced generalization capability. [ABSTRACT FROM AUTHOR]
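To make the stacked top-k mechanism in the abstract concrete, below is a minimal, hedged sketch in Python. It is based only on the abstract, not on the authors' actual FTA implementation: the Gaussian membership form, the number of rules, the value of k, the step size eta, and the particular "pull toward top-k rule centers" update are all illustrative assumptions standing in for the paper's first-order smooth gradient guided information.

```python
# Illustrative sketch only: a zero-order TSK subclassifier whose top-k firing rules
# drive an input update that is passed to the next stacked layer. All hyperparameters
# and the exact update rule are assumptions, not the paper's FTA algorithm.
import numpy as np

rng = np.random.default_rng(0)

class TSKSubclassifier:
    """Zero-order TSK fuzzy subclassifier with Gaussian antecedents."""
    def __init__(self, n_rules, n_features, n_classes):
        self.centers = rng.normal(size=(n_rules, n_features))    # rule centers
        self.widths = np.full((n_rules, n_features), 1.0)         # rule widths
        self.consequents = rng.normal(size=(n_rules, n_classes))  # rule consequents

    def firing_strengths(self, X):
        # Gaussian membership per feature, product T-norm across features.
        d = (X[:, None, :] - self.centers[None, :, :]) / self.widths[None, :, :]
        return np.exp(-0.5 * np.sum(d ** 2, axis=2))               # (n_samples, n_rules)

    def predict_logits(self, X):
        w = self.firing_strengths(X)
        w = w / (w.sum(axis=1, keepdims=True) + 1e-12)             # normalized firing
        return w @ self.consequents

    def topk_guided_update(self, X, k=5, eta=0.1):
        """Update the inputs using only the k most strongly firing rules
        (a stand-in for the paper's gradient guided information); the updated
        inputs become the inputs of the subclassifier at the next layer."""
        w = self.firing_strengths(X)
        topk = np.argsort(w, axis=1)[:, -k:]                       # indices of top-k rules
        X_new = X.copy()
        for i in range(X.shape[0]):
            idx = topk[i]
            # Smooth, first-order pull of the sample toward its top-k rule centers,
            # weighted by firing strength.
            pull = (self.centers[idx] - X[i]) * w[i, idx][:, None]
            X_new[i] += eta * pull.sum(axis=0) / (w[i, idx].sum() + 1e-12)
        return X_new

# Stacking: each layer's subclassifier transforms the inputs for the next layer.
X = rng.normal(size=(100, 8))
layers = [TSKSubclassifier(n_rules=20, n_features=8, n_classes=3) for _ in range(3)]
for layer in layers:
    X = layer.topk_guided_update(X, k=5)
logits = layers[-1].predict_logits(X)
print(logits.shape)  # (100, 3)
```

The point of the sketch is the cost argument: restricting the per-layer input update to k rules instead of all n_rules is what, in spirit, lets each subclassifier be trained quickly on large-scale data.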

Details

Language :
English
ISSN :
1063-6706
Volume :
30
Issue :
6
Database :
Complementary Index
Journal :
IEEE Transactions on Fuzzy Systems
Publication Type :
Academic Journal
Accession number :
157228520
Full Text :
https://doi.org/10.1109/TFUZZ.2021.3072498