|Table of Contents|

[1] Cui Chen, Deng Zhaohong*, Wang Shitong. Concise monotonic TSK fuzzy system for monotonic classification[J]. Journal of Nanjing University (Natural Sciences), 2018, 54(1): 124. [doi:10.13232/j.cnki.jnju.2018.01.014]

Concise monotonic TSK fuzzy system for monotonic classification

Journal of Nanjing University (Natural Sciences) [ISSN:0469-5097/CN:32-1169/N]

Volume:
54
Issue:
2018, No. 1
Pages:
124
Column:
Publication date:
2018-02-01

Article Info

Title:
 Concise monotonic TSK fuzzy system for monotonic classification
Author(s):
 Cui Chen, Deng Zhaohong*, Wang Shitong
 School of Digital Media, Jiangnan University, Wuxi 214122, China
Keywords:
 monotonic classification; rank mutual information; feature selection; TSK fuzzy system
CLC number:
TP391
DOI:
10.13232/j.cnki.jnju.2018.01.014
Document code:
A
Abstract:
 The universal approximation capability and interpretability of Takagi-Sugeno-Kang (TSK) fuzzy systems make it possible to describe complex nonlinear uncertain systems intuitively and efficiently, and TSK fuzzy systems have been applied effectively to pattern classification. However, for monotonic classification tasks, existing fuzzy classification algorithms do not adequately consider the ordinal relationship present in monotonic data, so both the model complexity and the classification performance of these algorithms leave room for improvement. To address this problem, a Concise Monotonic TSK Fuzzy System for Monotonic Classification (CM-TSK-FS) is proposed. It uses rank mutual information for monotonic feature selection, and then uses the selected features to train a TSK fuzzy system for classification. The proposed method has the following advantages: 1) because feature selection is performed on the monotonic data, the new method reduces the complexity of the TSK fuzzy rules, which makes the resulting fuzzy system more concise; 2) because the monotonicity between the feature values and the decision values of the monotonic data is taken into account during feature selection, the classification performance of the trained model is also improved to a certain extent. Experiments on several monotonic data sets show that, by selecting the important monotonic features, CM-TSK-FS can not only reduce model complexity but also improve classification accuracy.
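The pipeline summarized in the abstract — score each feature by rank mutual information, keep the top-ranked features, then train the TSK fuzzy system on them — can be sketched as follows. This is a minimal illustration assuming the ascending-dominance form of rank mutual information from ref. [13]; the function names and the greedy top-k selection are illustrative, not taken from the paper.

```python
import numpy as np

def rank_mutual_information(x, y):
    """Ascending rank mutual information between feature values x and
    ordinal labels y, following the rank entropy of ref. [13]:
    RMI = -(1/n) * sum_i log(|[i]_x| * |[i]_y| / (n * |[i]_x ∩ [i]_y|)),
    where [i]_x = {j : x_j >= x_i} is the dominating set of sample i."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    total = 0.0
    for i in range(n):
        dom_x = x >= x[i]        # samples that dominate i on the feature
        dom_y = y >= y[i]        # samples that dominate i on the label
        dom_xy = dom_x & dom_y   # joint dominating set (never empty: i is in it)
        total -= np.log(dom_x.sum() * dom_y.sum() / (n * dom_xy.sum()))
    return total / n

def select_monotonic_features(X, y, k):
    """Keep the k features with the highest rank mutual information."""
    scores = [rank_mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:k]
```

A perfectly monotone feature yields a positive score, an anti-monotone one a negative score, so ranking by this measure favors features whose ordering is consistent with the decision values. Only the retained features would then be fed to TSK fuzzy system training (e.g., antecedents from fuzzy clustering and consequents from ridge regression, in the spirit of refs. [30] and [37]).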

参考文献/References:

 [1] Wu G D,Zhu Z W,Huang P H.A TS-type maximizing-discriminability-based recurrent fuzzy network for classification problems.IEEE Transactions on Fuzzy Systems,2011,19(2):339-352.
[2] Potharst R,Feelders A J.Classification trees for problems with monotonicity constraints.ACM Sigkdd Explorations Newsletter,2002,4(1):1-10.
[3] Popova V.Knowledge discovery and monotonicity.Ph.D.Dissertation.The Netherlands:Erasmus University Rotterdam,2004.
[4] Qian Y H,Dang C Y,Liang J Y,et al.Set-valued ordered information systems.Information Sciences,2009,179(16):2809-2832.
[5] Wallenius J,Dyer J S,Fishburn P C,et al.Multiple criteria decision making,multiattribute utility theory:Recent accomplishments and what lies ahead.Management Science,2008,54(7):1336-1349.
[6] Cao-Van K.Supervised ranking,from semantics to algorithms.Ph.D.Dissertation.Ghent:Ghent University,2003.
[7] Kotowski W.Statistical approach to ordinal classification with monotonicity constraint.Ph.D.dissertation.Poznan,Poland:Institute of Computing Science Poznan University of Technology,2008.
[8] Greco S,Matarazzo B,Slowinski R.Rough approximation of a preference relation by dominance relations.European Journal of Operational Research,1999,117(1):63-83.
[9] Greco S,Matarazzo B,Slowinski R.Rough approximation by dominance relations.International Journal of Intelligent Systems,2002,17(2):153-171.
[10] Sai Y,Yao Y Y,Zhong N.Data analysis and mining in ordered information tables ∥ Proceedings of IEEE International Conference on Data Mining.San Jose,CA,USA:IEEE,2001:497-504.
[11] Kotowski W,Dembczyński K,Greco S,et al.Stochastic dominance-based rough set model for ordinal classification.Information Sciences,2008,178(21):4019-4037.
[12] Hu Q H,Yu D R,Guo M Z.Fuzzy preference based rough sets.Information Sciences,2010,180(10):2003-2022.
[13] Hu Q H,Guo M Z,Yu D R,et al.Information entropy for ordinal classification.Science China Information Sciences,2010,53(6):1188-1200.
[14] Ben-David A,Sterling L,Pao Y H.Learning and classification of monotonic ordinal concepts.Computational Intelligence,1989,5(1):45-49.
[15] Ben-David A.Monotonicity maintenance in information-theoretic machine learning algorithms.Machine Learning,1995,19(1):29-43.
[16] Duivesteijn W,Feelders A.Nearest neighbour classification with monotonicity constraints ∥ Proceedings of the 2008 European Conference on Machine Learning and Knowledge Discovery in Databases-Part I.Springer Berlin Heidelberg,2008:301-316.
[17] Barile N,Feelders A J.Nonparametric monotone classification with MOCA ∥ Proceedings of the 8th IEEE International Conference on Data Mining.Pisa,Italy:IEEE,2008:731-736.
[18] Feelders A,Pardoel M.Pruning for monotone classification trees ∥ Proceedings of the 5th International Symposium on Intelligent Data Analysis.Springer Berlin Heidelberg,2003:1-12.
[19] Cao-Van K,De Baets B.Growing decision trees in an ordinal setting.International Journal of Intelligent Systems,2003,18(7):733-750.
[20] van de Kamp R,Feelders A,Barile N.Isotonic classification trees ∥ Proceedings of the 8th International Symposium on Intelligent Data Analysis.Springer Berlin Heidelberg,2009:405-416.
[21] Zhao B,Wang F,Zhang C S.Block-quantized support vector ordinal regression.IEEE Transactions on Neural Networks,2009,20(5):882-890.
[22] Li S T,Chen C C.A regularized monotonic fuzzy support vector machine model for data mining with prior knowledge.IEEE Transactions on Fuzzy Systems,2015,23(5):1713-1727.
[23] Sun B Y,Li J Y,Wu D D,et al.Kernel discriminant learning for ordinal regression.IEEE Transactions on Knowledge and Data Engineering,2010,22(6):906-910.
[24] Guyon I,Elisseeff A.An introduction to variable and feature selection.Journal of Machine Learning Research,2003,3:1157-1182.
[25] Liu H,Yu L.Toward integrating feature selection algorithms for classification and clustering.IEEE Transactions on Knowledge and Data Engineering,2005,17(4):491-502.
[26] 邓赵红,张江滨,蒋亦樟等.基于模糊子空间聚类的0阶岭回归TSK模糊系统.控制与决策,2016,31(5):882-888.(Deng Z H,Zhang J B,Jiang Y Z,et al.Fuzzy subspace clustering based 0-order ridge regression TSK fuzzy system.Control and Decision,2016,31(5):882-888.)
[27] Kamishima T,Akaho S.Dimension reduction for supervised ordering ∥ Proceedings of the 6th International Conference on Data Mining.Hong Kong,China:IEEE,2006:330-339.
[28] Baccianella S,Esuli A,Sebastiani F.Feature selection for ordinal regression ∥ Proceedings of 2010 ACM Symposium on Applied Computing.Sierre,Switzerland:ACM,2010:1748-1754.
[29] Battiti R.Using mutual information for selecting features in supervised neural net learning.IEEE Transactions on Neural Networks,1994,5(4):537-550.
[30] Deng Z H,Choi K S,Chung F L,et al.Scalable TSK fuzzy modeling for very large datasets using minimal-enclosing-ball approximation.IEEE Transactions on Fuzzy Systems,2011,19(2):210-226.
[31] Won J M,Park S Y,Lee J S.Parameter conditions for monotonic Takagi-Sugeno-Kang fuzzy system.Fuzzy Sets and Systems,2002,132(2):135-146.
[32] Hall L O,Goldgof D B.Convergence of the single-pass and online fuzzy C-means algorithms.IEEE Transactions on Fuzzy Systems,2011,19(4):792-794.
[33] Zhu L,Chung F L,Wang S T.Generalized fuzzy C-means clustering algorithm with improved fuzzy partitions.IEEE Transactions on Systems,Man,and Cybernetics,Part B(Cybernetics),2009,39(3):578-591.
[34] Peng H C,Long F H,Ding C.Feature selection based on mutual information criteria of max-dependency,max-relevance,and min-redundancy.IEEE Transactions on Pattern Analysis and Machine Intelligence,2005,27(8):1226-1238.
[35] Vinh L T,Thang N D,Lee Y K.An improved maximum relevance and minimum redundancy feature selection algorithm based on normalized mutual information ∥ Proceedings of the 2010 10th IEEE/IPSJ Annual International Symposium on Applications and the Internet.Seoul,Korea:IEEE Computer Society,2010:395-398.
[36] Yu D R,An S,Hu Q H.Fuzzy mutual information based min-redundancy and max-relevance heterogeneous feature selection.International Journal of Computational Intelligence Systems,2011,4(4):619-633.
[37] Hoerl A E,Kennard R W.Ridge regression:Biased estimation for nonorthogonal problems.Technometrics,1970,12(1):55-67.


Memo:
 Foundation items: Outstanding Youth Fund of Jiangsu Province (60903098), National Key Research and Development Program of China (2016YFB0800803), National Natural Science Foundation of China (61403247), Shanghai Sailing Program (14YF1411000)
Received: 2017-12-23
*Corresponding author, E-mail: dengzhaohong@jiangnan.edu.cn
Last Update: 2018-01-31