DOI: 10.3724/SP.J.1087.2009.00833

Journal of Computer Applications (计算机应用), 2009, 29(3): 833-835

Multi-feature fusion method based on support vector machine and k-nearest neighbor classifier


Abstract:
Traditional classification methods use only a single classifier, which can lead to one-sided decisions and low accuracy, and samples lying near the Support Vector Machine (SVM) hyperplane are especially prone to misclassification. To address these problems, a multi-feature fusion method based on SVM and K-Nearest Neighbor (KNN) classifiers was presented in this paper. Firstly, the features were divided into L groups and an SVM hyperplane was constructed for each feature group of the training set. Secondly, the testing set was classified by the SVM-KNN method, yielding a decision profile matrix for each group. Finally, these decision profile matrices were combined by the multi-feature fusion method. Experimental results on the Iris data show that the prediction accuracy of the multi-feature fusion method based on SVM-KNN classifiers is 28.7% and 1.9% higher than that of the SVM and SVM-KNN methods, respectively.
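The workflow described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, illustrative reconstruction rather than the paper's implementation: it assumes the four Iris features are split into L = 2 groups (sepal and petal measurements), trains one scikit-learn SVC per group, re-classifies test samples whose top two one-vs-rest SVM scores fall within an assumed threshold EPS using KNN over that SVM's support vectors, and fuses the per-group decision profiles by simple averaging instead of the paper's inverse-probability rule. The names groups, EPS, and K are illustrative assumptions.

# Sketch of SVM-KNN multi-feature fusion on Iris (assumptions noted above).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

EPS, K = 0.2, 5                       # proximity threshold and KNN size (assumed values)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4, random_state=0)

# Split the 4 Iris features into L = 2 groups (sepal vs. petal measurements).
groups = [[0, 1], [2, 3]]
profiles = np.zeros((len(groups), len(X_te), 3))   # one decision profile matrix per group

for g, cols in enumerate(groups):
    svm = SVC(kernel="rbf", probability=True, decision_function_shape="ovr")
    svm.fit(X_tr[:, cols], y_tr)
    # KNN over the SVM's support vectors, used only for samples near the hyperplane.
    knn = KNeighborsClassifier(n_neighbors=min(K, len(svm.support_)))
    knn.fit(X_tr[svm.support_][:, cols], y_tr[svm.support_])

    scores = svm.decision_function(X_te[:, cols])          # (n_test, n_classes)
    top2 = np.sort(scores, axis=1)[:, -2:]
    near = (top2[:, 1] - top2[:, 0]) < EPS                 # ambiguous samples near a hyperplane
    prof = svm.predict_proba(X_te[:, cols])
    if near.any():
        prof[near] = knn.predict_proba(X_te[near][:, cols])
    profiles[g] = prof

# Fuse the L decision profiles (mean rule here as a stand-in for the paper's fusion rule).
fused = profiles.mean(axis=0)
y_pred = fused.argmax(axis=1)
print("fused accuracy: %.3f" % (y_pred == y_te).mean())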

Key words: Support Vector Machine (SVM), K-Nearest Neighbor (KNN) algorithm, multi-feature fusion, inverse probability

Release date: 2014-07-21 14:31:31



