1. Introduction
The support vector machine (SVM), introduced by Vapnik and co-workers (Ding, 2012), is a computationally powerful kernel-based tool for binary data classification and regression. Because its theory is built on the structural risk minimization principle, SVM effectively handles the high dimensionality and local minimum problems. Consequently, compared with other machine learning methods such as artificial neural networks (Adriana, 2020; Jing, 2013), SVM offers better generalization ability. Within a few years of its introduction, SVM has shown excellent performance in many real-world predictive data mining applications, such as text categorization (Liu, 2020), time series prediction (Chen, 2012), pattern recognition (Tang, 2020) and image processing (Lo, 2012).
However, the computational complexity of SVM in the training stage is expensive, i.e., O(m³), where m is the total number of training samples. To overcome this problem, many improved algorithms for reducing the computational complexity of SVM have been presented, such as the chunking algorithm (Cortes, 1995), the decomposition algorithm (Osuna, 1997) and sequential minimal optimization (SMO) (Platt, 1999). On the other hand, many researchers have proposed variant algorithms based on the standard SVM. For example, in 2006, Mangasarian et al. (Mangasarian, 2006) proposed a nonparallel plane classifier for binary data classification, named the generalized eigenvalue proximal support vector machine (GEPSVM). The essence of GEPSVM is to seek two nonparallel planes such that the data points of each class are proximal to one of them. GEPSVM trains quickly because it solves two generalized eigenvalue problems of the order of the input space dimension, but its classification accuracy is low. In 2007, Jayadeva et al. (Jayadeva, 2007) proposed a new machine learning method called the twin support vector machine (TWSVM) for binary classification in the spirit of GEPSVM. TWSVM generates two nonparallel planes such that each plane is closer to one of the two classes and as far as possible from the other. In TWSVM, a pair of smaller-sized quadratic programming problems (QPPs) is solved instead of the single large one in SVM, which makes TWSVM approximately four times faster than the traditional SVM. At present, TWSVM has become a popular method because of its low computational complexity, and many variants of it have been proposed by Wang (2013), Shao (2013) and Peng (2013). TWSVM is therefore well suited to classification problems.
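The "approximately four times faster" figure follows from the cubic cost of QPP solvers: splitting one QPP over m constraints into two QPPs over m/2 constraints each divides the work by four. A minimal sketch of this back-of-the-envelope arithmetic (the cubic cost model is the standard worst-case assumption, not something stated in this article):

```python
# Hypothetical cost comparison between SVM and TWSVM, assuming a
# QPP solver whose cost grows as n**3 in the number of constraints.

def qpp_cost(n):
    """Assumed solver cost for a QPP with n constraints (cubic model)."""
    return n ** 3

m = 1000                            # total number of training samples
svm_cost = qpp_cost(m)              # SVM: one large QPP over all m samples
twsvm_cost = 2 * qpp_cost(m // 2)   # TWSVM: two QPPs, each over ~m/2 samples

print(svm_cost / twsvm_cost)        # → 4.0
```

The ratio m³ / (2 · (m/2)³) = 4 holds for any even m under this model, which is why the speedup is quoted as a constant factor rather than an asymptotic improvement.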