2 Background Theory
The basic idea of Support Vector Machine classification is to find an optimal
maximal-margin separating hyperplane between two classes. Support Vector
Machines use an implicit nonlinear mapping from input space to a higher-dimensional
feature space by means of kernel functions, in order to find a separating hyperplane
for problems which are not linearly separable in input space [7, 18]. Classifying multiple
classes is commonly performed by combining several binary SVM classifiers
in a tournament manner, either one-against-all or one-against-one, the latter
approach requiring substantially more computational effort [11].
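To illustrate the one-against-one scheme, the following sketch trains a binary
classifier for every pair of classes and combines the predictions by majority
voting. scikit-learn's SVC and the RBF kernel are assumptions made here for
illustration only, not the setup of the cited works.

```python
from itertools import combinations
import numpy as np
from sklearn.svm import SVC  # binary soft-margin SVM with kernel support

def one_vs_one_predict(X_train, y_train, X_test):
    """One-against-one multiclass prediction by voting over pairwise binary SVMs."""
    classes = np.unique(y_train)
    votes = np.zeros((len(X_test), len(classes)), dtype=int)
    for i, j in combinations(range(len(classes)), 2):
        # Train on the points belonging to this pair of classes only.
        mask = np.isin(y_train, [classes[i], classes[j]])
        clf = SVC(kernel="rbf").fit(X_train[mask], y_train[mask])
        # Each test point casts one vote per pairwise classifier.
        for k, c in enumerate(clf.predict(X_test)):
            votes[k, i if c == classes[i] else j] += 1
    return classes[votes.argmax(axis=1)]
```

One-against-all would instead train one classifier per class against all
remaining points and predict the class with the largest decision value.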
The standard binary SVM classification problem with a soft margin (allowing
some errors) is shown visually in Fig. 1(a). Intuitively, the problem is to maximize
the margin between the solid planes while at the same time permitting as few errors
as possible, an error being a positive-class point on the negative side of the solid
line, or vice versa.
[Figure: (a) Standard SVM classifier; (b) Proximal SVM classifier. Both panels show the two point classes A+ and A-, the separating plane $x^{\top}w = \gamma$, the parallel planes $x^{\top}w = \gamma + 1$ and $x^{\top}w = \gamma - 1$, and the margin of width $2/\|w\|$; in (a) the parallel planes bound the classes, in (b) the classes cluster around them.]
Fig. 1. SVM and PSVM
The standard SVM problem can be stated as a quadratic optimization prob-
lem with constraints, as shown in (1).
$$
\begin{aligned}
\min_{(w,\gamma,y)\,\in\,\mathbb{R}^{n+1+m}}\quad & \nu\, e^{\top} y + \tfrac{1}{2}\, w^{\top} w \\
\text{s.t.}\quad & D(Aw - e\gamma) + y \ge e \\
& y \ge 0
\end{aligned}
\qquad (1)
$$

where $A \in \mathbb{R}^{m \times n}$ is the matrix of $m$ training points, $D$ is an $m \times m$ diagonal matrix carrying the class labels $\pm 1$ on its diagonal, $e$ is an $m \times 1$ vector of ones, and $\nu > 0$ weights the error term $e^{\top} y$ against the margin term.
Fung and Mangasarian [8] replaced the inequality constraint in (1) with an
equality constraint. This changed the binary classification problem, because the
points in Fig. 1(b) are no longer bounded by the planes, but are clustered around
them.
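In [8] the error vector is in addition penalized quadratically and $\gamma$ is
included in the regularized term, so that, with the equality constraint, $y$
can be eliminated altogether and the minimizer follows from a single linear
system of size $(n+1) \times (n+1)$. A minimal sketch under those assumptions:

```python
import numpy as np

def train_psvm(A, d, nu=1.0):
    """Proximal SVM: solve (I/nu + E'E) z = E'd for z = (w, gamma), E = [A  -e]."""
    m, n = A.shape
    E = np.hstack([A, -np.ones((m, 1))])  # augmented data matrix [A  -e]
    rhs = E.T @ d                          # equals E'De, since De = d
    z = np.linalg.solve(np.eye(n + 1) / nu + E.T @ E, rhs)
    return z[:n], z[n]                     # (w, gamma)
```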