Commit f7b9686a authored by Fabian Pedregosa

More doc for the svm module.

Added a little reference for OneClassSVM and some examples.

From: Fabian Pedregosa <fabian.pedregosa@inria.fr>

git-svn-id: https://scikit-learn.svn.sourceforge.net/svnroot/scikit-learn/trunk@563 22fbfee3-77ab-4535-9bad-27d1bd3bc7d8
parent b3f4268f
@@ -80,16 +80,14 @@ data close to the model prediction.
 Distribution estimation
 =======================
 
-One-class SVM was proposed by Scholkopf et al. (2001) for estimating
-the support of a high-dimensional distribution. Given training vectors
-:math:`x_i \in \mathbb{R}^n, i=1, .., l` without any class
-information, the primal form is:
+One-class SVM is used for outlier detection, that is, given a set of
+samples, it will detect the soft boundary of that set.
 
-.. math:: \min_{w, b, \xi} {1 \over 2} w^T w - \rho + {1 \over \nu l} \sum_{i=1}^l \xi_i
-
-    \textrm{subject to } w^T \phi(x_i) \geq \rho - \xi_i
-
-    \xi_i \geq 0, i=1,...,l
+.. literalinclude:: ../../examples/plot_svm_oneclass.py
+
+.. image:: svm_data/oneclass.png
+
+.. autoclass:: scikits.learn.svm.OneClassSVM
 
 Scaling
 =======
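For readers skimming the diff, a minimal, illustrative sketch of how the OneClassSVM estimator documented above might be used; the data, the nu value, and the variable names are made up for illustration and follow the API as it appears in the example added by this commit:

import numpy as np
from scikits.learn import svm

np.random.seed(0)
X_train = np.random.randn(100, 2)           # unlabeled 2-D training samples

# nu roughly controls the fraction of points allowed to fall outside the boundary
clf = svm.OneClassSVM(nu=0.1)
clf.fit(X_train, [0] * 100)                 # placeholder labels, as in the added example

# one point near the training mass, one far away from it
X_new = np.array([[0.0, 0.1], [4.0, 4.0]])
labels = clf.predict(X_new)                 # per-point inside/outside decision, as used for the plot below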
doc/modules/svm_data/oneclass.png (new image, 28.2 KiB)

doc/modules/svm_data/separating_hyperplane_2D.png (replaced, 24.4 KiB → 38.5 KiB)

doc/modules/svm_data/separating_nonlinear.png (replaced, 66.5 KiB → 68 KiB)
@@ -13,12 +13,22 @@ clf.fit(X, Y)
 
 # get the separating hyperplane
 w = np.dot(clf.coef_[0], clf.support_)
-xx = np.linspace(-2, 2)
-yy = (clf.rho_[0] - w[0]*xx)/w[1]
+a = -w[0]/w[1]
+xx = np.linspace(-5, 5)
+yy = a*xx + (clf.rho_[0])/w[1]
+
+# plot the parallels to the separating hyperplane that pass through the
+# support vectors
+b = clf.support_[0]
+yy_down = a*xx + (b[1] - a*b[0])
+b = clf.support_[-1]
+yy_up = a*xx + (b[1] - a*b[0])
 
 # plot the line, the points, and the nearest vectors to the plane
 pl.set_cmap(pl.cm.Paired)
-pl.plot(xx, yy, 'k--')
+pl.plot(xx, yy, 'k-')
+pl.plot(xx, yy_down, 'k--')
+pl.plot(xx, yy_up, 'k--')
 pl.scatter(X[:,0], X[:,1], c=Y)
 pl.scatter(clf.support_[:,0], clf.support_[:,1], marker='+')
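A short aside on the geometry the new lines implement (standard linear-SVM algebra, not part of the commit text): the separating line satisfies w_0 x + w_1 y = \rho, that is

    y = -\frac{w_0}{w_1} x + \frac{\rho}{w_1},

which is exactly `yy = a*xx + clf.rho_[0]/w[1]` with `a = -w[0]/w[1]`. A parallel line with the same slope passing through a support vector b = (b_0, b_1) is

    y = a x + (b_1 - a b_0),

which is what `yy_down` and `yy_up` compute for the first and last support vectors, giving the two margin lines drawn as dashes.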
@@ -3,7 +3,8 @@ import numpy as np
 import pylab as pl
 from scikits.learn import svm
 
-xx, yy = np.meshgrid(np.linspace(-5, 5), np.linspace(-5, 5))
+xx, yy = np.meshgrid(np.linspace(-5, 5, 500), np.linspace(-5, 5, 500))
+np.random.seed(0)
 
 X = np.random.randn(300, 2)
 Y = np.logical_xor(X[:,0]>0, X[:,1]>0)

examples/plot_svm_oneclass.py (new file)
import numpy as np
import pylab as pl
from scikits.learn import svm
xx, yy = np.meshgrid(np.linspace(-5, 5, 500), np.linspace(-5, 5, 500))
X = np.random.randn(100, 2)
Y = [0]*100
# fit the model
clf = svm.OneClassSVM(nu=0.5)
clf.fit(X, Y)
# plot the line, the points, and the nearest vectors to the plane
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])
Z = Z.reshape(xx.shape)
pl.set_cmap(pl.cm.Paired)
pl.pcolormesh(xx, yy, Z)
pl.scatter(X[:,0], X[:,1], c=Y)
pl.scatter(clf.support_[:,0], clf.support_[:,1], c='black')
pl.axis('tight')
pl.show()
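As background for the new example (not part of the commit message): following Schölkopf et al. (2001), nu is an upper bound on the fraction of training points treated as outliers and a lower bound on the fraction of support vectors, so with nu=0.5 roughly half of the 100 points may end up outside the estimated region.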