diff --git a/doc/conf.py b/doc/conf.py
index ef1dbf10b48b1b6eb968d927c7469b09521de316..e92d1c4fde53acd7a7c109e168d9b44d7721c653 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -25,6 +25,8 @@ sys.path.insert(0, os.path.abspath('sphinxext'))
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.autosummary', 'numpydoc', 'sphinx.ext.pngmath',
               'gen_rst']
 
+autosummary_generate = True
+
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']
 
diff --git a/doc/index.rst b/doc/index.rst
index 07b486a5fc5ed5fd8ba3f463e0c92f5d93a5c9b5..2326a33fbdf22d41889ed29191531acf7cd9e562 100644
--- a/doc/index.rst
+++ b/doc/index.rst
@@ -47,7 +47,7 @@ Scikits.learn: machine learning in Python
 .. raw:: html
 
     <small>
-    <img src="_images/plot_digits_classification.png" align="right" style="width: 400px"/>
+    <a href="auto_examples/plot_digits_classification.html"><img src="_images/plot_digits_classification.png" align="right" style="width: 400px"/></a>
 
 :ref:`A simple Example: recognizing hand-written digits <example_plot_digits_classification.py>` ::
 
@@ -118,5 +118,9 @@ User guide
    supervised_learning
    unsupervised_learning
    model_selection
+
+.. toctree::
+   :maxdepth: 2
+
    auto_examples/index
    developers/index
diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
new file mode 100644
index 0000000000000000000000000000000000000000..9a028b2a66720e5aca9ab47693e95507027810e4
--- /dev/null
+++ b/doc/modules/classes.rst
@@ -0,0 +1,13 @@
+SVM
+===
+
+.. currentmodule:: scikits.learn
+
+.. autosummary::
+   :toctree: generated/
+
+   svm.SVC
+   svm.LinearSVC
+   svm.NuSVC
+   svm.SVR
+   svm.NuSVR
diff --git a/doc/modules/svm.rst b/doc/modules/svm.rst
index ff6f0c8f95b4236771094be8bb6932918c34ce7a..cf96d04a804fd01313c5275ad286702218179fc7 100644
--- a/doc/modules/svm.rst
+++ b/doc/modules/svm.rst
@@ -2,12 +2,14 @@
 Support Vector Machines
 =======================
 
+.. currentmodule:: scikits.learn.svm
+
 **Support vector machines (SVMs)** are a set of supervised learning
 methods used for classification, regression and outlier detection.
 
 The advantages of Support Vector Machines are:
 
-    - Effective high dimensional spaces.
+    - Effective in high dimensional spaces.
 
     - Still effective in cases where number of dimensions is greater
       than the number of samples.
@@ -36,41 +38,79 @@ Classification
 
 Suppose some given data points each belong to one of N classes, and
 the goal is to decide which class a new data point will be in. The
-classes that permform this task are SVC, NuSVC and LinearSVC.
+classes that perform this task are :class:`SVC`, :class:`NuSVC` and
+:class:`LinearSVC`.
 
-SVC and NuSVC are similar methods, but accept slightly different set
-of parameters and have different mathematical formulations (see
-section :ref:`svm_mathematical_formulation`). On the other hand,
-LinearSVC is another implemntation of SVC optimized in the case of a
-linear kernel. Note that other classes this one does not accept
-keyword 'kernel', as this is assumed to be linear. It also lacks some
-of the memebrs of SVC and NuSVC, like support\_.
+:class:`SVC` and :class:`NuSVC` are similar methods, but accept
+slightly different sets of parameters and have different mathematical
+formulations (see section :ref:`svm_mathematical_formulation`). On the
+other hand, :class:`LinearSVC` is another implementation of SVC
+optimized for the case of a linear kernel. Note that :class:`LinearSVC`
+does not accept the keyword 'kernel', as the kernel is assumed to be
+linear. It also lacks some of the members of SVC and NuSVC, like
+support\_.
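A minimal usage sketch of the difference described above. The import path here is an assumption (the modern package name `sklearn` is used for illustration; in this tree the module is `scikits.learn.svm`, with the same class names), and the data is made up:

```python
# Hypothetical sketch: SVC accepts a 'kernel' keyword and exposes
# support_ after fitting; LinearSVC accepts no 'kernel' keyword and
# has no support_ attribute.
from sklearn.svm import SVC, LinearSVC

X = [[0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [0.0, 2.0]]
y = [0, 1, 1, 1]

clf = SVC(kernel='linear')   # kernel is selectable here
clf.fit(X, y)

lin = LinearSVC()            # kernel is always linear here
lin.fit(X, y)
```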
 
 
 .. figure:: ../auto_examples/svm/images/plot_iris.png
+   :target: ../auto_examples/svm/plot_iris.html
+   :align: center
+
+
+
+
+Regression
+==========
+
+The method of Support Vector Classification can be extended to solve
+regression problems. This method is called Support Vector Regression.
+
+The model produced by support vector classification (as described
+above) depends only on a subset of the training data, because the cost
+function for building the model does not care about training points
+that lie beyond the margin. Analogously, the model produced by Support
+Vector Regression depends only on a subset of the training data,
+because the cost function for building the model ignores any training
+data close to the model prediction.
+
+
+There are two flavours of Support Vector Regression: :class:`SVR` and
+:class:`NuSVR`.
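A short regression sketch, with the same caveat as elsewhere in this document: the modern `sklearn` import path and the toy data are assumptions for illustration (here the classes live in `scikits.learn.svm`):

```python
# Fit a linear SVR to a roughly linear 1-D target and predict an
# intermediate point; numbers are illustrative only.
from sklearn.svm import SVR

X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.0, 0.8, 1.9, 3.1]

reg = SVR(kernel='linear', C=10.0)
reg.fit(X, y)
pred = reg.predict([[1.5]])   # falls between neighbouring targets
```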
+
+
+Density estimation
+==================
+
+One-class SVM is used for outlier detection, that is, given a set of
+samples, it will detect the soft boundary of that set so as to
+classify new points as belonging to that set or not. The class that
+implements this is called :class:`OneClassSVM`.
+
+
+.. note::
+
+    For a complete example of one-class SVM, see the
+    :ref:`example_svm_plot_oneclass.py` example.
+
+.. figure:: ../auto_examples/svm/images/plot_oneclass.png
+   :target: ../auto_examples/svm/plot_oneclass.html
    :align: center
    :scale: 50
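The behaviour above can be sketched as follows. The `sklearn` import path and all parameter values are assumptions for illustration (here the class is `scikits.learn.svm.OneClassSVM`):

```python
# Train a one-class model on points clustered near the origin, then
# test a point inside the cluster and one far outside it.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = 0.3 * rng.randn(100, 2)   # samples clustered around the origin

clf = OneClassSVM(nu=0.1, kernel='rbf', gamma=0.1)
clf.fit(X_train)

inlier = clf.predict([[0.0, 0.0]])[0]    # +1: inside the boundary
outlier = clf.predict([[5.0, 5.0]])[0]   # -1: outside the boundary
```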
 
 
-**Complete class reference:**
+Examples
+========
 
-.. autoclass:: scikits.learn.svm.SVC
-   :members:
-   :inherited-members:
+See :ref:`svm_examples` for a complete list of examples.
 
-.. autoclass:: scikits.learn.svm.NuSVC
-   :members:
-   :inherited-members:
 
+For a complete example of custom kernels, see
+:ref:`example_svm_plot_custom_kernel.py`.
 
-The following class 
-.. autoclass:: scikits.learn.svm.LinearSVC
-   :members:
-   :inherited-members:
 
 
 Using Custom Kernels
---------------------
+====================
 
 .. TODO: this is not restricted to classification
 
@@ -101,69 +141,6 @@ classifiers, except that:
       between the use of fit() and predict() you will have
       unexpected results.
 
-.. note::
-
-Examples
---------
-
-For a complete example of custom kernels see
-:ref:`example_svm_plot_custom_kernel.py`. 
-
-.. TODO: precomputed kernels.
-
-Regression
-==========
-
-The method of Support Vector Classification can be extended to solve
-the regression problem. This method is called Support Vector
-Regression.
-
-The model produced by support vector classification (as described
-above) depends only on a subset of the training data, because the cost
-function for building the model does not care about training points
-that lie beyond the margin. Analogously, the model produced by Support
-Vector Regression depends only on a subset of the training data,
-because the cost function for building the model ignores any training
-data close to the model prediction.
-
-____
-
-.. autoclass:: scikits.learn.svm.SVR
-   :members:
-   :inherited-members:
-
-.. autoclass:: scikits.learn.svm.NuSVR
-   :members:
-   :inherited-members:
-
-Density estimation
-=======================
-
-One-class SVM is used for outliers detection, that is, given a set of
-samples, it will detect the soft boundary of that set so as to classify
-new points as belonging to that set or not.
-
-____
-
-.. autoclass:: scikits.learn.svm.OneClassSVM
-   :members:
-   :inherited-members:
-
-.. note::
-
-    For a complete example on one class SVM see 
-    :ref:`example_svm_plot_svm_oneclass.py` example.
-
-.. figure:: ../auto_examples/svm/images/plot_svm_oneclass.png
-   :align: center
-   :scale: 50
-
-
-Examples
-========
-
-See :ref:`svm_examples` for a complete list of examples.
-
 
 Support Vector machines for sparse data
 =======================================
@@ -172,20 +149,22 @@ There is support for sparse data given in any matrix in a format
 supported by scipy.sparse. See module scikits.learn.sparse.svm.
 
 
+
+
 Tips on Practical Use
 =====================
 
   * Support Vector Machine algorithms are not scale-invariant, so it
-  is highly recommended to scale your data. For example, scale each
-  attribute on the input vector X to [0,1] or [-1,+1], or standarize
-  it to have mean 0 and variance 1. Note that the *same* scaling must
-  be applied to the test vector to obtain meaningful results. See `The
-  CookBook
-  <https://sourceforge.net/apps/trac/scikit-learn/wiki/CookBook>`_ for
-  some examples on scaling.
+    is highly recommended to scale your data. For example, scale each
+    attribute of the input vector X to [0,1] or [-1,+1], or standardize
+    it to have mean 0 and variance 1. Note that the *same* scaling
+    must be applied to the test vector to obtain meaningful
+    results. See `The CookBook
+    <https://sourceforge.net/apps/trac/scikit-learn/wiki/CookBook>`_
+    for some examples on scaling.
 
   * nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of
-  training errors and support vectors.
+    training errors and support vectors.
 
   * If data for classification are unbalanced (e.g. many positive and
     few negative), try different penalty parameters C.
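The scaling advice in the first tip can be sketched in a few lines of plain NumPy (the numbers are illustrative only):

```python
import numpy as np

# Minimal min-max scaling sketch: map each attribute of the training
# data to [0, 1], then reuse the *same* training statistics on the
# test data rather than rescaling it independently.
X_train = np.array([[1.0, 200.0],
                    [2.0, 300.0],
                    [3.0, 400.0]])
X_test = np.array([[2.0, 250.0]])

mins = X_train.min(axis=0)
spans = X_train.max(axis=0) - mins

X_train_scaled = (X_train - mins) / spans
X_test_scaled = (X_test - mins) / spans   # same transform as training
```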
diff --git a/doc/modules/svm_data/example_plot.png b/doc/modules/svm_data/example_plot.png
deleted file mode 100644
index 99a87d6a83b284dc87e325cc82349c1c05b206f4..0000000000000000000000000000000000000000
Binary files a/doc/modules/svm_data/example_plot.png and /dev/null differ
diff --git a/doc/modules/svm_data/oneclass.png b/doc/modules/svm_data/oneclass.png
deleted file mode 100644
index 4b621507ccee0c6ab0c19f80ef677d9abbe2ef45..0000000000000000000000000000000000000000
Binary files a/doc/modules/svm_data/oneclass.png and /dev/null differ
diff --git a/doc/modules/svm_data/separating_hyperplane_2D.png b/doc/modules/svm_data/separating_hyperplane_2D.png
deleted file mode 100644
index 128280ac2a78cc5b7301f122645a9e345d6ff552..0000000000000000000000000000000000000000
Binary files a/doc/modules/svm_data/separating_hyperplane_2D.png and /dev/null differ
diff --git a/doc/modules/svm_data/separating_nonlinear.png b/doc/modules/svm_data/separating_nonlinear.png
deleted file mode 100644
index bb4cf3b579b8e433f96e46076e0c224481aff9f2..0000000000000000000000000000000000000000
Binary files a/doc/modules/svm_data/separating_nonlinear.png and /dev/null differ
diff --git a/examples/svm/README.txt b/examples/svm/README.txt
index 3b7dce4ff19839634f95626f9cf029a269a19863..9c83e641b5b682049eba33af979d7e89d08a0e44 100644
--- a/examples/svm/README.txt
+++ b/examples/svm/README.txt
@@ -1,7 +1,7 @@
 .. _svm_examples:
 
-SVM: support vector machine
-------------------------------
+Support Vector Machines
+-----------------------
 
 Examples concerning the `scikits.learn.svm` package.