This is now an inactive research group; its members have moved on to new research groups.

Support Vector Machines

The foundations of Support Vector Machines (SVM) were developed by Vapnik [1], and the method is gaining popularity due to its many attractive features and promising empirical performance. The formulation embodies the Structural Risk Minimisation (SRM) principle, as opposed to the Empirical Risk Minimisation (ERM) approach commonly employed within statistical learning methods. SRM minimises an upper bound on the generalisation error, whereas ERM minimises the error on the training data. It is this difference which equips SVMs with a greater potential to generalise, which is our goal in statistical learning. The SVM can be applied to both classification and regression problems.
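To make the classification setting concrete, the sketch below trains a soft-margin linear SVM by subgradient descent on the primal hinge-loss objective. This is an illustrative sketch only: the solver style, toy data, and all parameter names are our own choices, not the group's toolbox (which solves the dual quadratic programme instead).

```python
# Minimal soft-margin linear SVM: subgradient descent on the primal
# objective  lam/2 * ||w||^2 + mean(hinge loss).  A sketch, not the
# QP-based method the toolbox implements.
def train_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:
                # Point violates the margin: hinge term contributes a gradient.
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:
                # Point is correctly classified with margin: only regularise.
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable data (hypothetical example).
X = [[2.0, 2.0], [1.5, 1.8], [-2.0, -1.0], [-1.2, -2.5]]
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
print([predict(w, b, x) for x in X])
```

For separable data like this, the learned hyperplane recovers the training labels; the `lam` term trades margin width against training error in the non-separable case.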

Publications

  • A comprehensive bibliography of SVM papers is maintained by Alex Smola and Bernhard Schölkopf.

Software

  • MATLAB Support Vector Machine Toolbox

    The toolbox provides routines for support vector classification and support vector regression. A GUI is included which allows the visualisation of simple classification and regression problems. (The MATLAB optimisation toolbox, or an alternative quadratic programming routine is required.)

    [Demo figures: Support Vector Classification; Support Vector Regression]

    The toolbox can be downloaded here. Documentation can be found in [2].
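For the regression side, the sketch below fits an epsilon-insensitive support vector regression model, again by subgradient descent on the primal. As above, this is an assumed minimal illustration with invented toy data; the toolbox itself solves the corresponding dual quadratic programme, as described in [2].

```python
# Epsilon-insensitive linear SVR by subgradient descent (a sketch).
# Errors smaller than eps incur no loss; larger errors are penalised
# linearly, which is what makes the solution sparse in support vectors.
def train_svr(X, y, eps=0.1, lam=0.01, lr=0.05, epochs=500):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = pred - yi
            if err > eps:        # above the epsilon-tube
                g = 1.0
            elif err < -eps:     # below the epsilon-tube
                g = -1.0
            else:                # inside the tube: no loss gradient
                g = 0.0
            w = [wj - lr * (lam * wj + g * xj) for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Toy 1-D data roughly following y = 2x (hypothetical example).
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.1, 2.0, 3.9, 6.1]
w, b = train_svr(X, y)
print(w[0], b)  # learned slope and intercept
```

Points that end up strictly inside the epsilon-tube contribute nothing to the gradient, which is the primal-side view of why only the support vectors determine the final regression function.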

References

[1] V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York, 1995.
[2] S. R. Gunn. Support Vector Machines for Classification and Regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, 1997.
© School of Electronics and Computer Science of the University of Southampton