This is now an inactive research group; its members have moved on. You can find them at their new research groups:

- CSPC - http://cspc.ecs.soton.ac.uk
- EEE - http://eee.ecs.soton.ac.uk

## Support Vector Machines

The foundations of Support Vector Machines (SVMs) were developed by Vapnik [1], and SVMs are gaining popularity due to many attractive features and promising empirical performance. The formulation embodies the Structural Risk Minimisation (SRM) principle, as opposed to the Empirical Risk Minimisation (ERM) approach commonly employed within statistical learning methods. SRM minimises an upper bound on the generalisation error, whereas ERM minimises the error on the training data. It is this difference which equips SVMs with a greater potential to generalise, which is our goal in statistical learning. The SVM can be applied to both classification and regression problems.
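The two uses mentioned above can be illustrated briefly. This is a minimal sketch using scikit-learn (an assumption on our part; it is not the toolbox described on this page), showing an SVM fitted to a toy classification problem and a toy regression problem:

```python
import numpy as np
from sklearn.svm import SVC, SVR

# Classification: two well-separated clusters of points.
Xc = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]])
yc = np.array([0, 0, 0, 1, 1, 1])
clf = SVC(kernel='rbf').fit(Xc, yc)
print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))  # one point near each cluster

# Regression: a noisy sine curve.
rng = np.random.default_rng(0)
Xr = np.sort(rng.uniform(0, 5, 40))[:, None]
yr = np.sin(Xr).ravel() + 0.1 * rng.normal(size=40)
reg = SVR(kernel='rbf', C=10.0).fit(Xr, yr)
print(reg.predict([[1.5]]))  # prediction near sin(1.5)
```

The kernel, `C`, and data here are illustrative choices only; in practice these would be selected for the problem at hand.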
## Sparse Kernel Modelling

## Tutorials

- A technical report on Support Vector Machines for Classification and Regression.
- Learning, Approximation and Networks course at MIT.
- A Tutorial on Support Vector Machines for Pattern Recognition by C. Burges.
- A book, Support Vector Machines by Nello Cristianini.
## Research Links

## Publications

- A comprehensive bibliography of SVM papers is maintained by Alex Smola and Bernhard Schölkopf.
## Software

### MATLAB Support Vector Machine Toolbox

The toolbox provides routines for support vector classification and support vector regression. A GUI is included which allows the visualisation of simple classification and regression problems. (The MATLAB Optimisation Toolbox, or an alternative quadratic programming routine, is required.) The toolbox can be downloaded here. Documentation can be found in [2].
## References

[1] V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York, 1995.