Support Vector Machines Explained

A Step-by-Step Introduction to SVMs

Figure: Support Vector Machine hard-margin classifier showing the optimal separating hyperplane, margin boundaries H1 and H2, the support vectors, and the weight vector w.
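As a taste of what the figure depicts, here is a minimal NumPy sketch (illustrative only, not code from the tutorial; the weight vector and bias are assumed example values): a hard-margin SVM classifies a point x by the sign of w.x + b, the margin boundaries H1 and H2 satisfy w.x + b = +1 and w.x + b = -1, and the width of the margin between them is 2/||w||.

```python
import numpy as np

# Assumed example parameters of a trained hard-margin SVM in 2-D.
w = np.array([2.0, 0.0])   # weight vector, normal to the hyperplane
b = -1.0                   # bias term

def classify(x):
    """Predict the class of x by which side of the hyperplane it lies on."""
    return int(np.sign(np.dot(w, x) + b))

# A point past H1 (w.x + b >= +1) and a point past H2 (w.x + b <= -1).
print(classify(np.array([1.5, 0.0])))   # 1
print(classify(np.array([0.0, 0.0])))   # -1

# Width of the margin between H1 and H2.
print(2 / np.linalg.norm(w))            # 1.0
```

Maximising this margin width, equivalently minimising ||w||, is what the optimisation in the Theory section solves.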

This tutorial paper has been written to make Support Vector Machines (SVMs) as simple to understand as possible for those with minimal experience of Machine Learning. It assumes basic mathematical knowledge in areas such as calculus, vector geometry and Lagrange multipliers.

The document is split into Theory and Application sections so that, once the maths has been dealt with, it is clear how to actually apply SVMs to the different forms of problem that each section centres on.

What the Tutorial Covers

  • Hard-margin and soft-margin classification
  • The primal and dual formulations
  • Kernel functions and the kernel trick
  • Multi-class SVMs
  • SVM regression
  • Practical guidance on applying SVMs to real data

Figure: The kernel trick: non-linearly separable data in two dimensions (left) mapped to a linearly separable representation in a higher-dimensional feature space (right).
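The idea behind the figure can be shown in a few lines of NumPy (an illustrative sketch, not code from the tutorial): for the degree-2 polynomial kernel k(x, z) = (x.z)^2 there is an explicit feature map phi into a higher-dimensional space such that k(x, z) = phi(x).phi(z), so the feature-space inner product is obtained at input-space cost without ever computing phi.

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for 2-D input: (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def poly_kernel(x, z):
    """Degree-2 polynomial kernel evaluated directly in input space."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Both routes give the same number: the kernel is an inner product
# in the 3-D feature space, computed without mapping the points there.
print(np.dot(phi(x), phi(z)))   # 16.0
print(poly_kernel(x, z))        # 16.0
```

This is why, in the dual formulation covered in the tutorial, every dot product between data points can be replaced by a kernel evaluation, letting a linear classifier in feature space separate data that is non-linearly separable in input space.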

Download the full tutorial (PDF)

Written by Dr Tristan Fletcher. See also the companion tutorials on Relevance Vector Machines and the Kalman Filter, or browse all ML tutorials.