Administrative Information
| Title | Lecture: SVMs and Kernels |
|---|---|
| Duration | 60 min |
| Module | A |
| Lesson Type | Lecture |
| Focus | Practical - AI Modelling |
| Topic | AI Modelling |
Keywords
maximum margin classifier, support vector, kernel trick
Learning Goals
- To know what a support vector is and how to identify it in a feature space
- To know how a linear SVM works
- To understand the concept of a kernel function in the context of SVMs
- To know how the kernel trick allows a linear SVM to perform non-linear classification
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
- Review of analytic geometry (e.g. distance of a point to a plane).
Optional for Students
None.
References and background for students
- Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning, Chapter 7.
Recommended for Teachers
- Familiarize themselves with the demonstration material.
Lesson materials
Instructions for Teachers
Cover the topics in the lesson outline and use the interactive notebooks to demonstrate the effect of the complexity parameter and of the RBF kernel parameter. Show an example of underfitting, and give a brief overview of the code.
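The effect of the complexity parameter can also be sketched without the notebooks. The snippet below is a minimal, illustrative linear soft-margin SVM trained by Pegasos-style subgradient descent in pure NumPy (an assumption for illustration, not the lesson's prescribed demo code); the regularisation strength `lam` plays the role of 1/C, so stronger regularisation yields a smaller weight norm, i.e. a wider margin and a simpler model.

```python
import numpy as np

# Minimal linear soft-margin SVM via Pegasos-style subgradient descent on
#   (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w . x_i)).
# The bias term is omitted for brevity (the toy data is symmetric about
# the origin). Here lam acts like 1/C: large lam = strong regularisation.
def train_svm(X, y, lam, epochs=2000):
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, epochs + 1):
        eta = 1.0 / (lam * t)                # decreasing step size
        viol = y * (X @ w) < 1               # points violating the margin
        grad = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        w -= eta * grad
    return w

# Two linearly separable Gaussian blobs, labelled -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

w_simple = train_svm(X, y, lam=10.0)    # strong regularisation (small C)
w_complex = train_svm(X, y, lam=0.01)   # weak regularisation (large C)

# Stronger regularisation gives a smaller ||w||, hence a wider margin.
print(np.linalg.norm(w_simple) < np.linalg.norm(w_complex))
```

In class, the same comparison is what varying C in the notebooks shows: very strong regularisation eventually underfits, which is the underfitting example the instructions call for.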
Outline/time schedule
| Duration (min) | Description | Concepts |
|---|---|---|
| 15 | Maximum margin classifiers | feature space, separating hyperplane, margin, support vector |
| 10 | Soft-margin formulation | slack variables, model complexity |
| 10 | Dual formulation and optimization | Lagrange multipliers, primal and dual problems |
| 10 | Support vectors and predictions | dual parameters and support vectors |
| 15 | Non-linearization and the kernel trick | kernel function |
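For the kernel-trick segment, a short numerical check makes the idea concrete: a kernel value equals an inner product in a feature space that is never constructed explicitly. The snippet below (an illustrative sketch, not part of the lesson materials) uses the degree-2 polynomial kernel on R^2, whose explicit feature map is small enough to write down.

```python
import numpy as np

# Explicit feature map for the degree-2 polynomial kernel k(x, z) = (x . z)^2
# on R^2: phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2).
def phi(x):
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

def k(x, z):
    # Kernel evaluated directly in the input space: O(d) work.
    return (x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

print(k(x, z))          # (1*3 + 2*(-1))**2 = 1.0
print(phi(x) @ phi(z))  # same value via the explicit feature map: 1.0
```

The point to stress is that k is computed in the input space, while the equality shows it behaves as an inner product in the (here 3-dimensional, for RBF infinite-dimensional) feature space; this is exactly what lets a linear SVM classify non-linearly.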
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.