
Lecture: SVMs and Kernels

Administrative Information

Title: Lecture: SVMs and Kernels
Duration: 60 min
Module: A
Lesson Type: Lecture
Focus: Practical - AI Modelling
Topic: AI Modelling

Keywords

maximum margin classifier, support vector, kernel trick

Learning Goals

Expected Preparation

Learning Events to be Completed Before

Obligatory for Students

  • Review of analytic geometry (e.g. distance of a point to a plane).
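
The result from analytic geometry reused most directly in the lecture is the distance of a point to a hyperplane; a minimal reminder in standard notation (this formula is a prerequisite, not part of the lesson materials themselves):

    % Distance of a point x_0 to the hyperplane w^T x + b = 0:
    d(\mathbf{x}_0) = \frac{\lvert \mathbf{w}^{\top}\mathbf{x}_0 + b \rvert}{\lVert \mathbf{w} \rVert}

The margin of a linear classifier is this distance evaluated at the training points closest to the separating hyperplane.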

Optional for Students

None.

References and background for students

  • Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Springer. Chapter 7.

Recommended for Teachers

  • Familiarize yourself with the demonstration material before the lecture.

Lesson materials

Instructions for Teachers

Cover the topics in the lesson outline and use the interactive notebooks to demonstrate the effect of the complexity parameter and of the RBF kernel parameter. Show an example of underfitting and give a brief overview of the code.
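
The interactive notebooks are not reproduced here; the following is a minimal sketch of the kind of demonstration intended, assuming scikit-learn is available. The dataset and the parameter grids are illustrative choices, not taken from the lesson materials.

    # Sketch: effect of the complexity parameter C and the RBF parameter gamma
    # on an SVM classifier (illustrative dataset and parameter values).
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Small non-linearly separable toy dataset.
    X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Very small C or gamma gives an over-smooth boundary (underfitting);
    # very large values give a wiggly boundary that overfits the training set.
    for C in (0.01, 1.0, 100.0):
        for gamma in (0.01, 1.0, 100.0):
            clf = SVC(kernel="rbf", C=C, gamma=gamma).fit(X_train, y_train)
            print(f"C={C}, gamma={gamma}: "
                  f"train={clf.score(X_train, y_train):.2f}, "
                  f"test={clf.score(X_test, y_test):.2f}, "
                  f"support vectors={int(clf.n_support_.sum())}")

Plotting the decision boundary for each setting (as the notebooks presumably do) makes the underfitting and overfitting regimes visually obvious; the printed train/test accuracies tell the same story.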

Outline/time schedule

Duration (min) | Description                            | Concepts
15             | Maximum margin classifiers             | feature space, separating hyperplane, margin, support vector
10             | Soft-margin formulation                | slack variables, model complexity
10             | Dual formulation and optimization      | Lagrange multipliers, primal and dual problems
10             | Support vectors and predictions        | dual parameters and support vectors
15             | Non-linearization and the kernel trick | kernel function
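
The rows above on the soft-margin formulation, the dual problem, and the kernel trick refer to standard results; a minimal sketch in the notation of Bishop (Chapter 7), included here only as a reference for the schedule:

    % Soft-margin primal problem (slack variables xi_n, complexity parameter C):
    \min_{\mathbf{w}, b, \boldsymbol{\xi}} \; \tfrac{1}{2}\lVert \mathbf{w} \rVert^2 + C \sum_{n=1}^{N} \xi_n
    \quad \text{s.t.} \quad t_n\big(\mathbf{w}^{\top}\phi(\mathbf{x}_n) + b\big) \ge 1 - \xi_n, \;\; \xi_n \ge 0

    % Dual problem (Lagrange multipliers a_n, kernel k(x, x') = phi(x)^T phi(x')):
    \max_{\mathbf{a}} \; \sum_{n=1}^{N} a_n - \tfrac{1}{2} \sum_{n=1}^{N} \sum_{m=1}^{N} a_n a_m t_n t_m \, k(\mathbf{x}_n, \mathbf{x}_m)
    \quad \text{s.t.} \quad 0 \le a_n \le C, \;\; \sum_{n=1}^{N} a_n t_n = 0

    % Predictions depend only on the support vectors (points with a_n > 0):
    y(\mathbf{x}) = \operatorname{sign}\Big( \sum_{n \in \mathcal{S}} a_n t_n \, k(\mathbf{x}, \mathbf{x}_n) + b \Big)

Because both the dual objective and the prediction involve the data only through the kernel function, the feature map phi never has to be computed explicitly, which is the kernel trick covered in the final block.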

Acknowledgements

The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.