Administrative Information
Title | Model Fitting and Optimization |
Duration (min) | 60 |
Module | A |
Lesson Type | Lecture |
Focus | Technical - Foundations of AI |
Topic | Foundations of AI |
Keywords
logistic regression, model fitting, optimization, gradient descent, Newton's method, numerical stability
Learning Goals
- To acquire demonstrable knowledge of the logistic regression model
- To acquire demonstrable knowledge of maximum likelihood (ML) inference in non-conjugate models via gradient descent
- To acquire demonstrable knowledge of the design and implementation of gradient-based optimization algorithms
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
- Review the basics of Bayesian inference and maximum likelihood
- Review elementary vector calculus
Optional for Students
None.
References and background for students
None.
Recommended for Teachers
- Familiarize themselves with the demonstration material
Lesson materials
Instructions for Teachers
Cover the topics in the lesson outline and demonstrate the concepts using the interactive notebook (the relationship between the number of iterations, the loss value and the decision boundary; the different algorithms and the effect of the learning rate). Give a brief overview of the code.
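The notebook itself is provided with the lesson materials; as a rough, hypothetical sketch of the kind of code it might contain, the snippet below fits logistic regression with batch gradient descent on made-up 2-D data, using a numerically stable sigmoid and binary cross-entropy. All names, data and hyperparameters here are illustrative assumptions, not the actual notebook code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-dimensional binary classification data (made up for illustration)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])
X = np.hstack([X, np.ones((100, 1))])  # append a bias column

def sigmoid(z):
    # Numerically stable: never exponentiates a large positive number (avoids overflow)
    out = np.empty_like(z, dtype=float)
    pos = z >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-z[pos]))
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1.0 + ez)
    return out

def binary_cross_entropy(y, p, eps=1e-12):
    # Clip probabilities so the logarithm never sees exactly 0 or 1
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def fit_gradient_descent(X, y, learning_rate=0.1, n_iter=200):
    w = np.zeros(X.shape[1])
    for t in range(n_iter):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / len(y)  # gradient of the mean cross-entropy loss
        w -= learning_rate * grad
        if t % 50 == 0:
            print(f"iter {t:3d}  loss {binary_cross_entropy(y, p):.4f}")
    return w

w = fit_gradient_descent(X, y)
```

Varying `learning_rate` and `n_iter` in such a sketch reproduces the behaviour the demonstration is meant to show: too small a rate converges slowly, too large a rate makes the loss oscillate or diverge.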
Outline/time schedule
Duration (min) | Description | Concepts |
---|---|---|
5 | Introduction to linear classification | binary classification, decision boundary |
15 | Defining a logistic regression model | class-conditional density, sigmoid function, logistic regression |
15 | Maximum likelihood estimation | binary cross-entropy, learning rate, gradient descent |
10 | Implementation details and numerical stability | numerical stability, overflow |
10 | Advanced algorithms | Newton's method, line search |
5 | Demonstration | |
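For the "Advanced algorithms" row above, a comparable hedged sketch of Newton's method (in its IRLS form) for the same cross-entropy loss might look as follows; the function name, the clipping bounds and the small ridge constant are illustrative assumptions, and a line search is only indicated in a comment rather than implemented.

```python
import numpy as np

def newton_logreg(X, y, n_iter=10, ridge=1e-6):
    # Newton / IRLS updates: w <- w - H^{-1} g, with
    #   g = X^T (p - y) / n          (gradient of the mean cross-entropy)
    #   H = X^T diag(p(1-p)) X / n   (Hessian)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30.0, 30.0)))  # clipping keeps exp well-behaved
        g = X.T @ (p - y) / n
        W = p * (1.0 - p)
        H = (X * W[:, None]).T @ X / n + ridge * np.eye(d)  # small ridge term keeps H invertible
        w -= np.linalg.solve(H, g)  # full Newton step; a line search would scale this step
    return w
```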
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.