Administrative Information
Title | Neural Networks |
Duration | 60 min |
Module | A |
Lesson Type | Lecture |
Focus | Practical - AI Modelling |
Topic | AI Modelling |
Keywords
Neural network, backpropagation, optimization
Learning Goals
- Learners understand the MLP architecture
- Learners can construct neural networks with fully connected layers and various activation functions (see the sketch after this list)
- Learners understand the basic idea of backpropagation and gradient-based methods
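The second goal can be made concrete with a short sketch like the one below: a fully connected layer stacked into a small MLP with ReLU and sigmoid activations, written in NumPy. The class name, layer sizes, and initialization scheme are illustrative assumptions, not part of the official lesson materials.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DenseLayer:
    """Fully connected layer: y = activation(x @ W + b)."""
    def __init__(self, n_in, n_out, activation, rng):
        self.W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        self.b = np.zeros(n_out)
        self.activation = activation

    def forward(self, x):
        return self.activation(x @ self.W + self.b)

rng = np.random.default_rng(0)
# 4 inputs -> 8 hidden units (ReLU) -> 1 output (sigmoid); sizes are arbitrary.
mlp = [DenseLayer(4, 8, relu, rng), DenseLayer(8, 1, sigmoid, rng)]

x = rng.normal(size=(5, 4))        # a batch of 5 example inputs
for layer in mlp:
    x = layer.forward(x)
print(x.shape)                      # (5, 1): one probability-like output per example
```

Swapping the activation argument (e.g. tanh instead of relu) is all that is needed to experiment with the different activation functions covered later in the outline.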
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
- Review of linear algebra and vector calculus.
Optional for Students
None.
References and background for students
Recommended for Teachers
- Familiarize themselves with the demonstration materials.
Lesson materials
Instructions for Teachers
Cover the topics in the lesson outline below and demonstrate the concepts using the interactive notebooks (the shape of the loss function under different regularizers, gradient-based optimization algorithms). Give a brief overview of the notebook code.
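One possible shape for that demonstration is sketched below: how L1 and L2 penalties change a simple squared-error loss in a single weight and where plain gradient descent ends up on each. The helper names, the regularization strength of 0.5, and the learning rate are assumptions made for illustration; this is not the actual notebook shipped with the lesson.

```python
import numpy as np

def loss(w, target=2.0, reg="none", lam=0.5):
    base = 0.5 * (w - target) ** 2            # squared error in a single weight w
    if reg == "l2":
        return base + 0.5 * lam * w ** 2       # ridge penalty: smooth, shifted minimum
    if reg == "l1":
        return base + lam * abs(w)             # lasso penalty: kink at w = 0
    return base

def grad(w, target=2.0, reg="none", lam=0.5):
    g = w - target
    if reg == "l2":
        g += lam * w
    if reg == "l1":
        g += lam * np.sign(w)
    return g

for reg in ("none", "l2", "l1"):
    w, lr = 5.0, 0.1                           # starting point and learning rate
    for _ in range(200):                       # plain gradient descent
        w -= lr * grad(w, reg=reg)
    print(f"{reg:>4}: w = {w:.3f}, loss = {loss(w, reg=reg):.3f}")
```

Both penalties pull the unregularized minimum at w = 2 towards zero, but in different ways: the L2 term keeps the loss a smooth bowl, while the L1 term adds a kink at w = 0. The same loop can be rerun with other update rules (momentum, Adam) to compare gradient-based optimizers.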
Outline/time schedule
Duration (min) | Description | Concepts |
---|---|---|
5 | From logistic regression to perceptron | input, weights, bias, sigmoid function |
10 | Multilayer perceptron and matrix multiplications | input layer, hidden layer, output layer |
20 | Derivation of the backpropagation scheme | gradient descent, learning rate, backpropagation |
10 | Activation functions | ReLU, sigmoid, tanh, softmax, etc. |
10 | Loss functions for classification and regression | MSE, binary and categorical cross-entropy |
5 | Demonstration | interactive notebooks |
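As a companion to the backpropagation, loss-function, and demonstration rows above, here is a compact sketch of the full training loop for a one-hidden-layer MLP with a sigmoid output and binary cross-entropy. The toy XOR dataset, hidden size, learning rate, and step count are assumptions chosen only for illustration, not the lesson's own demonstration code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # toy XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)      # input -> hidden
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)      # hidden -> output
lr = 0.5                                                      # learning rate

for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)                       # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output probabilities

    # backward pass: apply the chain rule layer by layer
    dz2 = (p - y) / len(X)                         # dL/dz2 for sigmoid + cross-entropy
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)            # propagate through tanh derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

    # gradient descent update
    W2 -= lr * dW2
    b2 -= lr * db2
    W1 -= lr * dW1
    b1 -= lr * db1

print(np.round(p, 2))   # should approach [[0], [1], [1], [0]]
```

The key step the derivation builds towards is visible in the backward pass: because the sigmoid output is paired with cross-entropy, the error at the output pre-activation reduces to p - y, and the chain rule then propagates it through the tanh hidden layer.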
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant No. CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.