Administrative Information
Title | Batch Processing |
Duration | 60 mins |
Module | B |
Lesson Type | Lecture |
Focus | Technical - Deep Learning |
Topic | Batch processing |
Keywords
Batch, Mini-Batch, Epoch
Learning Goals
- Understand the mechanisms behind batch processing and backpropagation
- Understand Gradient Descent and the differences between its Batch, Mini-Batch, and Stochastic variants, as well as the concept of an epoch (summarised in the sketch below)
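The variants named in the second goal can be summarised by a single update rule. The notation below (θ for the parameters, η for the learning rate, B for the set of examples used per step) is a standard sketch, not the lecture's own formulation.

```latex
% Gradient-descent update for parameters \theta with learning rate \eta and
% per-example loss L; the variants differ only in the set B used per step.
\theta \leftarrow \theta - \eta \, \frac{1}{|B|} \sum_{i \in B} \nabla_\theta L(x_i, y_i; \theta)
% Batch GD:      B = the whole training set of N examples
% Stochastic GD: |B| = 1 (a single example)
% Mini-Batch GD: 1 < |B| < N (a small random subset)
% One epoch = one full pass over all N training examples.
```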
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
None.
Optional for Students
None.
References and Background for Students
None.
Recommended for Teachers
None.
Lesson materials
Instructions for Teachers
See lecture material for information and example class questions.
Outline
Duration (Min) | Description |
---|---|
10 | Illustration of Gradient Descent |
10 | Recap of the loss function |
10 | Idea of and reasons for batching |
5 | Batch Gradient Descent |
5 | Stochastic Gradient Descent |
5 | Mini-Batch Gradient Descent |
10 | Algorithm for one epoch (see the code sketch after the outline) |
5 | Wrap-up and questions |
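As a rough illustration of the "Algorithm for one epoch" item above, the sketch below runs one epoch of mini-batch gradient descent on a toy linear-regression problem with a mean-squared-error loss. The function name `run_epoch`, the toy data, and all hyperparameters are illustrative assumptions rather than material from the lecture; setting `batch_size` equal to the dataset size recovers Batch Gradient Descent, while `batch_size=1` corresponds to Stochastic Gradient Descent.

```python
import numpy as np

def run_epoch(X, y, w, b, batch_size, lr=0.01, rng=None):
    """One epoch of mini-batch gradient descent; returns updated (w, b)."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(X)
    order = rng.permutation(n)                 # shuffle once per epoch
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]  # indices of one mini-batch
        xb, yb = X[idx], y[idx]
        pred = xb @ w + b                      # forward pass (linear model)
        err = pred - yb                        # residuals for the MSE loss
        grad_w = 2 * xb.T @ err / len(idx)     # dL/dw averaged over the batch
        grad_b = 2 * err.mean()                # dL/db averaged over the batch
        w = w - lr * grad_w                    # gradient-descent update
        b = b - lr * grad_b
    return w, b

# Toy data: y = 3x + 1 plus noise (illustrative values only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3 * X[:, 0] + 1 + 0.1 * rng.normal(size=200)

w, b = np.zeros(1), 0.0
for epoch in range(50):                        # training = repeating the epoch loop
    w, b = run_epoch(X, y, w, b, batch_size=16, rng=rng)
print(w, b)                                    # should approach [3.] and 1.0
```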
Acknowledgements
Monica Zuccarini, Maddalena Molaro & Carlo Sansone
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.