Administrative Information
Title | Forward propagation |
Duration | 60 |
Module | B |
Lesson Type | Tutorial |
Focus | Technical - Deep Learning |
Topic | Forward pass |
Keywords
Forward pass, Loss
Learning Goals
- Understand the process of a forward pass
- Understand how to calculate a forward pass prediction, as well as its loss, unplugged (pen and paper)
- Develop a forward pass in Python using no modules other than NumPy
- Develop a forward pass using Keras
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
None.
Optional for Students
- Matrix multiplication
- Getting started with NumPy
- Knowledge of linear and logistic regression ([Lecture: Linear Regression])
References and background for students
- John D. Kelleher and Brian Mac Namee (2018), Fundamentals of Machine Learning for Predictive Data Analytics, MIT Press.
- Michael Nielsen (2015), Neural Networks and Deep Learning, Determination Press, San Francisco, CA, USA.
- Charu C. Aggarwal (2018), Neural Networks and Deep Learning, Springer.
- Antonio Gulli and Sujit Pal, Deep Learning with Keras, Packt [ISBN: 9781787128422].
Recommended for Teachers
None.
Lesson materials
- [Problem 1 (Image)]
- [Problem 2 and problem 3 (HTML)]
- Problem 2 and problem 3 (.ipynb)
Instructions for Teachers
- This tutorial introduces students to the fundamentals of forward propagation in an artificial neural network. The forward pass is worked three ways: with pen and paper, in Python using only the NumPy library (for matrix manipulation), and then with Keras. This builds on the students' understanding of which activation functions apply to specific problem contexts and how activation functions differ in computational complexity, progressing from pen and paper, to code from scratch with NumPy, and finally to a high-level module, Keras.
- The students will be presented with three problems:
- Problem 1 (Example 1 from the lecture; image on the RHS of this wiki): students will be asked to conduct a forward pass using the following parameters (20 minutes to complete):
- Sigmoid activation function for the hidden layer
- Sigmoid activation function for the output layer
- MSE loss function
- Problem 2 (Example 1 from the lecture): students will be asked (with guidance depending on their prior coding experience) to develop a neural network from scratch using only the NumPy module, with the weights and activation functions from Problem 1, which are the same as Example 1 from the lecture (20 minutes to complete).
- Problem 3 (Example 1 from the lecture, first with the same weights and then with random weights): students will be asked (with guidance depending on their prior coding experience) to develop a neural network using the TensorFlow 2.x module with its built-in Keras module, first with the weights and activation functions from Problem 1 (the same as Example 1 from the lecture), and then with random weights (10 minutes to complete).
- The subgoal of these three problems is to familiarize students with the structure and application of fundamental deep learning concepts: activation functions, topology, and loss functions.
- Keras and TensorFlow 2.x are used here and will be used for all future examples.
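For Problem 2, the from-scratch forward pass can be sketched as below. The 2-2-1 topology, weights, and inputs here are placeholder assumptions, not the actual values from the lecture's Example 1; only the activation (sigmoid) and loss (MSE) match the tutorial's parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Placeholder parameters for a 2-2-1 network (assumed values, not the lecture's)
W1 = np.array([[0.1, 0.3],
               [0.2, 0.4]])   # input -> hidden weights
b1 = np.array([0.5, 0.5])     # hidden biases
W2 = np.array([[0.6],
               [0.7]])        # hidden -> output weights
b2 = np.array([0.5])          # output bias

x = np.array([1.0, 2.0])      # example input
y_true = np.array([1.0])      # example target

# Forward pass: linear combination, then sigmoid, layer by layer
h = sigmoid(x @ W1 + b1)          # hidden-layer activations
y_hat = sigmoid(h @ W2 + b2)      # network prediction
loss = np.mean((y_true - y_hat) ** 2)  # MSE loss
```

Students can check the intermediate values `h`, `y_hat`, and `loss` against their pen-and-paper results from Problem 1.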
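For Problem 3, the same forward pass can be expressed in Keras. Again, the topology and weight values are placeholder assumptions; the point is the pattern of defining the layers, setting the weights by hand so the result matches the NumPy version, and then letting Keras initialize random weights instead.

```python
import numpy as np
import tensorflow as tf

# Same placeholder 2-2-1 topology as the pen-and-paper problem (assumed values)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(2, activation="sigmoid"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Set the weights by hand so the prediction matches the from-scratch pass;
# these numbers are placeholders, not the lecture's actual values.
model.layers[0].set_weights([np.array([[0.1, 0.3], [0.2, 0.4]]),
                             np.array([0.5, 0.5])])
model.layers[1].set_weights([np.array([[0.6], [0.7]]),
                             np.array([0.5])])

x = np.array([[1.0, 2.0]])            # batch of one example
y_hat = model.predict(x, verbose=0)   # forward pass only, no training
```

For the random-weights variant, students simply skip the `set_weights` calls and use the weights Keras draws at initialization.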
Outline
Duration (Min) | Description |
---|---|
20 | Problem 1: Pen and Paper implementation of a forward pass (example from the lecture) |
20 | Problem 2: Developing a neural network from scratch using Numpy (example from the lecture) |
10 | Problem 3: Developing a neural network using Keras (example from the lecture with set weights and random weights) |
10 | Recap on the forward pass process |
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant No. CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.