
Lecture: Forward propagation

Administrative Information

Title Forward propagation
Duration 60
Module B
Lesson Type Lecture
Focus Technical - Deep Learning
Topic Forward pass

Keywords

Forward pass, Loss

Learning Goals

Expected Preparation

Learning Events to be Completed Before

None.

Obligatory for Students

None.

Optional for Students

  • Matrix multiplication
  • Getting started with NumPy
  • Knowledge of linear and logistic regression (from Period A Machine Learning: Lecture: Linear Regression, GLRs, GADs)
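As a quick refresher on the optional preparation above, the matrix-vector product that underlies every layer of a forward pass can be written in NumPy as follows (the numbers are purely illustrative):

```python
import numpy as np

# A layer's weight matrix (2 outputs, 3 inputs) applied to an input vector:
W = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, 0.0, -1.0])

# Each output is the dot product of one weight row with the input.
y = W @ x          # matrix-vector product, shape (2,)
print(y)           # -> [-2. -2.]
```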

References and background for students

  • John D. Kelleher and Brian Mac Namee. (2018), Fundamentals of Machine Learning for Predictive Data Analytics, MIT Press.
  • Michael Nielsen. (2015), Neural Networks and Deep Learning, Determination Press, San Francisco, CA, USA.
  • Charu C. Aggarwal. (2018), Neural Networks and Deep Learning, Springer.
  • Antonio Gulli and Sujit Pal. Deep Learning with Keras, Packt, [ISBN: 9781787128422].

Recommended for Teachers

None.

Lesson materials

Instructions for Teachers

This lecture introduces students to the fundamentals of forward propagation in an artificial neural network, covering the network topology (weights, synapses, activation functions and loss functions). Students will then be able to carry out a forward pass with pen and paper, in Python using only the NumPy library (for matrix manipulation), and then in Keras as part of the tutorial associated with this learning event. This builds a fundamental understanding of which activation functions suit specific problem contexts and of how the activation functions differ in computational complexity. The lecture examines the output-layer activation function and the corresponding loss function for use cases such as binomial classification, regression and multi-class classification.
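The NumPy stage described above can be sketched as below. This is an illustrative example, not the lecture's own material: the 2-3-1 layer sizes, the random weights and the use of a sigmoid in both layers are assumptions chosen to keep the sketch short.

```python
import numpy as np

# Hypothetical 2-3-1 network: 2 inputs, one hidden layer of 3 units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # hidden-layer weights
b1 = np.zeros(3)               # hidden-layer biases
W2 = rng.normal(size=(1, 3))   # output-layer weights
b2 = np.zeros(1)               # output-layer bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """One forward pass: linear combination, then activation, per layer."""
    h = sigmoid(W1 @ x + b1)   # hidden activations
    y = sigmoid(W2 @ h + b2)   # output in (0, 1), as for binomial classification
    return y

print(forward(np.array([0.5, -1.2])))
```

The same pass can be checked by hand, one matrix-vector product and one activation at a time, which is exactly the pen-and-paper exercise the lecture leads with.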


Outline

The neural network used in this introductory lecture series
Time schedule
Duration (min) Description
10 Definition of neural network components
15 Weights and activation functions (sigmoid, tanh and ReLU)
15 Loss functions (regression, binomial classification and multi-class classification)
15 Using matrices for a forward pass
5 Recap on the forward pass

Acknowledgements

The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.