
Lecture: Generative Models, Transformer Deep Learning and Hybrid Learning Models

Administrative Information

Title Generative Models, Transformer Deep Learning and Hybrid Learning Models
Duration 45-60 min
Module C
Lesson Type Lecture
Focus Technical - Future AI
Topic Advances in ML Models through an HC Lens - A Result-Oriented Study

Keywords

Generative Models, Attention Detection, Query-Key-Value, Transformer Models, Hybrid Models

Learning Goals

Expected Preparation

Obligatory for Students

  • Introduction to machine learning and deep learning concepts given in previous lectures

Optional for Students

Recommended for Teachers

None.

Lesson materials

Instructions for Teachers

In this lecture, our primary objectives are threefold. Firstly, we aim to provide a comprehensive understanding of Generative Models, focusing on their underlying mechanisms and core features. Secondly, we will discuss the significance of Transformer architectures, particularly in the context of Natural Language Processing (NLP). Lastly, the lecture will elaborate on the various configurations of Hybrid Models, emphasizing how diverse learning components can be combined to improve machine learning performance.
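
To make the Query-Key-Value idea behind Transformer architectures concrete for students, teachers may show a short sketch such as the one below. It is a minimal, illustrative implementation of scaled dot-product attention in NumPy; the token count, embedding size, and random inputs are arbitrary assumptions and not part of the official lesson materials.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Similarity of every query to every key, scaled by the square root of the key dimension
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key axis turns the scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted mix of the value vectors
    return weights @ V

# Toy example: 3 tokens with 4-dimensional embeddings (random, purely illustrative)
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (3, 4)

The same computation, extended with learned projection matrices and multiple attention heads, is applied at every layer of models such as BERT and GPT discussed in the outline.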

Outline

Duration | Description | Concepts
15 min | Introduction to Generative Models; classification of Generative Models | What are generative models?, Why are they important?, What can they be used for?, Classification, Key features, Examples
20 min | Introduction to Transformer architectures (see the demo sketch after this outline) | Transformer architecture, state-of-the-art transformers such as BERT and GPT
10 min | Introduction to Hybrid Learning | What is hybrid learning?, Why is it important?, What can hybrid models be used for?
5 min | Conclusion, questions and answers | Summary
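
As an optional, hands-on complement to the Transformer segment of the outline, a pre-trained model such as BERT can be loaded and run on a single sentence. The sketch below assumes the Hugging Face transformers library and PyTorch are installed; the checkpoint name bert-base-uncased is only an example choice.

import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative checkpoint; any BERT-style model from the Hugging Face Hub works similarly
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode one sentence and obtain contextual token embeddings
inputs = tokenizer("Generative models learn the distribution of the training data.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, sequence_length, 768])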

Acknowledgements

The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.