Each module is worth 3 ECTS credits. You can choose a total of 10 modules/30 ECTS from the following categories:
- 12-15 ECTS credits in technical-scientific modules (TSM)
TSM modules teach profile-specific technical skills and complement the decentralised specialisation modules.
- 9-12 ECTS credits in extended fundamental theoretical principles (FTP)
FTP modules mainly cover theoretical foundations such as mathematics, physics, information theory, chemistry, etc. They broaden students' scientific competence and help create the important synergy between abstract concepts and application that is fundamental for innovation.
- 6-9 ECTS credits in context modules (CM)
CM modules provide additional skills in areas such as technology management, business administration, communication, project management, patent law, contract law, etc.
The module description (download the pdf) provides the language information for each module, broken down into the following categories:
- Instruction
- Documentation
- Examination
The purpose of this module is to enhance students' understanding of deep learning techniques.
We will explore significant and current developments in deep learning, including generative models, attention networks, transformers, graph neural networks and other related techniques.
Furthermore, we will examine case studies that pertain to language, speech, or visual processing domains.
Prerequisites
Machine Learning Basics, Deep Learning, Programming (Python), Statistics, Linear Algebra
Learning objectives
- Know the Attention and Transformer architectures and how to use them in different applications (see the sketch after this list).
- Know which architectures are suitable for which kind of machine learning problem
- Know how to apply self-supervised learning and the advantages and disadvantages of this method
- Know and apply the different methods of generative models.
- Know how to interpret and explain results of predictions.
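As an illustrative reference for the attention objective above, the following is a minimal sketch of scaled dot-product attention in plain NumPy. It is not taken from the module material; the function name, shapes, and random toy inputs are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys) similarity scores
    weights = softmax(scores, axis=-1)   # attention weights; each row sums to 1
    return weights @ V                   # weighted combination of the values

# Toy example: 4 query tokens attending over 6 key/value tokens of dimension 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (4, 8)
```

In a Transformer, this operation is wrapped with learned projections for Q, K and V and repeated across several heads; the sketch shows only the core computation.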
Module content
- Refresher of basic deep learning methods and architectures such as MLPs, CNNs, RNNs, backpropagation, etc.
- Attention and transformers: concepts and applications in language and vision domains
- Self-supervised learning: contrastive and non-contrastive methods (see the sketch after this list)
- Graph Neural Networks and applications
- Generative models: VAEs, GANs, Diffusion models, Energy models, etc.
- Foundation models, few-shot and zero-shot learning
- Interpretability, Explainability
- Special topics of the year, such as NNs for physical systems, 3D reconstruction, theoretical approaches, and others
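To make the contrastive self-supervised learning item above concrete, here is a small, self-contained sketch of an InfoNCE-style contrastive loss in NumPy. It is only an illustration under assumed names and toy data (the batch size, embedding dimension, and temperature are arbitrary) and does not reflect how the course implements it.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE-style contrastive loss for a batch of paired embeddings.

    z1[i] and z2[i] are two augmented views of the same sample (positive pair);
    every other pairing in the batch serves as a negative.
    """
    # L2-normalise so the dot products below are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                  # (n, n) similarity matrix
    # Row-wise log-softmax (numerically stabilised); the diagonal holds the
    # log-probabilities assigned to the correct (positive) pairs.
    m = logits.max(axis=1, keepdims=True)
    log_probs = logits - (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True)))
    return -np.mean(np.diag(log_probs))

# Toy batch: 8 samples with 16-dimensional embeddings; the "second view" is a
# small perturbation of the first, standing in for a data augmentation.
rng = np.random.default_rng(0)
z1 = rng.standard_normal((8, 16))
z2 = z1 + 0.1 * rng.standard_normal((8, 16))
print(info_nce_loss(z1, z2))
```

Non-contrastive methods covered in the module avoid explicit negatives altogether; this sketch illustrates only the contrastive formulation.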
Teaching and learning methods
- Lectures / presence
- Tutorial / presence
- Exercises / presence
- Self-study
Download the full module description