CAMGSD

Seminars and short courses

Seminars, serving the informal dissemination of research results, exploratory work by research teams, outreach activities, and similar purposes, are the simplest form of meeting at a mathematics research centre.

CAMGSD has long kept a record of its seminar calendar. This page serves both as a public announcement of forthcoming activities and as a historical record.

For a full search interface, see the Mathematics Department seminar page.

Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

Mathematics for Artificial Intelligence

João Costa, CAMGSD & ISCTE.

The goal of these lectures is to give a simple and direct introduction to some of the most basic concepts and techniques in Deep Learning. We will start by reviewing the fundamentals of Linear Regression and Linear Classifiers, and from there we will find our way to Deep Dense Neural Networks (also known as multi-layer perceptrons). We will then introduce the theoretical and practical minimum needed to train such neural nets to classify handwritten digits from the MNIST dataset. This requires, in particular, the efficient computation of the gradients of the loss with respect to the parameters of the model, which is achieved by backpropagation. Finally, if time permits, we will briefly describe other neural network architectures, such as Convolutional Networks and Transformers, and other applications of deep learning, including Physics-Informed Neural Networks, which use neural nets to find approximate solutions of differential equations. The lectures will be accompanied by Python code implementing some of these basic techniques.
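
For orientation, the following is a minimal, self-contained sketch (not the course's actual material) of the training loop described above: a one-hidden-layer dense network trained by gradient descent, with the gradients computed by hand-coded backpropagation. It uses only NumPy, and random data stands in for MNIST so the script runs without downloading anything; the network sizes and learning rate are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Stand-in for MNIST: 28x28 images flattened to 784 features, 10 classes.
n_samples, n_features, n_hidden, n_classes = 512, 784, 64, 10
X = rng.standard_normal((n_samples, n_features))
y = rng.integers(0, n_classes, n_samples)
Y = np.eye(n_classes)[y]                      # one-hot labels

# Parameters of a one-hidden-layer perceptron.
W1 = rng.standard_normal((n_features, n_hidden)) * np.sqrt(2.0 / n_features)
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_classes)) * np.sqrt(2.0 / n_hidden)
b2 = np.zeros(n_classes)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for epoch in range(20):
    # Forward pass.
    H = np.maximum(X @ W1 + b1, 0.0)          # ReLU hidden layer
    P = softmax(H @ W2 + b2)                  # class probabilities
    loss = -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

    # Backward pass: gradients of the loss with respect to each parameter.
    dZ2 = (P - Y) / n_samples                 # gradient w.r.t. output logits
    dW2 = H.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * (H > 0)                        # ReLU derivative
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if epoch % 5 == 0:
        acc = (P.argmax(axis=1) == y).mean()
        print(f"epoch {epoch:2d}  loss {loss:.3f}  accuracy {acc:.2f}")

On real MNIST data the same loop, with mini-batches and more epochs, reaches high accuracy; frameworks such as PyTorch automate the backward pass, but the hand-derived gradients above are the essence of backpropagation.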

Current funding: FCT UIDB/04459/2020 & FCT UIDP/04459/2020.

©2025, Instituto Superior Técnico. All rights reserved.