Europe/Lisbon — Online
Sergiu Klainerman, Princeton University.
Seminars — for the informal dissemination of research results, exploratory work by research teams, outreach activities, and the like — are the simplest form of meeting held at a mathematics research centre.
CAMGSD has recorded and published its seminar calendar for a long time, so pages such as this one serve not only to announce these activities but also as a historical record.
For a full search interface, see the seminar page of the Department of Mathematics.
Sala P3.10, Pavilhão de Matemática
Matemática para Inteligência Artificial
João Costa, CAMGSD & ISCTE.
Introduction to Deep Learning for mathematicians.
The goal of these lectures is to give a simple and direct introduction to some of the most basic concepts and techniques in Deep Learning. We will start by reviewing the fundamentals of Linear Regression and Linear Classifiers, and from there we will find our way to Deep Dense Neural Networks (also known as multi-layer perceptrons). Then, we will introduce the theoretical and practical minimum needed to train such neural nets to classify handwritten digits, as provided by the MNIST dataset. This requires, in particular, the efficient computation of the gradients of the loss with respect to the parameters of the model, which is achieved by backpropagation. Finally, if time permits, we will briefly describe other neural network architectures, such as Convolutional Networks and Transformers, and other applications of deep learning, including Physics-Informed Neural Networks, which apply neural nets to find approximate solutions of Differential Equations. The lectures will be accompanied by Python code implementing some of these basic techniques.
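The training loop sketched in the abstract — forward pass, gradients via backpropagation, gradient-descent update — can be illustrated in a few lines of NumPy. This is only an illustrative sketch, not the lecturers' actual code; MNIST is replaced by a synthetic two-class dataset so the example is self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for MNIST: two well-separated Gaussian blobs in the plane.
n = 200
X = np.vstack([rng.normal(-1, 0.5, (n, 2)), rng.normal(1, 0.5, (n, 2))])
y = np.concatenate([np.zeros(n), np.ones(n)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A dense network with one hidden layer: input 2 -> hidden 8 -> output 1.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()    # predicted class probabilities
    # Backward pass: gradients of the mean cross-entropy loss
    # with respect to each parameter, via the chain rule (backpropagation).
    dlogits = (p - y)[:, None] / len(y)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = (dlogits @ W2.T) * (1 - h**2)  # chain rule through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient-descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = ((p > 0.5) == y).mean()
```

On this toy dataset the network separates the two classes almost perfectly; the same structure (forward pass, backpropagated gradients, parameter update) carries over to MNIST, where only the data loading and layer sizes change.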
Sala P3.10, Pavilhão de Matemática
Matemática para Inteligência Artificial
Luís Carvalho, CAMGSD & ISCTE.
Universal approximation theorems and reproducing kernel Hilbert spaces.
Sala P3.10, Pavilhão de Matemática
Matemática para Inteligência Artificial
Gonçalo Oliveira, CAMGSD & Instituto Superior Técnico.
Infinitely wide Neural Networks.