Seminars, used for the informal dissemination of research results, exploratory work by research teams, outreach activities, and the like, constitute the simplest form of meeting at a Mathematics research centre.
CAMGSD has long kept a calendar of its seminars, this page serving both as a public announcement of forthcoming activities and as a historical record.
Given a weight $w$ on the unit circle, consider the orthogonal polynomials on the unit circle generated by $w$. Steklov famously conjectured that if $w$ is bounded below, then the polynomials ought to be uniformly bounded above. Although false, this conjecture raises the follow-up question: under what regularity conditions on $w$ are the polynomials uniformly bounded in $L^p(w)$ for some $p > 2$? Building upon a preliminary answer given by Nazarov in the case when $w$ is bounded above and below, we provide a positive answer when $w$ is an $A_2$ weight. This is joint work with Alexander Aptekarev and Sergey Denisov.
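For context, the display below recalls the standard objects behind this abstract: the orthonormality relation defining the polynomials $\varphi_n$ and the Muckenhoupt $A_2$ condition on the circle. The normalisation shown is the usual one and is added here for the reader's convenience; it is not taken from the talk itself.

$$
\int_0^{2\pi} \varphi_n(e^{i\theta})\,\overline{\varphi_m(e^{i\theta})}\,w(\theta)\,\frac{d\theta}{2\pi}=\delta_{nm},
\qquad
[w]_{A_2}=\sup_{I\subset\partial\mathbb{D}}\left(\frac{1}{|I|}\int_I w\right)\left(\frac{1}{|I|}\int_I w^{-1}\right)<\infty,
$$

where the supremum runs over arcs $I$ of the unit circle. Steklov's hypothesis corresponds to $w\geq\delta>0$, and the question above concerns uniform bounds on $\|\varphi_n\|_{L^p(w)}$ for some $p>2$.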
Room 6.2.33, Faculty of Sciences of the Universidade de Lisboa
Instituto Superior Técnico, https://tecnico.ulisboa.pt
In this talk, we address the localization of general nonlocal functionals of double-integral type with fractional dependence on the state variable, inspired by peridynamics. Localization is carried out as the interaction horizon among particles tends to zero. As a main result, we obtain an explicit formulation of the local $\Gamma$-limit, also covering the vectorial case. Applications of this result to nonlinear elasticity and the $p$-Laplacian eigenvalue problem will be discussed.
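As a rough guide to the objects being localized, a prototypical double-integral functional of this kind has the schematic form below; the kernel $W$, the fractional exponent $s$, and the normalisation are illustrative assumptions made here, not the exact functionals treated in the talk.

$$
F_\delta(u)=\int_{\Omega}\int_{\Omega\cap B_\delta(x)} W\!\left(x,\,y,\,\frac{u(y)-u(x)}{|y-x|^{s}}\right) dy\,dx, \qquad s\in(0,1),
$$

where $\delta>0$ is the interaction horizon (possibly after a $\delta$-dependent rescaling), and localization refers to the $\Gamma$-limit of $F_\delta$ as $\delta\to 0^+$.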
The Hopfield Neural Network has played, ever since its introduction in 1982 by John Hopfield, a fundamental role in the interdisciplinary study of the storage and retrieval capabilities of neural networks, a role further highlighted by the 2024 Nobel Prize in Physics.
From its strong link with biological pattern-retrieval mechanisms to its high-capacity Dense Associative Memory variants and connections to generative models, the Hopfield Neural Network has found relevance both in neuroscience and in the most modern AI systems.
Much of our theoretical knowledge of these systems, however, comes from a surprising and powerful link with statistical mechanics, first established and explored in the seminal works of Amit, Gutfreund and Sompolinsky in the second half of the 1980s: the interpretation of associative memories as spin-glass systems.
In this talk, we will present this duality, as well as the mathematical techniques from spin-glass theory that allow us to accurately and rigorously predict the behavior of different types of associative memories capable of undertaking a variety of tasks.
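To make the retrieval mechanism concrete, the following is a minimal sketch (in Python with NumPy, not code from the talk) of the classical Hopfield model: patterns are stored with the Hebbian rule and retrieved by zero-temperature asynchronous spin updates; the sizes and corruption level are illustrative choices.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian storage: W = (1/N) * sum_mu xi^mu (xi^mu)^T, with zero diagonal."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)            # no self-coupling
    return W

def retrieve(W, state, n_sweeps=10, rng=None):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    rng = rng or np.random.default_rng(0)
    s = state.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    N, P = 200, 5                        # 200 spins, 5 random +/-1 patterns (low load)
    patterns = rng.choice([-1, 1], size=(P, N))
    W = hebbian_weights(patterns)

    # Corrupt 20% of the first pattern and let the dynamics clean it up.
    probe = patterns[0].copy()
    flipped = rng.choice(N, size=N // 5, replace=False)
    probe[flipped] *= -1
    recovered = retrieve(W, probe, rng=rng)
    print("overlap with stored pattern:", recovered @ patterns[0] / N)
```

At this low load ($P/N$ well below the classical capacity of roughly $0.138$), the dynamics typically recover the stored pattern with overlap close to 1, which is the retrieval regime analysed in the spin-glass treatment of Amit, Gutfreund and Sompolinsky.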