"False"
Skip to content

Students who have not changed their password since 7 May cannot log in to the student web. This is due to security measures following the cyber attack on 2 May. Read about how to change your password.

printicon
Main menu hidden.

Mathematical Foundations of Artificial Intelligence

Below you can read about possible degree projects in the field of Mathematical Foundations of Artificial Intelligence. At the bottom of the page you will find contact details of researchers active in the field. Feel free to contact any of them or visit their personal page for more information on potential thesis topics in Mathematical Foundations of Artificial Intelligence.

Compressive sensing

Prerequisites: Linear algebra, probability theory, optimization (optional), measure theory (optional).
Compressive sensing is a framework for solving underdetermined systems of (bi-)linear equations under structural constraints. In practice, the methods can be used, for example, to reconstruct images from only a small part of their Fourier spectrum. The algorithms developed in this framework can often only be proven to work for random instances of the problem, which necessitates the use of random matrix theory. It is also possible to consider infinite-dimensional versions of the theory.
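
As a minimal illustration of the flavour of such methods, the following NumPy sketch recovers a sparse vector from a small number of random linear measurements using orthogonal matching pursuit. The algorithm choice, variable names, and all problem sizes are illustrative assumptions, not part of any specific project.

```python
import numpy as np

# Recover a k-sparse vector x from m << n random measurements y = A x
# using orthogonal matching pursuit (OMP). All sizes are illustrative.
rng = np.random.default_rng(0)
n, m, k = 200, 60, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true

support, r = [], y.copy()
for _ in range(k):
    # Greedily pick the column most correlated with the current residual
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef               # update the residual

x_hat = np.zeros(n)
x_hat[support] = coef
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```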

Distributed and federated optimization

Prerequisites: Optimization.
In the age of the internet, we often find ourselves in situations where many users are interconnected but cannot (or do not want to) share all of their information with each other. Distributed and federated optimization is a framework for developing methods that allow such communities of users to jointly optimize a common objective function of which each user only knows a part.
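
The following toy NumPy sketch conveys the basic idea in the style of federated averaging: each client holds a private slice of a least-squares problem, and only model parameters are exchanged, never the data. All names, sizes, and step sizes are illustrative assumptions.

```python
import numpy as np

# Toy federated averaging: clients keep their data (A_i, b_i) private and
# only send updated parameters back to the server, which averages them.
rng = np.random.default_rng(1)
d, n_clients = 10, 5
w_true = rng.standard_normal(d)

clients = []
for _ in range(n_clients):
    A = rng.standard_normal((30, d))
    b = A @ w_true + 0.01 * rng.standard_normal(30)
    clients.append((A, b))          # private local data

def local_update(w, A, b, lr=0.01, steps=10):
    # A few local gradient steps on this client's private least-squares loss
    for _ in range(steps):
        w = w - lr * A.T @ (A @ w - b) / len(b)
    return w

w = np.zeros(d)
for _ in range(100):
    # Server broadcasts w; clients train locally; server averages the results
    w = np.mean([local_update(w, A, b) for A, b in clients], axis=0)

print("distance to w_true:", np.linalg.norm(w - w_true))
```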

Equivariance and neural networks

Prerequisites: Linear algebra, representation theory (optional), Lie groups/algebras (optional), machine learning (optional).
Equivariance is just a fancy way of saying that a function respects a symmetry: a function f is equivariant under a group of transformations if f(g·x) = g·f(x) for every transformation g. In geometric deep learning, one investigates how neural networks can be designed to automatically obey such symmetries.
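
A classical example is that convolution is equivariant to shifts. The short NumPy sketch below checks the identity f(g·x) = g·f(x) numerically for circular convolution and cyclic shifts; the signal length and shift are illustrative assumptions.

```python
import numpy as np

# Circular convolution commutes with cyclic shifts: shifting the input and
# then convolving equals convolving and then shifting, i.e. f(g.x) = g.f(x).
rng = np.random.default_rng(2)
x = rng.standard_normal(16)       # input signal
k = rng.standard_normal(16)       # convolution kernel

def circ_conv(x, k):
    # Circular convolution computed via the FFT
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

shift = 3
lhs = circ_conv(np.roll(x, shift), k)   # f(g.x)
rhs = np.roll(circ_conv(x, k), shift)   # g.f(x)
print("equivariant:", np.allclose(lhs, rhs))   # True
```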

Neural differential equations

Prerequisites: Differential equations, Lie groups/algebras (optional).
Neural differential equations are a framework for modeling ‘infinitely deep’ neural networks as dynamical systems. Possible extensions include making the models equivariant in order to respect symmetries, or considering partial differential equations and connections to physics-inspired models.
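
As a minimal sketch of the idea, the NumPy example below evolves a hidden state according to dh/dt = f_θ(h) and integrates it with the explicit Euler method, so that each integration step plays the role of one ‘layer’. The two-layer vector field, all sizes, and the integrator are illustrative assumptions.

```python
import numpy as np

# Minimal 'neural ODE' forward pass: the hidden state h(t) follows
# dh/dt = f_theta(h), integrated here with the explicit Euler method.
rng = np.random.default_rng(3)
d = 4
W1 = 0.1 * rng.standard_normal((d, d))
W2 = 0.1 * rng.standard_normal((d, d))

def vector_field(h):
    # A small two-layer network f_theta defining the dynamics dh/dt
    return W2 @ np.tanh(W1 @ h)

def neural_ode_forward(h0, t1=1.0, n_steps=100):
    # Euler integration from t = 0 to t = t1; each step acts as one 'layer'
    h, dt = h0, t1 / n_steps
    for _ in range(n_steps):
        h = h + dt * vector_field(h)
    return h

h0 = rng.standard_normal(d)
print("h(1) =", neural_ode_forward(h0))
```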

Operator splitting

Prerequisites: Optimization.
Many optimization problems are of the form ‘minimize f(x) + g(x)’, where f and g are two functions with fundamentally different properties (one may be smooth and the other non-differentiable, for instance). Operator splitting schemes are methods for efficiently solving such problems by handling f and g in separate steps, each suited to the respective function.
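
One standard example of this idea is forward-backward splitting, sketched below in NumPy for f(x) = ½‖Ax − b‖² (smooth, handled by a gradient step) and g(x) = λ‖x‖₁ (non-differentiable, handled by its proximal operator). The problem sizes and the regularization weight are illustrative assumptions.

```python
import numpy as np

# Forward-backward (proximal gradient) splitting for minimize f(x) + g(x),
# with f(x) = 0.5*||Ax - b||^2 smooth and g(x) = lam*||x||_1 non-smooth.
rng = np.random.default_rng(4)
m, n, lam = 40, 20, 0.1
A, b = rng.standard_normal((m, n)), rng.standard_normal(m)

def grad_f(x):
    return A.T @ (A @ x - b)          # forward (explicit) step uses grad f

def prox_g(x, t):
    # Proximal operator of t*lam*||.||_1: componentwise soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

t = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L with L = ||A||_2^2
x = np.zeros(n)
for _ in range(300):
    x = prox_g(x - t * grad_f(x), t)  # backward (implicit) step uses prox g

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```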

Contact

Axel Flinth
Assistant professor
Email
Fredrik Ohlsson
Associate professor
Email
Alp Yurtsever
Assistant professor
Email
Latest update: 2024-01-17