The Department of Mathematics and Systems Analysis organizes regular colloquia on topics in mathematics and systems analysis for a non-specialist audience. Informal discussion continues after the colloquium in the common room.

- 10.9. 15:15 Prof. Martin Lotz (University of Warwick): Pfaffian Incidence Geometry and Applications – M1 (M232)
Pfaffian functions are real or complex analytic functions that satisfy triangular systems of first-order partial differential equations with polynomial coefficients. Pfaffian functions, and by extension Pfaffian and semi-Pfaffian sets, play a crucial role in various areas of mathematics. Incidence combinatorics has recently experienced a surge of activity, fuelled by the introduction of the polynomial partitioning method of Guth and Katz. While traditionally restricted to simple geometric objects such as points and lines, focus has shifted towards incidence questions involving higher dimensional algebraic or semi-algebraic sets. We present a generalization of the polynomial partitioning method to semi-Pfaffian sets and illustrate how this leads to generalizations of classic results in incidence geometry, such as the Szemerédi-Trotter Theorem. Finally, we outline an application of semi-Pfaffian geometry to the robustness of neural networks.
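For context, the classic planar result that the talk generalizes is the Szemerédi–Trotter theorem, which bounds the number of incidences between points and lines:

```latex
% Szemerédi–Trotter: a set P of points and a set L of lines in the real plane
% determine at most
I(P, L) \;=\; O\!\left( |P|^{2/3} |L|^{2/3} + |P| + |L| \right)
% incidences, and this bound is tight up to the implied constant.
```

The polynomial partitioning method of Guth and Katz gives a modern proof of this bound; the talk extends the partitioning step from polynomials to semi-Pfaffian sets.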

- 13.8. 15:15 Prof. Bruno Fanzeres (Pontifical Catholic University of Rio de Janeiro): Task-Based Prescriptive Trees for Two-Stage Linear Decision-Making Problems: Reformulations, Heuristic Strategies, and Applications – M1 (M232)
Most decision-making problems under uncertainty found in industry and studied by the scientific community can be framed as two-stage stochastic programs. In past decades, the standard framework for this class of mathematical programming problems has been a sequential two-step process, usually referred to as estimate-then-optimize: a predictive distribution of the uncertain parameters is first estimated with some machine/statistical learning (M/SL) method, and a decision is then prescribed by solving the two-stage stochastic program under the estimated distribution. Most M/SL methods, however, focus only on minimizing the prediction error of the uncertain parameters, without accounting for its impact on the downstream decision problem. Practitioners argue that their main interest is to obtain near-optimal solutions from the available data with minimum decision error rather than a least-error prediction. In this talk, we therefore discuss the framework referred to as task-based learning, in which the M/SL training objective also accounts for the downstream decision problem. As the M/SL method we focus on decision trees, and we study decision-making problems framed as two-stage linear programs. We present an exact mixed-integer linear programming (MILP) formulation for the task-based learning method and construct two efficient recursive-partitioning heuristic strategies for the MILP. We conclude the talk by analyzing a set of numerical experiments that illustrate the capability and effectiveness of the task-based prescriptive tree learning framework, benchmarking it against the standard estimate-then-optimize framework, and by discussing the computational performance of the constructed heuristic strategies vis-à-vis the MILP formulation.
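As a reminder of the setting (standard notation, not necessarily the speaker's), a two-stage stochastic linear program has the form

```latex
\min_{x \in X} \; c^\top x + \mathbb{E}_{\xi}\!\left[ Q(x, \xi) \right],
\qquad
Q(x, \xi) \;=\; \min_{y \ge 0} \left\{ \, q(\xi)^\top y \;:\; W y = h(\xi) - T(\xi)\, x \, \right\},
```

where the first-stage decision $x$ is taken before the uncertain parameters $\xi$ are revealed and the recourse decision $y$ afterwards. Estimate-then-optimize fits a distribution for $\xi$ and then solves this program; task-based learning instead trains the M/SL model against the downstream cost of the decision $x$ it induces.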

- 14.5. 15:15 Prof. Alberto Ravagnani (Eindhoven University of Technology): The Service Rate Region Polytope – M1 (M232)
In distributed data storage, information is distributed across multiple servers with redundancy, in such a way that multiple users can access it at the same time. The access requests that a distributed data storage system can support are described by a convex polytope, called the service rate region of the system. This talk is about the properties of the service rate region, and about how the algebra of the system determines the geometry of the corresponding polytope.

- 9.4. 15:15 Prof. Paul Van Dooren (UCLouvain): Assigning Stationary Distributions to Stochastic Matrices – M1 (M232)
The target stationary distribution problem (TSDP) is the following: given an irreducible stochastic matrix G and a target stationary distribution μ̂, construct a minimum-norm perturbation ∆ such that Ĝ = G + ∆ is also stochastic and has the prescribed target stationary distribution μ̂. We first consider rank-1 perturbations ∆ and show how to efficiently minimize their norm when such a solution is feasible. But sparsity and/or connectivity of the graph of G + ∆ may then get lost. We then impose a constraint on the support of ∆, that is, on the set of non-zero entries of ∆. This is particularly meaningful in practice, since one cannot typically modify all entries of G. We first show how to construct a feasible solution Ĝ that has essentially the same support as the matrix G. Then we show how to compute globally optimal and sparse solutions using the component-wise ℓ_1 norm and linear optimization. We propose an efficient implementation that relies on a column-generation approach, which allows us to solve sparse problems of size up to 10^5 × 10^5 in a few minutes.
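A minimal sketch of one feasible rank-1 construction (my illustration, not the minimum-norm solution from the talk): taking ∆ = 1vᵀ with vᵀ = μ̂ᵀ(I − G) preserves the row sums of G and makes μ̂ stationary, although entrywise nonnegativity of Ĝ must still be checked separately.

```python
import numpy as np

def rank1_tsdp(G, mu):
    """Rank-1 perturbed matrix G_hat = G + 1 v^T with v^T = mu^T (I - G).

    Then mu^T G_hat = mu^T G + (mu^T 1) v^T = mu^T, so mu is stationary,
    and G_hat 1 = G 1 + 1 (v^T 1) = 1, so row sums are preserved.
    Nonnegativity of G_hat is NOT guaranteed and must be checked.
    """
    v = mu - mu @ G                       # v^T = mu^T (I - G)
    return G + np.outer(np.ones(len(mu)), v)
```

If some entry of the result is negative, this particular rank-1 update is infeasible and the support-constrained formulations discussed in the talk become relevant.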

- 12.3. 15:15 Prof. Henrik Garde (Aarhus University): Obstacles in Calderón's inverse conductivity problem – U6 KONECRANES (U149)
I will discuss the inverse conductivity problem of recovering interior information about the electrical conductivity of a body from exterior electrical measurements. I will show how one can reconstruct the exact shape and position of inhomogeneities (called obstacles/inclusions), using energy comparisons with Neumann-to-Dirichlet maps.
The method is at the same time both simple and surprisingly general, allowing inhomogeneities with parts that are finite positive and negative perturbations, parts that are superconducting or insulating, and parts originating from a Muckenhoupt weight (leading to degenerate/singular problems). The method can also recover collections of cracks in the form of hypersurfaces.
If time permits, I will also discuss how to handle more practical electrode models and noisy measurements in a rigorous way.

- 13.2. 15:15 Prof. Eeva Vilkkumaa (Aalto University): Supporting the Development of a Robust, Market-Shaping Strategy with Scenario-Based Portfolio Decision Analysis: Case Study with Nordea – U5 (U147)
Strategic decision-making is challenging due to multiple strategic objectives and long planning horizons, which make it difficult to assess the future impacts of proposed strategic actions with respect to these objectives. Moreover, strategy work often requires a balance between preparing for alternative scenarios for the future (i.e., developing a robust strategy) and trying to steer the course of change in a desirable direction (i.e., developing a market-shaping strategy). We present a model-based framework for supporting the development of a robust, market-shaping strategy. For the purposes of this framework, we develop a new portfolio decision analytic model and algorithms to help generate decision recommendations for selecting strategic actions when (i) the actions' scenario- and objective-specific impacts, the baseline values for these impacts, as well as preferences between strategic objectives are incompletely specified, and (ii) information regarding scenario likelihoods is incomplete and may depend on the selected actions. This framework is applied in a high-impact case supporting the strategy process at the payments unit of Nordea Bank Abp, the largest retail bank in the Nordic countries.

- 9.1. 15:15 Prof. Guillermo Mantilla-Soler (Universidad Nacional de Colombia): Analogies between classic arithmetic and function fields – U6 KONECRANES (U149)
In this talk I will explain how similarities between the integers and the polynomial ring over a field allow us to find, and prove, connections between objects that at first glance seem quite different. As an example, I will show how Lagrange's interpolation theorem is nothing but a particular case of the Chinese remainder theorem. If time allows, I will show how a simple result on polynomials leads to the statement of the famous ABC conjecture.
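To illustrate the correspondence (my example, not the speaker's): the interpolant through points (x_i, y_i) is exactly the solution of the simultaneous congruences f ≡ y_i (mod (x − x_i)), because by the remainder theorem f mod (x − a) = f(a). A small exact-arithmetic sketch:

```python
from fractions import Fraction

def poly_mul(a, b):
    """Multiply polynomials given as coefficient lists (lowest degree first)."""
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [Fraction(0)] * (n - len(a))
    b = b + [Fraction(0)] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def lagrange(points):
    """Unique solution of the congruences f = y_i mod (x - x_i):
    the Lagrange interpolant, built from the usual basis polynomials."""
    result = [Fraction(0)]
    for i, (xi, yi) in enumerate(points):
        basis = [Fraction(yi)]
        for j, (xj, _) in enumerate(points):
            if j != i:
                d = xi - xj
                basis = poly_mul(basis, [Fraction(-xj, d), Fraction(1, d)])
        result = poly_add(result, basis)
    return result

def rem_mod_linear(f, a):
    """Remainder of f on division by (x - a), i.e. f(a), via Horner's rule."""
    r = Fraction(0)
    for c in reversed(f):
        r = r * a + c
    return r
```

For the points (0, 1), (1, 3), (2, 7) the interpolant is x² + x + 1, and reducing it mod (x − x_i) recovers each y_i, which is exactly the CRT picture.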

- 12.12. 15:15 Prof. Ting Xue (University of Melbourne): Springer theory and finite groups of Lie type – U6 KONECRANES (U149)
Springer theory for reductive algebraic groups plays an important role in determining irreducible characters of finite groups of Lie type. We discuss its generalisation to the setting of graded Lie algebras. We explain how level-rank dualities arise from unipotent irreducible characters and their connections with the graded Springer theory. If time permits, we discuss a conjectural realisation of these dualities using affine Springer fibers.

- 14.11. 15:15 Prof. Yuji Nakatsukasa (University of Oxford): Numerical Linear Algebra: direct, iterative, and randomized methods – Hall E (Y124)
In many scientific computing and machine learning problems, we expend the majority of our computational resources on solving large-scale linear algebra problems, typically linear systems Ax = b, eigenvalue problems Ax = λx, or the singular value decomposition A = UΣVᵀ. Numerical linear algebra (NLA) is a research field that attempts to devise practical algorithms for solving these problems.
Broadly, methods in NLA can be divided into three categories: direct, iterative, and randomized. In this talk I will give a whistle-stop tour of these classes of methods, highlighting the incredible robustness of classical (direct) methods, and the exciting speed and advances in randomized methods.
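As one concrete instance of the randomized class (a textbook sketch in the style of Halko, Martinsson, and Tropp; the function name and oversampling choice are mine, not from the talk): a randomized SVD samples the range of A with a Gaussian test matrix, orthonormalizes the sample, and reduces the problem to a small dense SVD.

```python
import numpy as np

def randomized_svd(A, k, oversample=5, seed=0):
    """Approximate rank-k truncated SVD via a randomized range finder."""
    rng = np.random.default_rng(seed)
    # sample the range of A with a Gaussian test matrix
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)          # orthonormal basis for the sample
    # project onto the small subspace and run a dense SVD there
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ U_small[:, :k], s[:k], Vt[:k]
```

The expensive classical SVD of the full matrix is replaced by one matrix product, a thin QR, and an SVD of a (k + oversample) × n matrix, which is the source of the speed advantage of randomized methods.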

- 10.10. 15:15 Érika Roldán, Ph.D. (Max Planck Institute for Mathematics in the Sciences): Topology and Geometry of Random Cubical Complexes – U6 (U149)
In this talk, we explore the expected topology (measured via homology) and local geometry of two different models of random subcomplexes of the regular cubical grid: percolation clusters and the Eden Cell Growth model. We will also compare the expected topology that these average structures exhibit with the topology of the extremal structures that can be obtained within the entire set of these cubical complexes. You can look at some of these random structures here (https://skfb.ly/6VINC) and start making some guesses about their topological behavior.

- 9.5. 15:15 Prof. Federico Poloni (University of Pisa): Centrality measures on Markov chains, with applications to roads and infection models – M1 (M232)
We describe a couple of centrality measures on graphs that can be obtained from certain Markov chain models associated to them, and their computation with methods taken from numerical linear algebra.
The Kemeny constant is a quantity that measures the connectedness of a Markov chain by studying certain properties of the random walk associated to it. The variation in the Kemeny constant can be used to identify edges whose removal would alter the connectivity of a network; this is useful information, for instance, in planning urban and regional road networks.
For problems based on "spreading" on a graph, such as news propagation and infectious disease modelling, models based on a single random walker instead fall short: they are unable to capture characteristics of the model such as the time to saturation. We study this phenomenon and propose an alternative way to treat the full model computationally, which can be interpreted as another Markov chain with an exponential number of states. The resulting metric can once again be interpreted as a measure of the centrality of the vertices/agents in the network in the propagation.
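A minimal sketch of one standard way to compute the Kemeny constant (my illustration; conventions in the literature differ by an additive constant): using the spectral formula K = Σ_{i≥2} 1/(1 − λ_i) over the non-unit eigenvalues of the transition matrix.

```python
import numpy as np

def kemeny_constant(P):
    """Kemeny constant of an irreducible Markov chain with transition
    matrix P, via the eigenvalue formula K = sum_{i>=2} 1/(1 - lambda_i)."""
    eigs = np.linalg.eigvals(P)
    # drop the single Perron eigenvalue lambda_1 = 1
    keep = np.delete(eigs, np.argmin(np.abs(eigs - 1.0)))
    return float(np.real(np.sum(1.0 / (1.0 - keep))))
```

For a two-state chain with switching probabilities a and b, the eigenvalues are 1 and 1 − a − b, so K = 1/(a + b); re-computing K after deleting an edge is the kind of perturbation question the talk applies to road networks.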

- 11.4. 15:15 Kash Barker, Ph.D. (University of Oklahoma, USA): Two-Stage Stochastic Program for Environmental Refugee Displacement Planning – M1 (M232)
Forced displacement is a global problem that requires planning for the relocation and integration of displaced people. Most studies focus on conflict-driven forced displacement, and hence on the refugee resettlement problem. These studies generally focus on short-term planning and assume that demand within the fixed time interval is given. However, forced displacement, including environmental as well as conflict-driven displacement, is not a one-time event. On the contrary, it is an ongoing and long-term process with dynamic parameters. We are interested in the long-term displacement problem, especially for climate-driven cases in which people will be forced to leave uninhabitable regions to escape slow-onset climate change impacts such as water stress, crop failure, and sea level rise. To reflect the long-term planning requirements of the climate-driven displacement problem in the parameters and the model, we propose a two-stage stochastic program in which demand uncertainty is represented with various demand scenarios, demand and capacity are managed dynamically, and integration outcomes and related costs are optimized.

- 14.3. 15:15 Iván Blanco Chacón (University of Alcalá, Madrid): From Number Theory to Postquantum Cryptography: Ten Years (at Least) of Travel – U6 (U149)
Euler did not conceive his celebrated theorem as an efficient way to encipher messages, but two centuries later his result underpins the omnipresent RSA cryptosystem. Neither Abel nor Poincaré was especially concerned with how to communicate messages securely when they tackled elliptic integrals, and still, elliptic curves are at the basis of the SSL and TLS Internet protocols.
With the frantic development of quantum computing (IBM announced Osprey, a 433-qubit processor, three months ago, beating its already commercialised 21-qubit QSystem1), we must put ourselves on guard as soon as possible. This is the reason why NIST launched a public contest in 2017 to standardise postquantum cryptographic primitives, recently resolved in July 2022. However, the mathematical tools backing these new proposals are, if not more complicated, at least more challenging than the previous ones.
The goal of my talk is to survey the research lines I have developed from 2011 until now: a journey which started in Barcelona with such ethereal topics as Shimura curves, modularity and p-adic L-functions, and which has led me to questions such as designing efficient codes and cryptanalysing postquantum primitives, while still working on more mystic maths in my free time.

- 14.2. 15:15 Prof. Anita Schöbel (RPTU Kaiserslautern and Fraunhofer ITWM): Robust multi-objective optimization – U6 (U149)
Most real-world optimization problems contain parameters which are not known at the time a decision is to be made. In robust optimization one specifies the uncertainty in a scenario set and tries to hedge against the worst case.
Classical robust optimization aims at finding a solution which is best in the worst-case scenario. It is a well-studied concept, but it is known to be very conservative: a robust solution comes at a high price in its nominal objective function value. This has motivated researchers to introduce less conservative robustness concepts over the last decade. Moreover, many real-world problems involve not one but multiple criteria. While robust single-objective optimization has been investigated for 25 years, robust multi-objective optimization is a new field in which even the definition of "robust" is a challenge.
In the talk, several robustness concepts will be discussed and illustrated with applications from public transport.
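For reference (standard notation, not necessarily the speaker's): for an uncertain problem with objective f(x, ξ) and scenario set 𝒰, the classical (min-max) robust counterpart hedges against the worst case,

```latex
\min_{x \in X} \; \sup_{\xi \in \mathcal{U}} f(x, \xi)
\quad \text{s.t.} \quad g(x, \xi) \le 0 \;\; \forall \xi \in \mathcal{U},
```

and the conservatism of this formulation is what the less conservative robustness concepts mentioned above, and their multi-objective analogues, try to relax.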