
Talks



Date Time Venue Talk
06/18/25 12:00 pm Am Schwarzenberg-Campus 3 (E), Room 3.074 An H-LU Block Preconditioner for the RBF-FD discretized Oseen equations
Michael Koch

The Radial Basis Function Finite Difference method (RBF-FD) is a meshless method for solving partial differential equations that has gained popularity over the last two decades. It provides an alternative to the Finite Element and Finite Difference methods, as it can offer a high order of convergence as well as geometric flexibility. The application of the RBF-FD method to steady-state fluid flow problems (e.g. the Stokes or Oseen equations) poses some challenges for iterative solvers, as the resulting saddle point systems are completely non-symmetric (even structurally non-symmetric) and there can be many non-zero entries per row (for high-order discretizations).

In this talk we will briefly introduce the RBF-FD method and discuss the characteristics of the resulting saddle point matrix. Then we will present a block preconditioner which utilizes hierarchical matrices. We demonstrate the effectiveness of this preconditioner with numerical results for the three-dimensional Oseen equations for challenging convection directions and domains.
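
As background for the abstract (not part of it), the discretized Oseen problem leads to a saddle point system, and a block preconditioner of the kind named in the title can be sketched roughly as follows. The exact blocks and the precise form used in the talk are assumptions here, meant only for orientation:

\[
\begin{pmatrix} A & G \\ D & 0 \end{pmatrix}
\begin{pmatrix} u \\ p \end{pmatrix}
=
\begin{pmatrix} f \\ g \end{pmatrix},
\qquad
P =
\begin{pmatrix} \widehat{A} & G \\ 0 & -\widehat{S} \end{pmatrix},
\qquad
\widehat{S} \approx D\,\widehat{A}^{-1} G,
\]

where, for an RBF-FD discretization, the velocity block A and the discrete divergence/gradient blocks D and G are in general non-symmetric (D need not equal the transpose of G), and the approximate velocity block and Schur complement would be applied via hierarchical (H-)LU factorizations inside a Krylov method such as GMRES.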

06/13/25 10:00 am Building N, Room 0007 For What the Bell Tolls*
David Keyes, Computer, Electrical and Mathematical Sciences and Engineering, King Abdullah University of Science and Technology, Saudi Arabia

With today’s exascale computers requiring 20 to 40 MW and some cloud centers exceeding 100 MW, with no slackening of demand in sight, computing is a nonnegligible factor in climate change. For the past three years, we have been finalists in the Gordon Bell Prize with computations that do more with less – that scale up while squeezing out operations and data transfers that do not ultimately impact application accuracy requirements. Scientific and engineering computing has a history of “oversolving” inherited from a period when its cost was small enough to neglect. Today’s market for computing hardware is driven by machine learning applications that are able to exploit lower precision arithmetic. Traditional computational science and engineering are therefore being reinvented to employ lower precision arithmetic and to replace blocks of operator and field data by low-rank substitutes, where possible without impacting accuracy. We provide examples from various applications, including Gordon Bell Prize finalist research in 2022 in environmental statistics, in 2023 in seismic processing, and in 2024 in genomics and again in climate emulation. The last was awarded the 2024 Gordon Bell Prize in Climate Modeling. In this talk, we will elucidate the algorithmic “secret sauce” shared by these diverse applications for which the (Gordon) Bell tolls.
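
As a rough, generic illustration of the low-rank idea mentioned above (this is not the speaker's code; the kernel, tolerance, and function names are made up for the example), a dense operator block can be replaced by a truncated factorization stored in reduced precision:

import numpy as np

def compress_block(A, tol, dtype=np.float32):
    # Illustration only: replace a dense block A by low-rank factors U and SVt,
    # truncated so that the spectral-norm error is (approximately) below
    # tol * ||A||, and stored in reduced precision (float32 instead of float64).
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = len(s) - int(np.searchsorted(s[::-1], tol * s[0]))  # rank needed for tol
    k = max(k, 1)
    return U[:, :k].astype(dtype), (s[:k, None] * Vt[:k]).astype(dtype)

# Toy usage: a smooth kernel block admits a very low-rank representation.
x = np.linspace(0.0, 1.0, 200)
A = 1.0 / (1.0 + np.abs(x[:, None] - x[None, :] + 2.0))
U, SVt = compress_block(A, tol=1e-6)
print("rank:", U.shape[1],
      "relative error:", np.linalg.norm(A - U @ SVt) / np.linalg.norm(A))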

06/10/25 09:00 am Am Schwarzenberg-Campus 3 (E), Room 3.074 Bachelor's thesis: Denoising of solutions of the Maxey-Riley-Gatignol equation using machine learning
Durmus Alas

05/14/25 12:00 pm Am Schwarzenberg-Campus 3 (E), Room 3.074 and Zoom Parallel-in-Time Methods with an ML-based coarse propagator
Abdul Qadir Ibrahim

Iterative parallel-in-time algorithms like Parareal can extend scaling beyond the saturation of purely spatial parallelization when solving initial value problems.
However, they require the user to build coarse models to handle the inevitable serial transport of information in time.
This is a time-consuming and difficult process since there is still limited theoretical insight into what constitutes a good and efficient coarse model.
Novel approaches from machine learning to solve differential equations could provide a more generic way to find coarse-level models for parallel-in-time algorithms.
This talk demonstrates that a physics-informed Fourier Neural Operator (PINO) is an effective coarse model for the parallelization in time of the two-asset Black-Scholes equation using Parareal.
We demonstrate that PINO-Parareal converges as fast as a bespoke numerical coarse model and that, in combination with spatial parallelization by domain decomposition, it provides better overall speedup than both purely spatial parallelization and space-time parallelization with a numerical coarse propagator.
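
For readers unfamiliar with Parareal, a minimal sketch of the iteration is given below. This is a toy illustration, not the speaker's implementation; the coarse propagator argument (coarse in the sketch) is the component that a learned surrogate such as a physics-informed neural operator would replace:

import numpy as np

def parareal(u0, t0, t1, n_slices, fine, coarse, n_iter):
    # fine(u, ta, tb) and coarse(u, ta, tb) propagate the state u from ta to tb;
    # coarse is the cheap serial model handling the transport of information in time.
    ts = np.linspace(t0, t1, n_slices + 1)
    U = [u0]                                   # serial coarse sweep: initial guess
    for n in range(n_slices):
        U.append(coarse(U[-1], ts[n], ts[n + 1]))
    for _ in range(n_iter):
        # fine propagation of each slice is independent -> parallel in time
        F = [fine(U[n], ts[n], ts[n + 1]) for n in range(n_slices)]
        U_new = [u0]                           # serial correction sweep
        for n in range(n_slices):
            U_new.append(coarse(U_new[-1], ts[n], ts[n + 1])
                         + F[n] - coarse(U[n], ts[n], ts[n + 1]))
        U = U_new
    return U

# Toy usage: u' = -u; the fine propagator takes 100 Euler steps per slice, the coarse one 1.
euler = lambda u, ta, tb, m: u * (1.0 - (tb - ta) / m) ** m
U = parareal(1.0, 0.0, 2.0, n_slices=8,
             fine=lambda u, a, b: euler(u, a, b, 100),
             coarse=lambda u, a, b: euler(u, a, b, 1),
             n_iter=3)
print(U[-1], np.exp(-2.0))   # Parareal result vs. exact solution at t = 2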

Zoomlink:
https://tuhh.zoom.us/j/81920578609?pwd=TjBmYldRdXVDT1VkamZmc1BOajREZz09

04/22/25 10:00 am Am Schwarzenberg-Campus 3 (E), Room 3.074 Master's thesis: Simulation of Waves in a Wave Flume with Bathymetry Using the Euler Equations
Christoph Zetek

04/15/25 10:00 am Am Schwarzenberg-Campus 3 (E), Room 3.074 Bachelor's thesis: Physics-informed Gaussian process regression
Salva Iqbal

04/14/25 09:00 am Am Schwarzenberg-Campus 3 (E), Room 3.074 Data-Driven Koopman Operator for the Maxey-Riley-Gatignol Equation [Bachelor's thesis]
Argjent Zulfiu

04/09/25 03:00 pm Am Schwarzenberg-Campus 3 (E), Room 3.074 Counting relative to random sets
Peter Allen, Department of Mathematics, London School of Economics

Conlon and Gowers in 2016 described a general approach to proving sparse random analogues of extremal results in combinatorics, such as bounding the minimum and maximum number of triangles in any subgraph of G(n,p) with a given number of edges. The general part of this approach is a functional-analytic statement which, given a sparse setting, constructs a dense model. However, there is a condition which must be shown to hold with high probability to apply the dense model theorem. In Conlon and Gowers' work, there is a technical difficulty with the probabilistic part which leads to a rather involved proof, which applies only in a restricted setting (for example, they can handle triangles but not triangles with a pendant edge), and with quite poor bounds on 'high probability'.

We revisit Conlon and Gowers' approach, and show how to avoid their technical problem, giving a simpler proof of their counting result which applies in a general setting and with optimal probability bounds. As a corollary, we prove the 'Counting KLR' theorem of Conlon, Gowers, Samotij and Schacht, but for general hypergraphs and with optimal probability bounds. This is joint work with Julia Boettcher, Joanna Lada and Domenico Mergoni.

04/08/25 02:00 pm Am Schwarzenberg-Campus 3 (E), Room 3.074 Interval computations in the problem of collective decision-making
Olga Zhukovska

Approximate topics of the talk (a small illustrative sketch of interval arithmetic follows the list):
1. Problems of interval computation.
2. The relationship between interval analysis and probability theory.
3. The Bayesian model of collective decision-making and its interval generalization.
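
Since the abstract only lists topics, here is a tiny generic example of what elementary interval computations look like (an illustration unrelated to the speaker's material):

# An interval (lo, hi) represents all real numbers between lo and hi;
# the elementary operations return the tightest enclosing interval.
def iadd(x, y):
    return (x[0] + y[0], x[1] + y[1])

def imul(x, y):
    p = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(p), max(p))

print(iadd((1, 2), (-1, 3)))   # (0, 5)
print(imul((1, 2), (-1, 3)))   # (-2, 6)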

04/02/25 12:00 pm Am Schwarzenberg-Campus 3 (E), Room 3.074 and Zoom An introduction to human-centric and interactive machine learning
Prof. Pierre-Alexandre Murena, School of Study Mechanical Engineering

Machine learning is often explored and developed in isolation from real-world contexts, primarily focusing on abstract tasks or benchmark datasets. However, when integrated into real-world applications, a crucial factor comes into play: human involvement. This talk will introduce human-centric machine learning, a growing field that emphasizes the need to account for human presence at every stage of the ML pipeline. I will present several practical examples illustrating how machine learning can be made more interactive and accessible to non-experts.

Zoomlink:
https://tuhh.zoom.us/j/81920578609?pwd=TjBmYldRdXVDT1VkamZmc1BOajREZz09


* Talk within the Colloquium on Applied Mathematics