DESY Photon Science Users’ Days
I presented our group poster “Computational Imaging@DESY” at the DESY Photon Science Users’ Days.
I was happy to give a talk on “Neuronale Netze und adversarial attacks: Wie der Panda zum Gibbon wurde” (“Neural Networks and Adversarial Attacks: How the Panda Became a Gibbon”) in the student seminar “Data Science in Research and Industry” organized by Daniel Tenbrinck.
Together with Samira Kabri, Lorenz Kuger, Lukas Weigand and Martin Burger, I presented our group poster “Computational Imaging@DESY” at the Digital Total event in Hamburg.
Together with Samira Kabri, Lorenz Kuger and Martin Burger, I organized a course at the Helmholtz Incubator Summer Academy.
I’m currently in Kelowna at the BIRS event Leveraging Model- and Data-Driven Methods in Medical Imaging. I will give a talk on our Bregman learning framework.
I will present our work on continuum limits of Lipschitz learning as a poster at FoCM 2023. You can find the poster here. It summarizes the main results of the following papers, which I wrote together with Jeff Calder and Leon Bungert:
I am currently presenting our work on Bregman learning at the Mathematics and Image Analysis conference 2023 in Berlin.
I’m currently attending the winter school on advanced methods for mathematical image analysis in Bologna. The stay is financed via the Erasmus BIP program.
A beamer template
📈 Bregman Learning
Very happy to be in New Orleans right now at NeurIPS 2022, presenting our paper on Bregman learning as a poster.
Together with Leon Bungert and Philipp Wacker, I uploaded our preprint Polarized consensus-based dynamics for optimization and sampling. The code for the algorithm can be found in our repo.
Consensus Based Optimization
Check out our new preprint Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian on arXiv. We prove the first quantitative convergence rates for the graph infinity Laplace equation fo...
This week I am visiting my colleague Leon Bungert at the Mittag-Leffler Institute in Stockholm. He is staying there for the program Geometric Aspects of Nonlinear Partial Differential Equations, and I am very happy to have the chance to visit him there.
I am very happy to announce that our paper on Bregman learning will be presented as a poster at NeurIPS 2022. You can check out the poster on ResearchGate or on GitHub.
I am very happy to announce that our paper on Uniform Convergence Rates for Lipschitz Learning on Graphs has appeared in the IMA Journal of Numerical Analysis.
fau-book
Today I was given the chance to present our work on convergence rates for the Lipschitz learning problem at the GAMM Annual Meeting 2022 in Aachen.
The paper on Bregman learning for sparse neural networks which I worked on together with Leon Bungert, Daniel Tenbrinck and Martin Burger just got published in the Journal of Machine Learning Research.
Today I had the chance to present our work on a kernelized version of consensus based optimization at the 15th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing.
Today I presented our work on convergence rates for the Lipschitz learning problem at the conference on calculus of variations in Lille.
Today I presented our work on convergence rates for the Lipschitz learning problem in our joint junior seminar. This seminar takes place within the network of the MoD community.
I was very happy to join the Curves and Surfaces conference in Arcachon. My supervisor Martin Burger organized a Minisymposium together with Daniel Tenbrinck, which can be found here.
Today I had the chance to present our work on convergence rates for the Lipschitz learning problem at the HCM - Workshop Synergies between Data Science and PDE Analysis.
Mathematik für Physikstudierende C
Today I had the chance to present our work on stochastic Bregman iterations at the East Coast Optimization Meeting (ECOM).
Together with Leon Bungert and Daniel Tenbrinck, I organized a three-part Minisymposium on Recent Advances on Stable Neural Networks for the SIAM conference IS22.
I am very happy to join the hackathon “Math Meets Image” in Berlin. It takes place from 17th to 19th March 2022 at The Classroom and is part of MATH+ Thematic Einstein Semester on “Mathematics of Imaging in Real-World Challenges”.
Together with Leon Bungert and Jeff Calder we published a preprint on Uniform Convergence Rates for Lipschitz Learning on Graphs.
Looking forward to joining the workshop Dynamics and Discretization: PDEs, Sampling, and Optimization at the Simons Institute in Berkeley.
I was very happy to give a talk for the “Mädchen und Technik 2021” (“Girls and Technology”) week together with Daniel Tenbrinck. This event allows young girls to gain an insight into the academic world.
📈 CLIP
Happy to announce that I will be presenting our paper on Bregman iterations at the 2nd Alps-Adriatic Inverse Problems Workshop, taking place in Klagenfurt during September 22–24, 2021. I am especially thrilled that my talk is scheduled to be live...
Together with Leon Bungert, Daniel Tenbrinck and Martin Burger, we published a new preprint called Neural Architecture Search via Bregman Iterations.
Together with Leon Bungert, Daniel Tenbrinck and Martin Burger, we published a new preprint called A Bregman Learning Framework for Sparse Neural Networks.
Published a new preprint called CLIP: Cheap Lipschitz Training of Neural Networks, together with Leon Bungert, Daniel Tenbrinck, Leo Schwinn and René Raab.
Published my first preprint called Continuum Limit of Lipschitz Learning on Graphs, together with Leon Bungert.