I am a postdoc working on machine learning methods and applications for astrophysics.

I did my PhD and master's degree in physics, and my bachelor's degrees in both math and physics, at the Ludwig-Maximilians-Universität München.

My PhD thesis was supervised by Torsten Enßlin.

My interests are artificial intelligence, programming, and Bayesian statistics.

The main objective of my PhD has been mapping the interstellar dust within our corner of the Milky Way. This inference task, with hundreds of millions of parameters, was carried out on a supercomputer and combines the data of many stellar surveys. The resulting map covers a volume of about (700 pc)³ at a pixel resolution of 1 pc (one pc is about 3.3 light years). Thanks to the Bayesian method designed specifically for this task, we obtain very precise distance estimates for all larger dust clouds, along with uncertainty estimates on all results.

Connect Four is a small strategic board game. In this free-time project, I implemented a simple Q-learner that learns to play the game via self-play. My code implements the network, environment, and memory buffer from scratch.
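To illustrate the core idea, here is a minimal sketch of the Q-learning update on a toy chain environment. This is not the project's actual code (which uses a neural network and the Connect Four rules); it is a tabular stand-in showing the same update rule, with hyperparameters chosen for illustration only.

```python
import random

# Toy chain environment: states 0..4, actions step left (-1) or right (+1),
# reward 1.0 upon reaching the rightmost state. A tabular stand-in for the
# Connect Four environment; the actual project uses a network instead of Q.
N_STATES, ACTIONS = 5, [-1, +1]
alpha, gamma, eps = 0.5, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    s2 = min(max(s + a, 0), N_STATES - 1)
    return s2, (1.0 if s2 == N_STATES - 1 else 0.0), s2 == N_STATES - 1

random.seed(0)
for episode in range(200):
    s, done = 0, False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
        target = r + (0.0 if done else gamma * max(Q[(s2, a2)] for a2 in ACTIONS))
        Q[(s, a)] += alpha * (target - Q[(s, a)])
        s = s2

# After training, the greedy policy moves right from every non-terminal state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
```

In the actual project, the table `Q` is replaced by a network evaluated on the board state, and self-play means both players draw their moves from the same learner.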

NIFTy is a Python software library for Gaussian process regression. Its core feature since version 1 has been the consistent treatment of the discretization of continuous fields. Since version 5, it also provides automatic differentiation and the modern inference algorithm Metric Gaussian Variational Inference. I have been contributing to the project since version 3, implementing features and bringing in ideas on the fundamental structure of the package.
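For readers unfamiliar with the underlying method, here is a minimal NumPy sketch of plain Gaussian process regression with an RBF kernel. This is purely illustrative: it is not NIFTy's interface (which is operator-based and handles field discretization consistently), and the kernel, length scale, and noise level are arbitrary choices.

```python
import numpy as np

def rbf(a, b, length=0.2):
    """Squared-exponential kernel between two 1D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)  # noisy observations
xs = np.linspace(0, 1, 100)                                # evaluation grid

noise = 0.1**2
K = rbf(x, x) + noise * np.eye(x.size)
# GP posterior mean and pointwise uncertainty on the evaluation grid.
mean = rbf(xs, x) @ np.linalg.solve(K, y)
cov = rbf(xs, xs) - rbf(xs, x) @ np.linalg.solve(K, rbf(x, xs))
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

NIFTy's distinguishing feature is that the field lives on a continuum and the discretization (the grids above) is handled consistently, so refining the resolution does not change the answer.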

Neural networks suffer from the vanishing/exploding gradient problem. In this small free-time project, I investigated the effect of multiplying the gradient with static positive weights. The weights are obtained through statistical considerations, such that the expected magnitude of the gradient is a fixed percentage of the weights' magnitude when training begins. This leads to faster training at the beginning of the first epoch, but standard approaches overtake it later on. If the weights that precondition the gradient could be updated during training, it might be faster overall.
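A rough sketch of the idea, under assumptions of mine: the project derives the factors analytically from statistical considerations, whereas here I simply measure the per-layer gradient magnitude once at initialization and fix the scale factors from it. Network size, target percentage, and step count are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))
y = rng.normal(size=(64, 1))
W1 = rng.normal(size=(10, 32)) * 0.1  # two-layer toy network
W2 = rng.normal(size=(32, 1)) * 0.1

def grads(W1, W2):
    h = np.tanh(X @ W1)
    pred = h @ W2
    d = 2 * (pred - y) / len(X)            # d MSE / d pred
    gW2 = h.T @ d
    gW1 = X.T @ ((d @ W2.T) * (1 - h**2))  # backprop through tanh
    return gW1, gW2

def mse(W1, W2):
    pred = np.tanh(X @ W1) @ W2
    return float(np.mean((pred - y) ** 2))

# Static per-layer factors, chosen so the scaled gradient's magnitude is a
# fixed percentage of the weights' magnitude at the start of training.
target = 0.01
g1, g2 = grads(W1, W2)
c1 = target * np.linalg.norm(W1) / np.linalg.norm(g1)
c2 = target * np.linalg.norm(W2) / np.linalg.norm(g2)

loss_before = mse(W1, W2)
# The factors stay fixed during training; only the weights are updated.
for _ in range(10):
    g1, g2 = grads(W1, W2)
    W1 -= c1 * g1
    W2 -= c2 * g2
loss_after = mse(W1, W2)
```

Because the factors never adapt, the relative step size drifts as training progresses, which matches the observation above that adaptive standard methods eventually overtake this scheme.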

Here is a list of publications I contributed to. You can also find me on Google Scholar.

Solves the information-theoretic problem of approximating a Gaussian likelihood with another Gaussian likelihood and shows proof-of-concept applications in the non-Gaussian regime.

Identifying and interpreting the most relevant components seen in the multi-frequency sky using a variant of the variational autoencoder.

Comparing the newest version of the RESOLVE algorithm for interferometric radio imaging with two variants of CLEAN, the standard algorithm in the field.

Ultraviolet measurements provide evidence that the 2019–2020 dimming event of Betelgeuse was caused by an outflow of material.

An improvement of our previous 3D dust map using combined data of Gaia, 2MASS, Pan-STARRS, and AllWISE.

A Bayesian reconstruction of the black hole at the center of M87 using data from the Event Horizon Telescope.

Bayesian imaging of radio-interferometric data making use of the spatial correlations of the science target, as well as the temporal correlation of the systematic instrument calibration errors.

Inferring a dynamical process together with its dynamics, exploiting the locality and causality of those dynamics.

Highly resolved maps of cosmic dust within 300 pc are computed from Gaia DR2 data using Gaussian process regression.

Simulation schemes for non-linear PDEs can be derived by minimizing information loss.

A new version of the NIFTy software package for high-dimensional, high-performance image reconstruction.

A general function that ranks approximations of probability distributions is derived from simple axioms.

Expectation values of Gaussian distributions can be calculated using tricks from differential geometry.