I am an ML Research Engineer at the ELLIS Institute Tübingen, where I work on post-training of multilingual LLMs. Previously, I was a PhD student in Frank Hutter's Machine Learning Group at the University of Freiburg.
Past projects:
- Improving LLM-based Global Optimization with Search Space Partitioning (under review at ICLR 2026)
- Beyond Random Augmentations: Pretraining with Hard Views (1st author, ICLR 2025) [code]
- Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How (2nd author, ICLR 2024, oral) [code]
- Zero-Shot AutoML with Pretrained Models (1st author, ICML 2022, spotlight) [code]
- Learning Environments for Reinforcement Learning (1st author, ICLR 2022) [code]
My PhD thesis: Meta-Learning and Synthetic Data for Automated Pretraining and Finetuning (supervised by Prof. Frank Hutter)
Open-source projects:
🧠 Conditional Density Estimation (CDE) – Benchmarks and baseline implementations for conditional density estimation.
🎯 Zero-Shot AutoML with Pretrained Models – Zero-shot selection of strong pretrained models without training.
🎥 video2tfrecord – Convert raw video datasets into scalable TFRecord pipelines.
🤖 mppi_pendulum – Minimal MPPI control implementation for the classic pendulum task.
🧮 Generative Symbolic Regression – Neural translation from tabular data to concise LaTeX equations.
📊 tailgrid – Terminal-based monitoring and visualization of long-running logs.
🧩 slurmfrag – Fine-grained experiment fragmentation and management for SLURM clusters.
🧴 perfume-compare (private) – Browser extension ranking fragrances by price per milliliter.
🌸 perfumefinder.ai (private) – AI-powered personal perfume recommendation assistant.
