My Projects | Overview
Over the years, I have led and contributed to a variety of projects of varying scope and complexity. These range from in-house prototypes designed to enhance the user experience in radiology to academic and R&D projects such as vision-based robot swarms and agent-based models. I take pride in contributing to projects that are exciting, interdisciplinary, and technically challenging, whether in industry or academia. Most of the projects below were independently managed, and through them I’ve learned that precise, thoughtful planning creates the foundation for creative experimentation.
Vision-based Swarm
A flocking robot swarm using cameras only.
Agent-based Modeling
Perception-driven modeling projects.
Collective Foraging
Studying sociality in simulated and human groups.
CoBeXR
A spatial augmented reality system.
Human | Swarm
A human-swarm interaction study.
ModCollBehavior25
A university course I designed and organized.
Neuroimaging
My work with 2-photon microscopes.
Medical Tech
Software prototyping for improved radiology.
VSWRM | A vision-based swarm
In this project, I developed a vision-based robot swarm that uses computer vision to achieve flocking behavior. The robots are equipped with cameras and use visual information to navigate and interact with each other, creating a dynamic and adaptive swarm. Our project was highlighted on the front page of npj Robotics (Nature Portfolio).
The project involved designing the robots' hardware and implementing their software architecture purely in Python. The swarm is fully decentralized: robots do not communicate with each other or with any central element. They rely only on their on-board cameras and computation to coordinate their movements. With this minimalist solution, we achieved ordered collective movement (flocking).
To enable AI-enhanced vision, I trained a convolutional neural network (CNN) small enough to detect other robots on the robots' limited edge-computing hardware (Raspberry Pi). I further increased the detection rate by delegating the costly inference to a Google Coral TPU. This solution allowed us to implement the world's first truly vision-only swarm under strict bio-plausible constraints (no tracking, no communication, no memory, all computation on-board).
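To give a flavor of how purely visual input can drive flocking, the sketch below maps camera detections (bearings and apparent sizes of neighbors) to motor commands. The gains, the area-to-distance proxy, and the rule itself are illustrative assumptions, not the published controller.

```python
import numpy as np

def steering_from_detections(bearings, areas, k_turn=0.6, k_speed=0.3,
                             target_area=0.02):
    """Map visual detections of neighboring robots to motor commands
    (illustrative sketch, not the controller from the paper).
    bearings: angles (rad) of detections relative to heading,
              negative = left, positive = right.
    areas:    fraction of the image each detection covers, a crude
              proxy for proximity.
    Returns (turn_rate, forward_speed)."""
    if len(bearings) == 0:
        return 0.0, 1.0  # no neighbors in view: cruise straight

    bearings = np.asarray(bearings)
    areas = np.asarray(areas)

    # Turn toward far neighbors (attraction) and away from neighbors
    # looming larger than the target size (implicit repulsion).
    weights = np.clip(target_area - areas, -1.0, 1.0)
    turn_rate = k_turn * float(np.sum(weights * np.sin(bearings)))

    # Slow down when neighbors loom large ahead of the robot.
    frontal = np.cos(bearings) > 0
    loom = float(np.sum(areas[frontal])) if frontal.any() else 0.0
    forward_speed = 1.0 + k_speed * (target_area - loom)

    return turn_rate, forward_speed
```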
ABM | Agent-based modeling
In my doctoral research, I developed an agent-based modeling framework on top of a widely used game engine (pygame) to simulate multiple types of collective behavior. The framework includes example behaviors such as collective motion, foraging, and signaling, each of which equips agents with realistic vision to interact with their surroundings.
This solution not only allows real-time user interaction with the resulting models (inherited from the game engine) but also achieves highly optimized simulation times in Python, thanks to computations delegated to the pygame engine. I furthermore ported this solution to a high-performance computing (HPC) cluster using containerization (Singularity) and Slurm scripting.
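For a rough feel of how such a framework is structured, here is a minimal pygame agent loop; all class and parameter names are assumed for illustration and do not reflect the framework's actual API.

```python
import numpy as np
import pygame

class Agent(pygame.sprite.Sprite):
    """A minimal random-walking agent (illustrative names/parameters)."""

    def __init__(self, pos, heading, speed=2.0):
        super().__init__()
        self.pos = np.array(pos, dtype=float)
        self.heading = heading
        self.speed = speed
        self.image = pygame.Surface((6, 6))
        self.image.fill((200, 60, 60))
        self.rect = self.image.get_rect(center=(int(pos[0]), int(pos[1])))

    def update(self, bounds):
        # Small random reorientation, then a forward step.
        self.heading += np.random.normal(0.0, 0.1)
        self.pos += self.speed * np.array(
            [np.cos(self.heading), np.sin(self.heading)])
        self.pos %= bounds  # periodic boundaries
        self.rect.center = (int(self.pos[0]), int(self.pos[1]))

def run(n_agents=50, bounds=(600, 600)):
    pygame.init()
    screen = pygame.display.set_mode(bounds)
    clock = pygame.time.Clock()
    agents = pygame.sprite.Group(
        [Agent(np.random.rand(2) * bounds, np.random.rand() * 2 * np.pi)
         for _ in range(n_agents)])
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        agents.update(np.array(bounds, dtype=float))
        screen.fill((20, 20, 20))
        agents.draw(screen)
        pygame.display.flip()
        clock.tick(60)  # cap the frame rate; this also keeps interaction live
    pygame.quit()

if __name__ == "__main__":
    run()
```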
The resulting framework served as the basis for bachelor-level modeling courses at Humboldt University (Berlin) for several consecutive years. It also formed the foundation of most of my research projects involving agent-based simulations. Finally, the same framework has been integrated into the mixed reality system we developed, allowing real-time embodied interaction with virtual worlds.
Collective Foraging Studies
With this vision-based modeling framework, we were able to study what constitutes optimal sociality during collective foraging. We found that not only does the optimal level of social information use change across foraging environments, but the capabilities of individuals in the group (such as their visual field of view) can fundamentally change collective behavior.
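To make "social information use" concrete, here is a toy producer-scrounger-style decision rule of the kind such agents follow. The names, parameters, and the rule itself are illustrative simplifications, not the model from the study.

```python
import numpy as np

def choose_action(own_patch_quality, visible_exploiters, w_social=0.5):
    """Toy foraging decision (illustrative, not the studied model):
    keep exploiting a rewarding patch; otherwise join a visible
    successful agent with probability w_social, else explore alone.
    Limited vision (e.g., a narrow field of view) enters through
    what ends up in `visible_exploiters`."""
    if own_patch_quality > 0:
        return "exploit"      # stay on the current rewarding patch
    if visible_exploiters and np.random.rand() < w_social:
        return "relocate"     # move toward a visible exploiting agent
    return "explore"          # individual random search
```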
We validated our findings in human groups foraging in an immersive reality game. Participants had to search for hidden coin sources together, either by joining others or by exploring alone. We showed that in complex animals such as humans, incentives modify overall group behavior in addition to the factors mentioned above.
Our simulation and human behavioral studies extend each other, forming an analytic-synthetic loop in which each continuously informs the other. For example, we found that, unlike simplistic simulated agents, human participants used highly individual exploration patterns in the arena, patterns that could identify individuals much like handwriting.
📰 Media & Outreach
💬 Social Media (LinkedIn)
CoBeXR | A Spatial Augmented Reality System
CoBeXR is a spatial augmented reality system consisting of four projectors and eight tracking cameras (OptiTrack) mounted on an adjustable metal frame. The system creates a perception-action loop in which tracked objects (for example, tracked humans) can influence the projected landscape in real time. Our software stack achieves a 45 Hz feedback rate from action to perception, making the system truly real-time.
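At its core, such a system runs one perception-action cycle per frame. The skeleton below shows the general shape of one iteration; `tracker`, `simulation`, and `projector` are placeholder objects, and the real stack (OptiTrack streaming, the fish model, multi-projector blending) is far more involved.

```python
import time

TARGET_HZ = 45            # the feedback rate quoted above
DT = 1.0 / TARGET_HZ

def run_loop(tracker, simulation, projector, n_frames=450):
    """Skeleton of a tracking -> simulation -> projection loop
    (placeholder objects, not the actual CoBeXR API)."""
    for _ in range(n_frames):
        t0 = time.perf_counter()

        poses = tracker.latest_poses()       # tracked objects (e.g., a cane)
        simulation.step(DT, external=poses)  # agents react to tracked objects
        frame = simulation.render()          # draw the virtual landscape
        projector.show(frame)                # project onto the floor

        # Sleep off the remaining frame budget to hold ~45 Hz.
        elapsed = time.perf_counter() - t0
        if elapsed < DT:
            time.sleep(DT - elapsed)
```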
Such spatial augmented reality systems have a wide range of possible applications, from education to robotics, psychology studies, and even visual arts. In its current form, the system runs a collective fish model we transferred to it: a rendered fish school is projected onto the ground, and users can interact with it using a tracking cane.
Beyond humans, robots can also interact with these virtual swarms if tracked. This enables human-swarm and robot-swarm interaction studies that would otherwise be infeasible due to the complexity of such multi-agent systems. The real-time interaction between simulated projection and biological agents also enables interactive visual art installations and unique forms of science communication.
📰 Media & Outreach
💬 Social Media (LinkedIn)
Human-swarm Interaction Study Using CoBeXR
CoBeXR allows the tracking of fine-grained motion strategies of biological agents (including humans) to control virtual landscapes and their underlying simulations. In this project, humans were detected by virtual fish as "predators," eliciting an escape response. Our main question was whether humans can anticipate and control the dynamics of the virtual fish swarm without any prior knowledge—only through simple walking movement.
Such a task is non-trivial: complex swarm dynamics emerge from interactions between many individually simple components, and these interactions are often not fully predictable, especially when the interaction rules within the swarm are unknown. Nonetheless, if swarm systems are to become an everyday reality, humans will need to anticipate and control their behavior without complicated interfaces, using natural movement.
To study this question, we allowed human participants to "attack" the fish school in bouts and try to reproduce the "fountain effect," in which the fish split in front of the predator, form arches, and rejoin behind the threat. This complex emergent pattern is not easy to reproduce, and participants were rewarded after each successful fountain. We were interested in whether, and how, humans could learn to reproduce the pattern reliably.
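One way to quantify such an event per frame is to measure how evenly the school splits across the predator's heading and how much of it ends up behind the predator. The metric below is a hypothetical illustration, not the criterion used in the study.

```python
import numpy as np

def fountain_score(fish_pos, pred_pos, pred_heading):
    """Hypothetical per-frame fountain indicator (illustrative only):
    1.0 when the school splits evenly to both sides of the predator's
    heading and sits entirely behind it, 0.0 otherwise.
    fish_pos: (N, 2) positions; pred_pos: (2,); pred_heading: radians."""
    rel = fish_pos - pred_pos                    # fish relative to predator
    heading = np.array([np.cos(pred_heading), np.sin(pred_heading)])
    normal = np.array([-heading[1], heading[0]])

    along = rel @ heading                        # ahead (+) / behind (-)
    side = rel @ normal                          # left (+) / right (-)

    left = float(np.mean(side > 0))              # fraction on the left side
    evenness = 1.0 - abs(2.0 * left - 1.0)       # 1.0 for a perfect split
    behind = float(np.mean(along < 0))           # fraction behind predator
    return evenness * behind
```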
Introduction to Modeling Collective Behavior (2025 Summer Semester)
Funded by the Berlin Senate, I had the unique opportunity to design and organize my own master’s-level university course at Technische Universität Berlin. The course focused on modeling approaches for studying collective behavior. Students had the chance to learn about collective behavior in nature and swarm robotics from distinguished international speakers. The seminar series concluded with a hybrid fishbowl discussion on the ethical considerations of collective behavior research and swarm robotics.
After the seminar series, participants explored the concept of agent-based modeling and, under my supervision, implemented their own models of collective behavior using the agent-based modeling framework I developed during my doctoral research. Students used the framework to simulate various collective behaviors, such as collective motion or SIR epidemiological models, and could interact with their models as they would with a game.
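To illustrate the kind of model students built, here is a minimal spatial SIR update step. The structure, names, and parameters are illustrative and are neither a student submission nor the framework's actual API.

```python
import numpy as np

def sir_step(states, positions, beta=0.3, gamma=0.05, radius=10.0):
    """One step of a toy spatial SIR model (illustrative parameters).
    states: array of "S", "I", or "R" per agent; positions: (N, 2)."""
    new_states = states.copy()
    infected_pos = positions[states == "I"]
    for i in np.flatnonzero(states == "S"):
        # A susceptible agent near any infected agent may get infected.
        if infected_pos.size and (
            np.linalg.norm(infected_pos - positions[i], axis=1) < radius
        ).any():
            if np.random.rand() < beta:
                new_states[i] = "I"
    # Infected agents recover with probability gamma per step.
    recovered = (states == "I") & (np.random.rand(len(states)) < gamma)
    new_states[recovered] = "R"
    return new_states
```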
Finally, students transferred their models to CoBeXR and explored them through embodied interaction (by moving a tracking cane) within the projected arena. Depending on their implementation, participants could explore different parameter combinations while immersed in their own virtual worlds.
See Video on LinkedIn
Course Components
Course Timeline
Neuroimaging and Optogenetics
I worked with multiple neuroimaging technologies, including resonant and acousto-optic ultrafast 3D two-photon microscopes, to study the three-dimensional functional architecture of the mouse brain. To visualize neurons and smaller functional elements such as dendritic spines, model animals were first "tagged" with a fluorescent protein that translates neuronal activity into light signals.
By presenting controlled visual stimuli to the animals, such as moving stripes, while simultaneously recording neuronal activity, we could decipher how neurons and dendrites encode visual stimuli in the mouse brain. To do so, we designed immersive technologies that allowed the animals to experience the visual stimuli while being recorded by the microscope.
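A standard first step in analyzing such functional recordings is converting raw fluorescence traces to relative changes (dF/F). The sketch below shows the general idea using a rolling-percentile baseline; the window and percentile are assumed values, not our exact pipeline.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=20, window=300):
    """Compute dF/F for a single fluorescence trace (illustrative:
    the rolling-percentile baseline, window, and percentile are
    assumptions, not the exact analysis pipeline used).
    trace: 1D array of raw fluorescence samples."""
    trace = np.asarray(trace, dtype=float)
    f0 = np.empty_like(trace)
    for t in range(len(trace)):
        lo = max(0, t - window)
        # Baseline: a low percentile of the recent past.
        f0[t] = np.percentile(trace[lo:t + 1], baseline_percentile)
    return (trace - f0) / np.maximum(f0, 1e-9)
```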
Furthermore, we used light-sensitive proteins that allowed us not only to record but also to control neuronal activity with targeted light stimulation in specific brain areas. This allowed us to decipher fine-grained, hidden input-output characteristics of the visual cortex.
Medical Technology
During my work at medneo GmbH as a prototype developer, I built multiple in-house software tools to facilitate the management of radiological data and patient forms. These included an MRI sequence organizer, a desktop application written in Python and PyQt that lets radiologists easily manage and select MRI sequences for different examinations. Additionally, I designed and implemented an automated patient form generation prototype, which was later deployed and integrated into the company's main infrastructure as a widely used in-house tool.
The tool allows personnel to centrally manage patient-form-related information for hundreds of contracted imaging centers via a commercially available CMS. When the underlying data change, the tool automatically regenerates the affected patient forms and exports them to a shared server for the convenience of imaging center personnel. Solving this problem is not straightforward: legal requirements differ across German states, vastly different forms share or vary information blocks, a single contractor may operate multiple practices, and various other factors make traditional database-driven approaches infeasible.
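Conceptually, the regeneration step boils down to an inverted index from shared content blocks to the forms that embed them: when a block changes, every form using it is rebuilt. The sketch below illustrates this idea with hypothetical names and data; the production tool derives this structure from the CMS.

```python
from collections import defaultdict

def build_block_index(forms):
    """Invert a form -> content-blocks mapping so a changed block can
    be traced to every form embedding it (hypothetical structure)."""
    index = defaultdict(set)
    for form_id, block_ids in forms.items():
        for block_id in block_ids:
            index[block_id].add(form_id)
    return index

def affected_forms(changed_blocks, index):
    """Forms that must be regenerated after the given block changes."""
    out = set()
    for block_id in changed_blocks:
        out |= index.get(block_id, set())
    return out

# Hypothetical example: a privacy block shared by two centers and a
# state-specific consent block used by only one of them.
forms = {
    "center_A/mrt_consent": {"privacy_v3", "consent_berlin"},
    "center_B/mrt_consent": {"privacy_v3", "consent_bavaria"},
}
index = build_block_index(forms)
print(affected_forms({"privacy_v3"}, index))       # both forms affected
print(affected_forms({"consent_bavaria"}, index))  # only center_B's form
```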
Furthermore, I contributed to a patented technology, developed by the company, for the semi-automated quality assessment of medical images. Our solution allows the company, together with radiologists, to rate the quality of images produced at a center and to automatically optimize it by suggesting imaging parameters to the personnel.