Projects

These are some examples of projects that I have worked on.

Scalable carpooling app prototype

Keywords: Kubernetes, Kafka, Vue, FastAPI, Docker, websockets, MongoDB

This project was developed as a group project for the Web and Cloud Computing course. The product is a prototype app for a carpooling service that allows employees to organize carpooling based on their schedule and itinerary, implemented using a horizontally scalable and fault-tolerant microservice architecture. The app was deployed to the Fuga public cloud. The architecture is built around Kubernetes orchestration, with a stateless FastAPI back-end, a Vue single-page application front-end, a replicated MongoDB database, an NGINX load balancer, and a Kafka event stream for reliable processing of requests.
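
To illustrate how the stateless back-end and the Kafka event stream fit together, the sketch below shows a hypothetical FastAPI endpoint that publishes incoming ride requests to a Kafka topic instead of processing them synchronously; a separate consumer service would then match riders and write results to MongoDB. The topic name, message schema, broker address, and the use of aiokafka are assumptions for the example, not details of the actual implementation.

    import json

    from aiokafka import AIOKafkaProducer
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    producer = None  # created on startup so every replica stays stateless


    class RideRequest(BaseModel):
        employee_id: str
        origin: str
        destination: str
        departure_time: str


    @app.on_event("startup")
    async def start_producer():
        global producer
        producer = AIOKafkaProducer(bootstrap_servers="kafka:9092")
        await producer.start()


    @app.on_event("shutdown")
    async def stop_producer():
        await producer.stop()


    @app.post("/rides")
    async def create_ride(request: RideRequest):
        # Publish the request as an event; a separate consumer matches riders
        # and persists the result to MongoDB, so this endpoint stays stateless.
        await producer.send_and_wait("ride-requests", json.dumps(request.dict()).encode())
        return {"status": "queued"}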

React + Redux web app

Keywords: React, Jest, Storybook, Webpack, Node.js, Redux, Yarn

This project implements a web version of the game Regenwormen (Pickomino) as a practice project for learning more about modern front-end tooling. The project was developed as a Yarn monorepo, with one package for the game logic, one package for a React front-end, and one package containing a Node.js CLI interface to the game. Jest is used for snapshot testing based on Storybook stories and simulated user interactions through user-event. The repository is configured with GitHub Actions workflows that automatically run all Jest tests and deploy the front-end bundle to a GitHub Pages site. The Redux store can be inspected using the Redux DevTools.

Demo

Signal processing pipeline for separating child heartbeat signals

Keywords: Signal processing, machine learning, time series

This was a group project for the Machine Learning course. The goal was to develop a method for extracting the (much weaker) heartbeat signal of an unborn child from a mixed signal that contains electrocardiogram (ECG) data of both mother and child. This was achieved through a combination of filtering and linear regression, leading to a simple and efficient solution to the problem.
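
The snippet below is a minimal sketch of this kind of approach, assuming the mixed abdominal signal and a separate maternal chest channel are available as NumPy arrays; the exact filters and regression setup used in the project may differ, and the cut-off frequencies are placeholders.

    import numpy as np
    from scipy.signal import butter, filtfilt


    def bandpass(signal, fs, low=1.0, high=100.0):
        """Fourth-order Butterworth band-pass filter (cut-offs are placeholders)."""
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, signal)


    def extract_fetal_ecg(abdominal, maternal, fs):
        abdominal = bandpass(abdominal, fs)
        maternal = bandpass(maternal, fs)
        # Least-squares fit of the maternal channel to the mixed signal;
        # the residual is dominated by the much weaker fetal heartbeat.
        design = np.column_stack([maternal, np.ones_like(maternal)])
        coeffs, *_ = np.linalg.lstsq(design, abdominal, rcond=None)
        return abdominal - design @ coeffs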

Report

Analysis of knowledge in the game Among Us using action logics

Keywords: Action logics, epistemic logic, graph processing

This was a group project for the Logical Aspects of Multi-Agent Systems course. I worked on a formal model that describes how agents can keep track of their knowledge of the game as they move through the map. Some optimizations are implemented to remove redundant information from the knowledge model in order to control the combinatorial explosion of possible worlds and make the analysis of non-trivial examples possible. The result is a knowledge graph that can be combined with other knowledge sources to support complex reasoning about the state of the game.
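
The toy example below gives a flavour of the possible-worlds bookkeeping, using a made-up world structure and observation rule rather than the model from the project: observations prune possible worlds, and the agent knows a fact once it holds in every remaining world.

    from itertools import permutations

    players = ["red", "blue", "green"]
    rooms = ["cafeteria", "medbay", "electrical"]

    # Each possible world picks an impostor and assigns every player a room.
    worlds = [
        {"impostor": impostor, "positions": dict(zip(players, assignment))}
        for impostor in players
        for assignment in permutations(rooms)
    ]


    def observe(worlds, player, room):
        """Prune the worlds that contradict seeing `player` in `room`."""
        return [w for w in worlds if w["positions"][player] == room]


    def known_impostor(worlds):
        """The impostor is known only if all remaining worlds agree on it."""
        candidates = {w["impostor"] for w in worlds}
        return candidates.pop() if len(candidates) == 1 else None


    worlds = observe(worlds, "blue", "medbay")
    print(len(worlds), "worlds remain, impostor known:", known_impostor(worlds))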

Theory / Implementation / Project website

Estimating the age of ancient Hebrew handwritten fragments through style classification

Keywords: Feature extraction, image processing, UMAP

This project was done as a group project for the Handwriting Recognition course. I worked on a part of the project which aimed to develop a method for dating fragments of the Dead Sea Scrolls based on the writing style. These fragments were produced over a period of several centuries, during which the shape of some of the letters gradually changed. By looking at these letters, experts can classify the fragment and determine its age. In this project, a pipeline was designed to perform this classification automatically. The pipeline uses image processing techniques to segment the page into characters and split those into characteristic parts, and uses UMAP dimension reduction to construct a codebook that can be used for classification of unseen fragments.
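
A minimal sketch of the codebook step is given below, assuming the character parts have already been segmented and converted into feature vectors; the segmentation and the final classifier are outside this snippet, and the parameter values are illustrative.

    import numpy as np
    import umap
    from sklearn.cluster import KMeans


    def build_codebook(part_features, n_codes=50):
        """Embed character-part features with UMAP and cluster them into a codebook."""
        reducer = umap.UMAP(n_components=2, random_state=0).fit(part_features)
        codebook = KMeans(n_clusters=n_codes, random_state=0).fit(reducer.embedding_)
        return reducer, codebook


    def fragment_histogram(reducer, codebook, part_features):
        """Represent a fragment as a normalized histogram over codebook entries."""
        codes = codebook.predict(reducer.transform(part_features))
        counts = np.bincount(codes, minlength=codebook.n_clusters)
        return counts / counts.sum()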

Paper (partial)

Gossip for resilient mobile mesh networks

Keywords: Multi-agent systems, message passing, ZeroMQ, C++, CMake

This was a group project for the Design of Multi-Agent Systems course. It deals with the problem of resource sharing in mesh networks, where nodes depend on bandwidth shared by their peers, but have an incentive to be opportunistic by refusing to contribute their own resources to the network. In the context of this project, a decentralized protocol was designed that allows good-faith nodes to learn about the contributions made by other nodes, and blacklist selfish nodes on the basis of this information, making the network more resilient. The protocol was implemented using the ZeroMQ messaging library and its effectiveness in countering opportunistic nodes was verified.
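
The project itself was implemented in C++ with ZeroMQ; the Python/pyzmq sketch below only illustrates the basic gossip mechanism, with made-up ports, message format, and threshold: each node broadcasts what it knows about peer contributions, and peers whose reported contribution stays below a threshold become candidates for blacklisting.

    import zmq

    CONTRIBUTION_THRESHOLD = 10  # hypothetical minimum acceptable contribution


    def gossip_node(node_id, pub_port, peer_addresses):
        ctx = zmq.Context()
        pub = ctx.socket(zmq.PUB)
        pub.bind(f"tcp://*:{pub_port}")

        sub = ctx.socket(zmq.SUB)
        for addr in peer_addresses:
            sub.connect(addr)
        sub.setsockopt_string(zmq.SUBSCRIBE, "")

        contributions = {}  # best-known contribution per peer

        def share(own_contribution):
            # Gossip our own contribution together with what we heard from others.
            pub.send_json({"from": node_id,
                           "contributions": {node_id: own_contribution, **contributions}})

        def receive():
            report = sub.recv_json()
            for peer, amount in report["contributions"].items():
                if peer != node_id:
                    contributions[peer] = max(contributions.get(peer, 0), amount)

        def selfish_peers():
            # Peers whose best-known contribution stays below the threshold
            # are the candidates that good-faith nodes blacklist.
            return {p for p, amount in contributions.items()
                    if amount < CONTRIBUTION_THRESHOLD}

        return share, receive, selfish_peers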

Summary

Simulation model of Stark particle decelerator

Keywords: Scientific computing, NumPy, Python, C++, particle dynamics

This project was done as my bachelor’s thesis at the KVI-CART. It is related to an experimental set-up where a series of small conducting rings drive a specially designed oscillating electric field, shaped to constrain and decelerate a beam of molecules moving through a vacuum chamber so that they can be trapped and studied. For more context about the experiment, see this related publication. I developed a new simulation as a Python package, with some parts written in C++, to relax some assumptions in an existing MATLAB simulation. The core of the model is an approximation of the field generated by the configuration of conductors, based on parameters taken from COMSOL; control parameters can be flexibly defined in Python. This field is used to estimate the trajectory of molecules as they move through the setup under the influence of the Stark effect. The Peregrine HPC cluster was used to perform parameter sweeps and study the effect of deformations of the constraining electric field on the distribution of particles. By including a more realistic approximation of the electric field as it changes over time, it was possible to better understand how the field transfers energy to a portion of the molecules.
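
The sketch below shows only the trajectory-integration part of such a simulation, with the field model (the substantive part, fitted to COMSOL parameters in the project) replaced by a placeholder acceleration function and made-up numbers.

    import numpy as np


    def acceleration(positions, t):
        """Placeholder for the Stark-effect acceleration derived from the field model."""
        return -1e3 * np.sin(2 * np.pi * 1e4 * t) * positions  # illustrative only


    def propagate(positions, velocities, t_end, dt=1e-7):
        """Propagate a bunch of molecules with a velocity Verlet integrator."""
        t = 0.0
        while t < t_end:
            a_old = acceleration(positions, t)
            positions = positions + velocities * dt + 0.5 * a_old * dt**2
            a_new = acceleration(positions, t + dt)
            velocities = velocities + 0.5 * (a_old + a_new) * dt
            t += dt
        return positions, velocities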

Looking at lightning strikes through the LOFAR radio telescope

Keywords: Signal processing, data visualization, scientific computing, Python

This was a second-year Honours project, in which I worked on the analysis of data measured by the LOFAR radio telescope during thunderstorms. This telescope consists of stations that each comprise a set of radio antennas, which can pick up the electromagnetic waves produced by lightning strikes and by the processes that lead up to the strike. An important challenge was finding the signature of a lightning strike, distinguishing it from other events like cosmic ray impacts, and matching up the same strike signal across different radio stations through beamforming to obtain a stronger signal. To do this efficiently, I developed an interactive tool based on Matplotlib and a collection of methods for triangulation and calibration. This was combined with a signal processing pipeline for filtering the signal in frequency space. This line of research eventually led to various publications.
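
As a small illustration of the frequency-space filtering step, the sketch below applies a crude band-pass to a raw antenna trace by zeroing out-of-band FFT bins; the band edges and sampling rate are placeholders rather than the actual pipeline settings.

    import numpy as np


    def bandpass_fft(trace, fs, low, high):
        """Keep only the frequency components between `low` and `high` (in Hz)."""
        spectrum = np.fft.rfft(trace)
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
        spectrum[(freqs < low) | (freqs > high)] = 0.0  # zero out-of-band bins
        return np.fft.irfft(spectrum, n=trace.size)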

Related newspaper article

Transition to a new website for literary student association Flanor

As a board member for Flanor, I worked on the transition to a new system for managing the website and member administration. This involved mapping the requirements for the new system, comparing the available options, presenting them to the members, and executing an orderly transition by transferring data and domain names to the new SaaS environment (Congressus). The new website allows for easier and clearer communication with members and ticket sales through the site, and reduces the workload for the board by replacing several old systems.

Flanor website