Yan Tóth

I am an independent programmer and researcher in love with virtual worlds, game development, and compilers.

I've been programming since 2009, and since 2016 I have predominantly worked on computer graphics and digital arts projects, such as the Vectary online 3D editor, the DungeonFog map editor, and various interactive models and installations with Subdigital.

I believe in first-principles thinking, and that programmers should not add incidental complexity to the software they create. I do have an academic appreciation for strange and abstract things, but lately I prefer simpler tools where the programmer can understand what is going on under the hood.

Works

Celestial Fruit on Earthly Ground

Interactive online installation

I am particularly proud of my contribution to the realization of celestialfruit.org with Jonathan Reus. The installation is an ongoing and open interactive artwork. It’s conceived as a living tapestry of visual and sonic traces gathered through artistic collaborations and micro-commissions.

The work focuses on the five-string banjo, a historically complex musical instrument that can be thought of as a map of material-embodied cultures, packed with the stories and ideas of people; their migrations and exchanges. Moving around through an online virtual universe, the visitor navigates and reassembles these elements in a way that mirrors the transmission and transformations of culture itself as a living, moving, and complex entity. Besides the interactive experience, the work also operates as a method and engine for driving future collaborations at the meeting points of musical instruments, technology, and tradition.

This collaboration flexed my programming muscles in more than one way. It might be my first finished project where experimentation and regularly changing the feature set were the go-to modus operandi instead of something unwelcome. During the three-month initial development period, I had to keep some semblance of structure while not overoptimizing or locking myself into a design corner. It sort of worked, and I am not terribly unhappy with the resulting codebase, although the amount of cleanup tasks this produced is daunting. I am sure I'll get to that backlog one day (:

Having to implement a data-driven scene-building system and an animation system with simple scripting all over again makes me wonder whether I should finally bite the bullet and learn a game engine like Godot or Unreal instead of always rolling my own. I do admit that the tight time constraints made me step outside of my comfort zone and use way more framework-ey libraries than I otherwise would. The experience was as expected: there were a lot of out-of-the-box features, and a lot of out-of-the-box bugs. If I had to guess, the features still outweighed the extra trouble, even if it made me grumpy at times.

A gourd; has secrets inside

Monoceros

A Wave Function Collapse Solver and Grasshopper plugin

In 2020, Subdigital hosted a summer internship with the aim of developing and utilizing a discrete materialization method for data-driven design with available fabrication technology. We picked Wave Function Collapse - a method already well known in the game development community thanks to people like Maxim Gumin and Oskar Stålberg, but still somewhat under-exploited at the time in the fields of computational design and architecture.

Ján Pernecký and I created the software tools necessary for the internship students to experiment with the method. We opted to make the WFC solver a plugin for Grasshopper, a node-based programming language for computationally creating shapes and forms.

That initial summer exploration was later expanded on. Driven by usage experience and many rounds of observation, Ján Pernecký collapsed his original WFC tools into Monoceros, a full-fledged Grasshopper plugin.

While my own work on Monoceros amounted mostly to brainstorming and rubber-ducking, I am glad it still uses the original solver from the summer because of the optimization work that went into it. To enable fast iteration for designers, we knew the tool needed to respond to parameter changes quickly, ideally in real time. The heavy lifting is therefore implemented in Rust and loaded into Grasshopper as a DLL. Inside the library itself, WFC slots and the modules they contain are represented as compact bit vectors of 256 bits and operated on by just a handful of bitwise instructions and bit-counting intrinsics. While I was initially worried about the 256-module limit, projects rarely reach it in practice. A future extension might be to fall back to a less compact (and therefore possibly slower) representation for definitions with large module counts.
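To illustrate the idea (this is a minimal sketch, not the actual solver's API - the type and method names are made up), a 256-bit domain fits in four 64-bit words, constraint propagation reduces to bitwise AND, and counting the remaining candidate modules is a handful of popcount instructions:

```rust
// Sketch of a 256-bit slot domain: each bit marks whether a module
// is still a candidate for this slot.
#[derive(Clone, Copy)]
struct SlotDomain([u64; 4]); // 4 × 64 = 256 possible modules

impl SlotDomain {
    const EMPTY: SlotDomain = SlotDomain([0; 4]);

    fn allow(&mut self, module: usize) {
        self.0[module / 64] |= 1u64 << (module % 64);
    }

    fn contains(&self, module: usize) -> bool {
        self.0[module / 64] & (1u64 << (module % 64)) != 0
    }

    // Constraint propagation boils down to intersecting domains.
    fn intersect(&self, other: &SlotDomain) -> SlotDomain {
        let mut r = [0u64; 4];
        for i in 0..4 {
            r[i] = self.0[i] & other.0[i];
        }
        SlotDomain(r)
    }

    // Number of remaining candidates: one popcount per word.
    fn len(&self) -> u32 {
        self.0.iter().map(|w| w.count_ones()).sum()
    }
}

fn main() {
    let mut a = SlotDomain::EMPTY;
    let mut b = SlotDomain::EMPTY;
    for m in [3, 70, 200] { a.allow(m); }
    for m in [70, 200, 255] { b.allow(m); }
    let c = a.intersect(&b);
    assert_eq!(c.len(), 2);
    assert!(c.contains(70) && c.contains(200) && !c.contains(3));
}
```

On x86-64, `count_ones` compiles down to the POPCNT intrinsic, which is what keeps the entropy computation cheap even when it runs for every slot on every propagation step.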

Bonn voids, designed with Monoceros; picture courtesy of Ján Pernecký & Subdigital

H.U.R.B.A.N. Selector

An experimental procedural geometry modeling and morphing tool

H.U.R.B.A.N. Selector is a software experiment initially funded by the Slovak Design Center, which I work on with Ján Pernecký (Subdigital) and Ondrej Slinták. It is meant to test the hypothesis that creating new designs and shapes is subconsciously inspired by our previous experience. There is a trial-and-error phase in the design process where many variations on the same shape are prototyped and chosen from.

The software currently has many rough edges (mostly on the usability side), but it strives to be a tool for simple parametric modeling, containing implementations of various morphing strategies for mesh models, allowing designers to smoothly interpolate between multiple mesh geometries and select the result with the most desired features.

Following the explicit-history lineage of Grasshopper, we decided that every decision the user makes in the editor should not only be undoable, but also modifiable at any point. The geometry displayed in the viewport is just the result of applying operations to the initially empty world. These operations can create or import geometry, or transform existing geometries in some way, e.g. perform iterative Laplace relaxation.

Screenshot of a model morph in H.U.R.B.A.N. Selector

Exposing the declarative nature of editing to the user freed us to utilize a well-specified mental model: an interpreter that evaluates a directed acyclic graph of possibly interdependent operations. Naturally, we didn't want to execute the whole graph of possibly heavy operations on every change to its definition. Simple tricks such as ID-ing operation instances by their operation type and parameter values allowed us to preserve sub-graphs of computation from previous interactions and only recompute the minimal necessary set of operations.
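The caching trick can be sketched in a few lines (a hypothetical, heavily simplified version - real operations produce meshes, not numbers, and the actual editor's types look nothing like this): an operation's identity is derived from its kind and parameters, so an unchanged sub-graph is a cache hit instead of a recomputation.

```rust
use std::collections::HashMap;

// Toy operation graph: identity = operation type + parameter values.
#[derive(Clone, PartialEq, Eq, Hash)]
enum Op {
    CreateBox { size: u32 },
    Scale { input: Box<Op>, factor: u32 },
}

fn evaluate(op: &Op, cache: &mut HashMap<Op, u32>, evals: &mut u32) -> u32 {
    if let Some(&v) = cache.get(op) {
        return v; // sub-graph preserved from a previous interaction
    }
    *evals += 1;
    let value = match op {
        Op::CreateBox { size } => *size,
        Op::Scale { input, factor } => evaluate(input, cache, evals) * factor,
    };
    cache.insert(op.clone(), value);
    value
}

fn main() {
    let mut cache = HashMap::new();
    let mut evals = 0;
    let graph = Op::Scale {
        input: Box::new(Op::CreateBox { size: 2 }),
        factor: 3,
    };
    assert_eq!(evaluate(&graph, &mut cache, &mut evals), 6);
    // Re-running the unchanged definition hits the cache entirely.
    assert_eq!(evaluate(&graph, &mut cache, &mut evals), 6);
    assert_eq!(evals, 2); // CreateBox + Scale, each evaluated only once
}
```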

I was surprised to learn that the UI of some 3D modeling tools (e.g. Rhino, Blender) can freeze up when performing computations on heavy geometries. We knew from the start that some of the topology traversal algorithms necessary to implement the morphing strategy we initially wanted wouldn't have great time complexity, and therefore opted to run the graph interpreter in its own thread and design the UI around this asynchrony from the get-go. As these things usually go, we didn't implement the topology-based approach to morphing in the end, but at least, as a positive side effect, the editor doesn't freeze up. Mostly.
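The threading arrangement is nothing exotic - a minimal sketch of the pattern, under the assumption of a simple channel-based handoff (the editor's actual plumbing is more involved): heavy evaluation runs on a worker thread, and the UI thread polls for results without ever blocking a frame.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

fn main() {
    let (tx, rx) = mpsc::channel();

    // Worker thread: stand-in for an expensive graph evaluation.
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(50));
        tx.send("mesh ready").unwrap();
    });

    // UI loop: try_recv never blocks, so rendering keeps going.
    let mut result = None;
    while result.is_none() {
        match rx.try_recv() {
            Ok(msg) => result = Some(msg),
            Err(mpsc::TryRecvError::Empty) => {
                // draw the previous frame, handle input, etc.
                thread::sleep(Duration::from_millis(5));
            }
            Err(mpsc::TryRecvError::Disconnected) => break,
        }
    }
    assert_eq!(result, Some("mesh ready"));
}
```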

The editor itself is written in the Rust programming language, with the use of a few libraries (Cargo.toml). This certainly proved to be a lot of work, but has paid off in some areas - for instance, our implementation of some geometry algorithms already outperforms Rhino. Overall, the balance between using existing solutions and making something from scratch is tough to strike.

H.U.R.B.A.N. Selector proved to be a lot of work, and as our funding was nearing its end, we had to sacrifice some of the things we wanted to do. Most notably, we are really unhappy with the current user interface. One of the first things we would do when working on the project again would be to openly admit to and promote the graph structure of the geometry definition, and allow the user to model geometry via a node-based programming environment similar to Grasshopper or Unreal Blueprints.

Screenshot of Simplex Noise generated scalar field materialized via Marching Cubes and smoothed by Loop Subdivision in H.U.R.B.A.N. Selector

Walkers

Interactive installation

Walkers is a single-purpose interactive installation built by Ján Pernecký and me to support an exhibition of Júlia Jurinová in 2018. We scanned people walking around the exhibition with a Kinect and asynchronously processed their point cloud data to be interpolable by matching vertices between scan frames. Finally, we projected the frame interpolations of the collected scans on a wall in slow motion, rendered as particles with a CRT-persistence post-processing effect applied to make them look ghostly, each visitor leaving an imprint of themselves behind.
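Once vertices are matched between two scan frames, playback reduces to linearly interpolating matched positions. A minimal sketch of that step (illustrative only - the installation's actual processing pipeline was more elaborate and not written in Rust):

```rust
// Interpolate between two frames whose vertices are already matched
// index-to-index: t = 0.0 gives frame a, t = 1.0 gives frame b.
fn lerp_frames(a: &[[f32; 3]], b: &[[f32; 3]], t: f32) -> Vec<[f32; 3]> {
    a.iter()
        .zip(b)
        .map(|(p, q)| {
            [
                p[0] + (q[0] - p[0]) * t,
                p[1] + (q[1] - p[1]) * t,
                p[2] + (q[2] - p[2]) * t,
            ]
        })
        .collect()
}

fn main() {
    let frame_a = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]];
    let frame_b = [[2.0, 0.0, 0.0], [1.0, 3.0, 1.0]];
    // Halfway between the two scans.
    let mid = lerp_frames(&frame_a, &frame_b, 0.5);
    assert_eq!(mid[0], [1.0, 0.0, 0.0]);
    assert_eq!(mid[1], [1.0, 2.0, 1.0]);
}
```

The vertex matching itself is the hard part - it is what makes the index-to-index interpolation above valid in the first place.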

Because this was a one-shot project with an extremely narrow scope, the whole creation process took about a week and we threw many of the coding "best practices" out of the window. There was no code review or code sharing; we worked on separate parts (Ján on geometry processing and me on rendering) with just a shared directory on the file system as the interaction API between our programs. Expecting to find a horrible mess, I was surprised when I looked at the source for the renderer program today. It is remarkably clean and simple, does exactly one thing, and does it in a very concrete way, using just the Webglutenfree library (built in-house for previous projects). While I am not really proud of using Electron as the platform layer (and would probably hand-roll a native one today), it once again worked well.

Screenshot from the Walkers installation

Bratislava 2022 Model

Interactive urban model of Bratislava, Slovakia

The BA 2022 model is a commercial project I worked on with Subdigital and Abaffy Design for HB Reavis in 2017. It fuses design, engineering, and software to explain the connection of a large real-estate development to its surroundings in the city of Bratislava. The model's 3x4K screen shows a stylized animated map of the city. The presenter interacts with the model via a connected mobile tablet and controls what the map shows.

While not initially intended, the engine in the model ended up being mostly a compositor of animated layers (each potentially running internal simulations) that could be turned on and off individually. Blending animated layers was a simple mental model for representing features in the engine - every user-visible feature is either a data-controlled animated layer or a blend thereof.
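The compositing idea can be sketched in a few lines (in Rust for illustration; the actual engine is TypeScript, and these types are made up for the example): each enabled layer contributes according to an animated opacity, and a feature is just a layer faded in or out.

```rust
// A layer's opacity is animated per frame; 0.0 means the layer is off.
struct Layer {
    opacity: f32,
    color: [f32; 3],
}

// Blend layers bottom-to-top with a simple "over" operator.
fn composite(layers: &[Layer]) -> [f32; 3] {
    let mut out = [0.0; 3];
    for layer in layers {
        for c in 0..3 {
            out[c] = out[c] * (1.0 - layer.opacity) + layer.color[c] * layer.opacity;
        }
    }
    out
}

fn main() {
    let layers = [
        Layer { opacity: 1.0, color: [0.25, 0.25, 0.25] }, // base map
        Layer { opacity: 0.5, color: [1.0, 0.0, 0.0] },    // highlight, fading in
    ];
    let px = composite(&layers);
    assert_eq!(px, [0.625, 0.125, 0.125]);
}
```

Turning a feature on is then just animating its layer's opacity from 0.0 upward, which keeps the per-feature logic decoupled from the rest of the engine.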

Even in an idle state with no active highlights set by the presenter, the model still displays a lot of PRNG-guided moving elements like transportation and pedestrians. While we knew we would have a powerful desktop computer available, we still wanted to be conservative with the CPU and GPU budget, so we partially specialized and precomputed simulation and animation data offline whenever possible.

Because of the requirement to render HTML and the fact that the team was mostly familiar with web technologies at the time, the engine runs on (carefully written) TypeScript and paints pixels via WebGL and a small amount of Chromium-rendered SVG. While that wouldn't be my choice for what is basically an embedded project today, I am still grateful for the iteration speed it gave us.

BA 2022 Model; picture courtesy of Subdigital