mono.project :: readme

mono.project is a generative visual system that performs StyleGAN3-based latent-space interpolations, trained on 124 analog monotypes created by LY between 2019 and 2020.

It simulates an irreversible, non-deterministic walk through latent space, producing an artifact referred to as a “Mono”: a temporally and visually unique video object composed of 1800 frames rendered at 60 fps (30 s total).
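The walk described above can be sketched as a Gaussian random walk over latent codes, one code per frame. This is a minimal illustration, not the project's actual pipeline: the latent dimensionality (512), the step scale, and the `latent_walk` helper are assumptions; the real system would map each code to an image through the trained StyleGAN3 generator.

```python
import numpy as np

Z_DIM = 512        # assumed StyleGAN3 z-space dimensionality
N_FRAMES = 1800    # 30 s x 60 fps, as described above
STEP_SCALE = 0.05  # assumed step size; a tuning parameter

def latent_walk(seed=None):
    """Return an (N_FRAMES, Z_DIM) array of latent codes.

    Each frame's code is the previous code plus a small Gaussian
    perturbation: an irreversible, non-deterministic walk when no
    seed is fixed.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(Z_DIM)          # random starting point
    frames = np.empty((N_FRAMES, Z_DIM))
    for i in range(N_FRAMES):
        z = z + STEP_SCALE * rng.standard_normal(Z_DIM)
        frames[i] = z
    return frames

walk = latent_walk(seed=42)
print(walk.shape)  # (1800, 512)
```

In a full pipeline, each row of `walk` would be fed to the generator (e.g. `G(z)`) to render one video frame, so a single walk yields one 30-second Mono.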

mono.project :: origin

mono.gen is derived from the monotype works of LY (Ukraine, 2019), a collection of over 1,000 silk-printed abstractions inspired by illuminated manuscripts and iterative error.

The concept of mono.gen was first formalized in 2022, and an early installation prototype (StyleGAN2 + post-processing) was exhibited at the 2023 Digital Art Biennale in Rio de Janeiro.

As LY’s son, I continue this experiment in Web3 — transforming a private analog archive into a public generative system.

mono.project :: access_notify

→ you are early.
→ welcome to epoch zero.
→ be among the first to walk.

Enter your email to receive private access when the next epoch opens.