interaction design.

I have always been fascinated by technology and its creative applications. On this page I have collected the interaction design projects I've developed in TouchDesigner.

Thanks to their interactive and versatile nature, these projects can be applied to a wide range of fields: visuals for concerts and events, three-dimensional data visualization, or animations that enrich websites and social media content.

Here you can find the projects, organized by category.

Motion reactive particles
Particle lifecycle with custom component
Random grid displacement
Audio reactive visual
Motion controlled visual

TouchDesigner
Motion reactive particles

I created an interactive particle system in TouchDesigner using the ParticlesGPU component, which reacts to movement in real time. The system is driven by a live camera feed: as the camera detects motion, the particles shift, swirl, and disperse in response, creating an engaging and immersive effect. Thanks to GPU acceleration the system stays responsive, delivering smooth motion-based visuals that make it well suited to live performances, installations, and experimental visual projects.
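As a rough illustration of the motion-to-particles mapping, here is a minimal TouchDesigner Python sketch, assuming the camera feed has already been reduced to a single motion-amount channel (for example by frame differencing and an Analyze CHOP). The operator name 'particlesGpu1' and the 'Turbulence' parameter are hypothetical stand-ins, not the exact names in my network.

def onValueChange(channel, sampleIndex, val, prev):
    # CHOP Execute DAT callback: fires whenever the motion channel changes.
    particles = op('particlesGpu1')          # hypothetical component name
    motion = max(0.0, min(1.0, val))         # clamp motion amount to 0..1
    # More movement in front of the camera means wilder particles.
    particles.par.Turbulence = motion * 5.0  # hypothetical custom parameter
    return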

Particle lifecycle with custom component

I created a particle system based on instancing, using a custom component to manage the particles' lifecycle. This approach allows precise control over their birth, evolution, and dissolution, enabling the creation of complex and dynamic visual effects. Thanks to instancing, the system is highly optimized and capable of handling a large number of particles without compromising performance. This setup is ideal for interactive installations, visual performances, and advanced graphic experiments, delivering a smooth and highly responsive result that reacts seamlessly to input variations and parameter adjustments.
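To give an idea of the lifecycle logic, here is a minimal Script CHOP sketch, not the original component: it keeps particle state between cooks, spawns and ages particles, culls the expired ones, and outputs one sample per particle for a Geometry COMP to instance from. All names and constants are illustrative.

import random

MAX_PARTICLES = 500   # instancing keeps even large counts cheap
LIFESPAN = 120        # frames a particle lives

def onCook(scriptOp):
    # Persist particle state between cooks using operator storage.
    parts = scriptOp.fetch('parts', [], storeDefault=True)

    # Birth: spawn a few particles per frame until the cap is reached.
    for _ in range(3):
        if len(parts) < MAX_PARTICLES:
            parts.append({'x': random.uniform(-1, 1),
                          'y': random.uniform(-1, 1),
                          'age': 0})

    # Evolution and death: age and drift the particles, cull the old ones.
    for p in parts:
        p['age'] += 1
        p['y'] += 0.005
    parts[:] = [p for p in parts if p['age'] < LIFESPAN]
    scriptOp.store('parts', parts)

    # One CHOP sample per particle; scale fades out as a particle ages.
    scriptOp.clear()
    scriptOp.numSamples = max(len(parts), 1)
    scriptOp.appendChan('tx').vals = [p['x'] for p in parts] or [0]
    scriptOp.appendChan('ty').vals = [p['y'] for p in parts] or [0]
    scriptOp.appendChan('scale').vals = [1 - p['age'] / LIFESPAN for p in parts] or [0]
    return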

Random grid displacement

I created a generative art system that manipulates images and grid structures through displacement, driven by speed and noise components. This approach allows for dynamic and ever-changing visual compositions, where the elements shift, distort, and evolve based on movement and randomized patterns. By integrating speed-based transformations, the system reacts fluidly to input variations, creating organic and unpredictable results. The noise component adds complexity, introducing controlled randomness that enhances the visual depth.
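The core displacement idea can be sketched in a few lines of plain Python, with a cheap value-noise function standing in for TouchDesigner's Noise component; everything here is illustrative. Each grid point is offset by noise sampled at its position plus a time term scaled by speed, so a higher speed makes the distortion evolve faster.

import math, random

random.seed(7)
GRADS = [random.uniform(-1, 1) for _ in range(256)]

def noise1d(x):
    # Cheap smooth value noise in roughly [-1, 1].
    i, f = int(math.floor(x)) & 255, x - math.floor(x)
    t = f * f * (3 - 2 * f)  # smoothstep blend between lattice values
    return GRADS[i] * (1 - t) + GRADS[(i + 1) & 255] * t

def displaced_grid(cols, rows, time, speed=1.0, amount=0.2):
    # Return grid points offset by speed- and noise-driven displacement.
    points = []
    for j in range(rows):
        for i in range(cols):
            x, y = i / (cols - 1), j / (rows - 1)
            dx = amount * noise1d(x * 4 + time * speed)
            dy = amount * noise1d(y * 4 + 100 + time * speed)
            points.append((x + dx, y + dy))
    return points

print(displaced_grid(4, 4, time=0.0)[:2])  # two displaced corner points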

Audio reactive visual

As a research project, I created an audio reactive visual for a sample produced by Born By Chance. The visual itself is built from a combination of animated noise, feedback loops, and post-processing of the rendered output. For the audio reactive setup I used pre-made tools, adapted through custom logic and algorithms. The result is two parallel noise clouds with thunder-like lines in between; the whole system reacts to the kick and rhythm of the audio track.
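A minimal sketch of the kick mapping, assuming the audio has already been reduced to a single kick-energy channel (for instance with Audio Filter and Analyze CHOPs); the operator and parameter names are placeholders for the actual network.

def onValueChange(channel, sampleIndex, val, prev):
    # CHOP Execute DAT callback on the kick-energy channel:
    # pulse the noise clouds and the feedback trail on each hit.
    op('noise1').par.amp = 0.5 + val * 2.0  # noise cloud intensity
    # 'feedback_mix' is a hypothetical Constant CHOP feeding the feedback level.
    op('feedback_mix').par.value0 = min(0.95, 0.6 + val * 0.3)
    return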

Motion controlled visual

As a research project, I created a motion controlled visual using the MediaPipe plugin for TouchDesigner by Torin Blankensmith. The visual itself is a stock video that is displaced and colour-altered through a combination of post-processing techniques. The parameters are controlled with MediaPipe, a GPU-accelerated motion tracking tool that allows for precise, near-instant tracking from a live video feed.

The same technique has been used to control the position and rotation of shapes.
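A minimal sketch of that mapping, assuming the MediaPipe component exposes normalized hand-landmark channels; the channel names ('wrist_x', 'wrist_y') and the driven operators are illustrative.

def onValueChange(channel, sampleIndex, val, prev):
    hand = op('mediapipe1')        # hypothetical landmark CHOP
    x = hand['wrist_x'].eval()     # assumed channel names, normalized 0..1
    y = hand['wrist_y'].eval()

    # Hand height drives the intensity of the displacement noise map.
    op('noise1').par.amp = y

    # Wrist position drives the position and rotation of a shape.
    geo = op('geo1')
    geo.par.tx = (x - 0.5) * 2
    geo.par.rz = x * 360
    return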