I have always been fascinated by technology and its creative applications. On this page I have collected the results of my experiments with TouchDesigner (a node-based visual programming environment), Processing (a Java-based framework for graphics and multimedia), and Blender (an open-source 3D creative suite).
Thanks to their interactive and versatile nature, these projects can be applied across a wide range of fields: visuals for concerts and events, three-dimensional data visualization, or animations that enrich websites and social media content.
Here you can find the projects categorized by the software I used to create them.
TouchDesigner
Processing
Blender
I created an interactive particle system in TouchDesigner using the ParticlesGPU component, which reacts to movement in real time. The system is driven by a live camera feed: as the camera detects motion, the particles shift, swirl, and disperse in response, creating a visually engaging and immersive effect. This setup allows for fluid, real-time interaction, making it well suited to live performances, installations, and experimental visual projects. By leveraging GPU acceleration, the system remains highly responsive, delivering smooth motion-based visuals.
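The core idea — measuring frame-to-frame motion and using that energy to drive particle dispersion — can be sketched outside TouchDesigner. The following is a minimal, illustrative Python stand-in (not the actual TouchDesigner network): frames are modeled as flat lists of grayscale values, and the `motion_energy` and `Particle` names are my own for this sketch.

```python
import random

GRID = 8  # frames are GRID x GRID grayscale grids (a stand-in for a camera feed)

def motion_energy(prev, curr):
    """Mean absolute per-pixel difference between two frames."""
    diffs = [abs(a - b) for a, b in zip(prev, curr)]
    return sum(diffs) / len(diffs)

class Particle:
    def __init__(self):
        self.x = random.uniform(0, 1)
        self.y = random.uniform(0, 1)
        self.vx = 0.0
        self.vy = 0.0

    def update(self, energy):
        # stronger detected motion -> particles swirl and disperse more
        self.vx += random.uniform(-1, 1) * energy
        self.vy += random.uniform(-1, 1) * energy
        self.x += self.vx
        self.y += self.vy

def step(particles, prev_frame, curr_frame):
    energy = motion_energy(prev_frame, curr_frame)
    for p in particles:
        p.update(energy)
    return energy

# a still frame followed by a frame where a bright region has entered
still = [0.2] * (GRID * GRID)
moved = [0.2] * (GRID * GRID)
for i in range(10):
    moved[i] = 0.9

particles = [Particle() for _ in range(100)]
print(step(particles, still, still))  # identical frames -> energy 0.0
print(round(step(particles, still, moved), 4))
```

In the real setup this role is played by the GPU: the camera feed goes through motion analysis and the resulting values modulate the ParticlesGPU parameters each frame.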
I created a particle system based on instancing, using a custom component to manage the particles' lifecycle. This approach allows precise control over their birth, evolution, and dissolution, enabling the creation of complex and dynamic visual effects. Thanks to instancing, the system is highly optimized and capable of handling a large number of particles without compromising performance. This setup is ideal for interactive installations, visual performances, and advanced graphic experiments, delivering a smooth and highly responsive result that reacts seamlessly to input variations and parameter adjustments.
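The lifecycle logic described above — explicit control over birth, evolution, and dissolution of instances — can be illustrated with a short Python sketch. This is not the custom TouchDesigner component itself; the `Particle` and `Emitter` classes and their parameters are hypothetical names for this example.

```python
class Particle:
    """One instance with an explicit lifecycle."""
    def __init__(self, lifespan):
        self.age = 0
        self.lifespan = lifespan
        self.size = 1.0

    @property
    def alive(self):
        return self.age < self.lifespan

    def update(self):
        self.age += 1
        # evolution: shrink toward dissolution over the particle's life
        self.size = 1.0 - self.age / self.lifespan

class Emitter:
    """Manages birth, evolution, and dissolution of a particle pool."""
    def __init__(self, births_per_frame, lifespan):
        self.births_per_frame = births_per_frame
        self.lifespan = lifespan
        self.particles = []

    def step(self):
        # birth: spawn new instances each frame
        self.particles += [Particle(self.lifespan)
                           for _ in range(self.births_per_frame)]
        # evolution: age every live instance
        for p in self.particles:
            p.update()
        # dissolution: drop dead instances so the pool stays bounded
        self.particles = [p for p in self.particles if p.alive]

emitter = Emitter(births_per_frame=10, lifespan=30)
for _ in range(100):
    emitter.step()
# the pool stabilizes at births_per_frame * (lifespan - 1) = 290 instances
print(len(emitter.particles))
```

Bounding the pool this way is what makes instancing efficient: the renderer only ever draws the live instances, so the per-frame cost stays constant regardless of how long the system runs.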
I created a generative art system that manipulates images and grid structures through displacement, driven by speed and noise components. This approach allows for dynamic and ever-changing visual compositions, where the elements shift, distort, and evolve based on movement and randomized patterns. By integrating speed-based transformations, the system reacts fluidly to input variations, creating organic and unpredictable results. The noise component adds complexity, introducing controlled randomness that enhances the visual depth.
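The displacement idea can be sketched compactly: each grid point is offset by a noise field scaled by a speed parameter. The Python below is a simplified illustration of the technique, not the actual network; the `noise` function is a cheap trigonometric stand-in for a proper noise generator, and all names are my own for this sketch.

```python
import math

def noise(x, y, t):
    """Cheap smooth pseudo-noise in [-1, 1] (a stand-in for a real noise source)."""
    return math.sin(x * 1.7 + t) * math.cos(y * 2.3 - t * 0.5)

def displace_grid(cols, rows, t, speed, amplitude=0.1):
    """Offset each point of a unit grid by a noise field scaled by speed."""
    points = []
    for j in range(rows):
        for i in range(cols):
            x, y = i / (cols - 1), j / (rows - 1)
            # sample the field at two offsets to get independent x/y displacement
            dx = amplitude * speed * noise(x, y, t)
            dy = amplitude * speed * noise(x + 10.0, y + 10.0, t)
            points.append((x + dx, y + dy))
    return points

# with speed 0 the grid is undistorted; raising speed increases the distortion,
# and advancing t makes the composition evolve over time
flat = displace_grid(5, 5, t=0.0, speed=0.0)    # flat[0] == (0.0, 0.0)
warped = displace_grid(5, 5, t=0.0, speed=2.0)
print(warped != flat)
```

Animating `t` each frame is what produces the organic, ever-changing quality: the same grid is re-sampled through a slowly drifting noise field, so the distortion never repeats exactly.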