Week 7: Initial Stage for Creating the TouchDesigner Visual and Audio Pipeline

In the early concept of this project, my goal was to import a virtual plant model into TouchDesigner and transform it into a particle-based form. Rather than keeping the plant as a static or decorative object, I imagined it as a dynamic entity made of particles that move, shift, and respond in real time. Using TouchDesigner as the visual engine, I began by bringing in the .ply model of the plant. Each point in the 3D mesh was treated as an individual particle, forming the base of a flexible, generative system. The visual output was not just reactive but alive, waiting to be shaped by external data.

That data comes from the plant itself. By capturing its electrical signals and converting them into sound, I allowed those signals to influence the particle system directly. The plant's biodata controls the color and motion of the particles, so its inner state becomes both visible and audible.

This approach challenges the traditional role of plants as passive objects. In many contexts, plants are seen as background elements, valued for decoration or utility. I wanted to change that perception. By allowing a plant to influence its digital counterpart through its own signals, I give it presence and agency. The result is a visual environment shaped by the plant's expression: its electrical signals generate a kind of voice that reshapes the virtual space, changing colors and particle paths according to its own rhythms. In this installation, the plant is not silent. It becomes a participant and a speaker.

Step 1: Importing and Preparing the Virtual Plant Model

Drawing from my previous experience with TouchDesigner during earlier UAL coursework, I was already familiar with how to connect different components to build real-time visual systems. This hands-on experience helped me begin constructing my immersive installation efficiently.

I started by importing my virtual plant model into TouchDesigner using the Point File In TOP. This node brought in a .ply file containing the 3D point cloud data of the plant. I then used Point File Select 1 to extract the XYZ translation values and map them to RGB, which provided the positional data for the particle system. At the same time, Point File Select 2 assigned RGB values for the original shader, controlling the color of each point based on the model's embedded attributes.
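I wired this network by hand in the node editor, but the same chain can be sketched with TouchDesigner's Python API. The operator class names (pointfileinTOP, pointfileselectTOP), the parameter names, and the file path below are assumptions used purely for illustration, not a record of my exact setup.

```python
# Hypothetical sketch of the import step using TouchDesigner's Python API.
# Operator class names, parameter names and the file path are assumptions.
net = op('/project1')

# Load the .ply point cloud; each pixel of the resulting TOP stores one point
plant_points = net.create(pointfileinTOP, 'plant_points')
plant_points.par.file = 'plant_model.ply'  # placeholder path

# Select 1: XYZ positions, read as RGB to drive particle translation
select_pos = net.create(pointfileselectTOP, 'point_select1')
select_pos.inputConnectors[0].connect(plant_points)

# Select 2: the model's embedded RGB color, used by the original shader
select_col = net.create(pointfileselectTOP, 'point_select2')
select_col.inputConnectors[0].connect(plant_points)
```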

To visualize the structure, I connected the resulting point data to a Box SOP and then routed it through a GEO component. I added a Camera, a Light, and a Render TOP to complete the basic rendering pipeline. Through this setup, I successfully brought the virtual plant model into TouchDesigner, laying the foundation for further development of my immersive installation (Fig 1).

Fig 1: Importing the 3D model into TouchDesigner
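For reference, the render chain from this step could be reproduced in script form roughly as below. This is only a minimal sketch: the class names and the Render TOP parameter names (camera, geometry, lights) are assumptions based on its standard settings.

```python
# Minimal sketch of the basic render pipeline (Geo + Camera + Light + Render TOP).
net = op('/project1')

geo = net.create(geometryCOMP, 'plant_geo')   # holds the Box SOP / point geometry
cam = net.create(cameraCOMP, 'cam1')
light = net.create(lightCOMP, 'light1')

render = net.create(renderTOP, 'render1')
render.par.camera = cam.name      # camera to render through
render.par.geometry = geo.name    # Geometry COMP(s) to draw
render.par.lights = light.name    # scene lighting
```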

Step 2: Creating the Virtual Plant Texture and Readjusting the Particle Size

In this step, I began transforming the plant geometry into a particle-friendly structure. Although the virtual plant model had already been converted into a point cloud, I found that the original vertices rendered too large to represent the plant's particle form clearly. To resolve this, I used a base box with a custom scale of 0.02 as a visual substitute for each vertex. This allowed me to refine the shape and density of the particles and better sketch the overall silhouette of the plant in its new particle form (Fig 2).
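As a rough illustration, the box substitution amounts to shrinking the Box SOP inside the geometry component down to the 0.02 scale mentioned above; the operator path here is a placeholder.

```python
# Sketch: scale the substitute Box SOP so each copy reads as a single particle.
# The 0.02 value comes from the text; the operator path is a placeholder.
box = op('/project1/plant_geo/box1')
box.par.sizex = 0.02
box.par.sizey = 0.02
box.par.sizez = 0.02
```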

I then assigned point attributes and applied a texture and material to prepare the mesh for visual rendering. Finally, I used the Point Sprite MAT to convert the points into sprite-based particles. This gave the particles a fluid, organic appearance and a more natural, responsive look. The setup also created a flexible structure that could later respond to biodata input, allowing color and movement to be controlled in real time.

Fig 2: Applying the texture and readjusting the particle size
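A scripted equivalent of the sprite conversion might look like the sketch below, assuming a Geometry COMP named plant_geo. The Point Sprite MAT class name and the material parameter are assumptions for illustration.

```python
# Sketch: render the points as sprites by assigning a Point Sprite MAT
# to the Geometry COMP. Names and paths are illustrative only.
net = op('/project1')

sprite_mat = net.create(pointspriteMAT, 'sprite_mat')

geo = op('/project1/plant_geo')
geo.par.material = sprite_mat.name  # apply the sprite material to the whole Geo
```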

Step 3: Setting Up Real-Time Audio Input and Output

To build a system where the plant’s internal signals could be experienced through both visual and auditory channels, I created an audio pipeline using CHOP components in TouchDesigner. This step was essential for receiving real-time sound data and transmitting generated audio from the plant’s electrical signals.

I began by using the Audio Device In CHOP to capture live audio input. This node allowed me to test and verify that TouchDesigner could receive audio either from a local source or from a Python-based MIDI pipeline in later stages. In parallel, I set up the Audio Device Out CHOP to ensure the processed sound could be sent to external speakers or headphones (Fig 3).
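The input/output loop from this step can be summarised in script form as below. This is only a sketch for testing the audio path: the pass-through connection stands in for the plant-driven sound generated in later stages, and the operator class names are assumptions.

```python
# Sketch: capture live audio and route it straight back out for monitoring.
# Device selection is left at the defaults; names are placeholders.
net = op('/project1')

audio_in = net.create(audiodeviceinCHOP, 'audio_in')
audio_out = net.create(audiodeviceoutCHOP, 'audio_out')

# Pass-through for testing; later this input is replaced by the sound
# generated from the plant's biodata (e.g. via the Python/MIDI pipeline).
audio_out.inputConnectors[0].connect(audio_in)
```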

During this stage, I used local music files to test the setup. This helped confirm that both the input and output channels were functioning correctly. By completing this loop, I ensured that any sound generated by the plant's biodata could be heard immediately by the audience, complementing the visuals created in Step 1. This setup allowed the installation to offer a multi-sensory experience in which the audience could hear and see the plant's dynamic expression in real time.

