Week 10: Test and Debug My Immersive Installation & Revisit My Gantt Chart
I focused this week on testing the full pipeline from data collection to visual output. The goal was to ensure that the real-time connection between the plant’s bioelectrical activity and the digital environment remains stable and expressive.
Step 1: Activate My Arduino Bioelectrical Signal Collection Environment
The first step was to activate the Arduino-based biodata collection device. I tested the system’s ability to detect and translate the plant’s electrical signals into MIDI notes (Fig 1). This phase was essential to confirm that the hardware setup was functioning correctly, and that any bugs, such as unstable signals, timing delays, or connection errors, could be identified and resolved early in the process.
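For this kind of early check, a short serial-monitor script is enough to spot dropped samples or timing gaps before anything else is connected. Below is a minimal sketch of the idea using pyserial; the port name, baud rate, and one-reading-per-line format are assumptions standing in for my actual device settings.

```python
import time
import serial  # pyserial

# Hypothetical port name; on macOS it usually looks like /dev/cu.usbmodemXXXX
PORT = "/dev/cu.usbmodem14101"
BAUD = 9600  # assumed to match the Arduino sketch

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    last = time.time()
    while True:
        # Assumes the Arduino prints one numeric reading per line
        line = ser.readline().decode("ascii", errors="ignore").strip()
        now = time.time()
        if not line:
            print("no data for 1 s -- check wiring or the Arduino sketch")
            continue
        print(f"reading={line}  gap={(now - last) * 1000:.0f} ms")
        last = now
```

Watching the gap values is a quick way to separate a genuinely unstable signal from a timing delay in the serial link.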




Step 2: Convert Real-Time Bioelectrical Signals into Generative MIDI Notes and Sound Output
Once the device was responding as expected, I moved on to integrating the data stream into the digital environment. Using PyCharm and Jupyter Notebook, I opened a serial connection with the Arduino and began receiving the real-time signal data. These signals were then converted into corresponding musical notes and rendered as audio output (Fig 2).
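The conversion itself is a mapping from the sensor’s raw range onto a MIDI note range. Here is a minimal sketch, assuming the mido library and a hypothetical 0–1023 input range; the actual scale and velocity handling in my notebook differ.

```python
import mido

# Hypothetical port name; list the real ones with mido.get_output_names()
outport = mido.open_output("IAC Driver Bus 1")

def signal_to_note(raw, lo=0, hi=1023):
    """Map a raw sensor value onto MIDI notes C2-C6 (36-84)."""
    raw = max(lo, min(hi, raw))
    return 36 + int((raw - lo) / (hi - lo) * 48)

raw_value = 512  # e.g. one reading from the serial stream above
note = signal_to_note(raw_value)
outport.send(mido.Message("note_on", note=note, velocity=80))
```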




To connect this system with the TouchDesigner visual pipeline, I routed the MIDI-generated sound through BlackHole, a virtual audio driver that channels sound directly into TouchDesigner’s Audio Device In CHOP. This setup allowed the plant’s bioelectrical activity to be heard and simultaneously visualised in real time (Fig 3).
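On the Python side, the routing only requires addressing BlackHole as the output device. Below is a sketch of the idea using the sounddevice library (my choice of audio backend here is an assumption; any library that lets you pick an output device by name would work the same way).

```python
import numpy as np
import sounddevice as sd

SR = 44100
sd.default.device = "BlackHole 2ch"  # route playback into the virtual driver

def play_note(midi_note, duration=0.3):
    """Render one MIDI note as a sine tone and send it into BlackHole."""
    freq = 440.0 * 2 ** ((midi_note - 69) / 12)  # MIDI number -> Hz
    t = np.linspace(0, duration, int(SR * duration), endpoint=False)
    tone = (0.2 * np.sin(2 * np.pi * freq * t)).astype(np.float32)
    sd.play(tone, SR)
    sd.wait()

play_note(60)  # middle C, audible in TouchDesigner via Audio Device In
```

Anything played this way appears in TouchDesigner as soon as its Audio Device In is set to BlackHole.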
As the sounds flowed into TouchDesigner, they influenced both the particle behavior and the color modulation of the virtual plant. This final test ensured that all parts of the system, including biological sensing, audio generation, data routing, and visual response, were working in harmony.




Setup Process Recording Video
By testing each component individually and as part of a unified flow, I was able to detect issues early and confirm the system’s readiness.
Conclusion
After completing the setup, the plant’s bioelectrical signal successfully generated real-time sound that influenced the behavior of the virtual plant in TouchDesigner. The sound created by the plant did not remain as audio alone. It became a dynamic force within the virtual space.
The pitch and duration of each note directly affected the color intensity of the particles, as well as the scale and speed of their dispersion. Higher pitches triggered brighter colors and wider spread, while longer notes created smoother and more sustained visual movement. Through this connection, the virtual plant became a responsive entity, shaped moment by moment by the living signals of its physical counterpart.
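Expressed as a simple scaling, the mapping looks roughly like the sketch below. The parameter names and ranges are placeholders for what actually lives inside my TouchDesigner network, but they show how pitch and duration become 0–1 control values for the visuals.

```python
def note_to_visuals(midi_note, duration, note_lo=36, note_hi=84, dur_max=2.0):
    """Scale pitch and duration into 0-1 control values for the visuals."""
    brightness = (midi_note - note_lo) / (note_hi - note_lo)  # higher pitch -> brighter color
    spread = brightness                                       # higher pitch -> wider dispersion
    smoothness = min(duration / dur_max, 1.0)                 # longer note -> smoother movement
    return {"brightness": brightness, "spread": spread, "smoothness": smoothness}

print(note_to_visuals(72, 1.5))  # a high, sustained note
```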
Through this process, I learned how to connect hardware sensing with generative visual systems, and how to translate invisible biological data into multi-sensory expression. I also gained practical experience in resolving technical issues across platforms. For example, configuring BlackHole as the default audio output in Python and troubleshooting the communication between Jupyter Notebook, BlackHole, and TouchDesigner’s Audio Device In and Out helped me better understand how different software environments can work together in a real-time data pipeline.
Revisit the Gantt Chart
Reference:
Multi-Output Device (Mac): https://github.com/ExistentialAudio/BlackHole/wiki/Multi-Output-Device
How to set up BlackHole Audio on a Mac: https://cloud.wikis.utexas.edu/wiki/spaces/comm/pages/33425619/How+to+set+up+BlackHole+Audio+on+a+Mac