Hey everyone,
Been experimenting with bridging Resolume Arena and TouchDesigner lately and I'm absolutely blown away by the possibilities. Imagine combining Resolume's real-time performance and intuitive interface with TouchDesigner's unparalleled generative visuals and data integration.
I'm curious whether anyone else has explored this combination. I'm particularly interested in techniques for sending data (audio analysis, MIDI input, etc.) from Resolume into TouchDesigner to drive generative content, and then feeding the resulting visuals back into Resolume as a source.
Currently, I'm using Spout/Syphon for video transfer and OSC for data. It's working reasonably well, but I'm wondering if there are more efficient or elegant solutions out there.
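For anyone curious what the OSC side of this looks like under the hood, here's a minimal sketch of the OSC wire format in pure Python (stdlib only). The address `/td/audio/level`, the 0.75 level value, and the port 7000 are just illustrative assumptions, not anything Resolume or TouchDesigner mandates — in practice you'd map whatever addresses your OSC In CHOP/DAT is listening for:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args: float) -> bytes:
    # Build a minimal OSC packet: padded address string,
    # padded type-tag string (",f" per float), then big-endian float32 args.
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

# Hypothetical example: send an audio-level value over UDP.
packet = osc_message("/td/audio/level", 0.75)
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 7000))
```

In a real setup you'd probably reach for a library like python-osc (or TouchDesigner's built-in OSC In CHOP) rather than hand-rolling packets, but seeing the format makes it easier to debug with Wireshark when messages silently fail to arrive.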
Also, any tips on optimizing the workflow for live performance would be greatly appreciated. Latency is a concern, and I'm always looking for ways to minimize it.
Looking forward to hearing your experiences and ideas! Let's brainstorm some creative possibilities. Has anyone successfully used this setup for a live gig or installation? Share your insights!