Joining in from Michigan, USA

fwc

Hi all - I'm a newbie specifically interested in working with RGB pixels - GE ColorEffects to be precise - to create what I've seen termed variously as "color organs" or "music visualizers", i.e. I'd like to drive lots of individually addressable RGB pixels with real-time audio (typically music) for our backyard entertainment. So far I've identified the SanDevices E682 board as a viable controller, as it directly supports the GE ColorEffects pixels and is a reasonable price in USD.

Beyond these components, however, I'm a bit fuzzy on the upstream generation of the E1.31 data from the audio source. Mathematically I understand that a DFT/FFT and digital filtering in general are applied, but I haven't yet sorted out what this means in terms of lighting control software, and exactly how the audio stream (MIDI? Apple Lossless? MP3? FLAC?) gets mapped onto the pixels' color spectrum to generate the E1.31 output.

I'd like to accomplish this on a personal budget without tipping into heavyweight costs - I'm doing this for fun and to entertain our neighborhood over the summer months (of course along with some kind of cool Christmas lighting), and perhaps with an ambient component (we live in an urban university environment, with lots of possible audio event sources).
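
Just to make what I'm imagining concrete, here's a rough Python sketch of what I *think* happens under the hood: grab a short block of audio, FFT it, bin the spectrum into one band per pixel, and turn the band levels into per-pixel RGB values that an E1.31 sender would then push to the controller. The constants (pixel count, block size) and the color mapping are made-up placeholders for illustration, not how any particular package actually does it:

```python
import numpy as np

NUM_PIXELS = 50          # placeholder: e.g. one 50-node GECE string
BLOCK_SIZE = 1024        # ~23 ms of audio at 44.1 kHz per update

def audio_block_to_rgb(samples):
    """Map one block of mono samples (floats in -1..1) to a list of
    (r, g, b) tuples, one per pixel, by binning the FFT magnitude."""
    window = np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(samples * window))

    # One frequency band per pixel: low frequencies land on the first pixel.
    bands = np.array_split(spectrum, NUM_PIXELS)
    levels = np.array([band.mean() for band in bands])

    # Normalize so a loud block doesn't just clip everything to full white.
    if levels.max() > 0:
        levels = levels / levels.max()

    rgb = []
    for i, level in enumerate(levels):
        # Made-up color mapping: band energy drives brightness,
        # position along the strip drives hue (red end -> blue end).
        hue = i / NUM_PIXELS
        r = int(255 * level * (1 - hue))
        g = int(255 * level * 0.3)
        b = int(255 * level * hue)
        rgb.append((r, g, b))
    return rgb

# The flat channel list (r1, g1, b1, r2, g2, b2, ...) is what the lighting
# software would then stuff into E1.31/sACN packets each frame and send to
# the controller (the E682 in my case).
```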
 
 
Welcome to ACL, fwc.
To do what you're talking about, the only software I know of is Madrix. It's waaaaaaaay above what most would consider a personal budget, though. It does real-time effects, whereas most of what's done by the peeps here is sequences that have hours or days spent on a couple-of-minute song.
 
Welcome to ACL!


Have a look at Lightjams; it might do what you are looking for.


Also, if you haven't already purchased the GECE, you may want to look at WS2811 pixel nodes like these and the J1sys controllers.
 
Thanks all for the welcome! I'll poke around on the forum and perhaps start up a new topic to share feedback on my hunt for real-time pixel control software - maybe others would want to incorporate a dynamic response element into more typical sequence programming as a way of creating interactions using ambient sensors (voices, proximity, movement, etc.).
 
Welcome to ACL.

I'm not an expert on the topic, but you can sort of sequence the way you're talking about in LOR's Superstar software. I was down to the last minute this past year trying to add 5 Color Ribbons into my songs. I finished some of the songs manually and used what they call the Instant Sequencer for a couple of them. You can configure it to apply different patterns to different ribbons based on a lot of different factors, like amplitudes and frequencies. I barely scratched the surface of what was available, but it produced effects on the ribbons that you could tell were responding to various beats and sounds in the audio file.

The limitation was that I could only pick one type of effect per ribbon group. So what I did was select just a piece of the song, run the Instant Sequencer on it, then move on to another section of the song and change the effect that would be randomly applied to each ribbon group. Even if you don't have ribbons, you could use that tool, export the data as an LMS sequence, and then paste it on top of other RGB elements.

I'm planning on trying to move from the LOR software to LSP, but I'll probably still use the Superstar sequencer for some elements and export/import the data over to LSP...assuming I can get the LSP import working for that data.

Gil
 