Here’s a short teaser of the live A/V performance system we (Ethno Tekh) are currently developing. We’ll be releasing many more videos in the near future, including performances with much higher recording quality (apologies for this one), as well as interviews with both Brad and me outlining and discussing our project and approach.
For the moment, please enjoy the very first video featuring our new system!
I’ve recently teamed up with visual artist and programmer Brad Hammond (XY01) for our project Ethno Tekh. We’ve been slamming it hard: since teaming up in early August, we’ve already presented our first public interactive installation, ‘public override Trichild()’, which was a great success, as well as our first performance to an audience of over 3,000 people at Microsoft’s TechEd 2012.
We’re focusing on real-time, interactive and generative digital artworks, as well as larger-than-life, futuristic bass music A/V performances, all performed completely live using motion capture and audio-reactive visuals.
Video of the TechEd performance is coming soon, but for now there are these couple of photos. Head to the Ethno Tekh Facebook page to keep updated on the project. We’ve got some interesting stuff coming up.
Here’s a video from our debut interactive installation ‘public override Trichild()’:
Since April 2011 I’ve been working solidly with the Microsoft Kinect, developing my software, Kinectar, to enable its use as a MIDI controller for performing music live. I’ve done a number of performances around Australia since I started the project; however, it’s safe to say that, although I would consider myself an electronic musician, I’m certainly no dancer. Enter Paul…
Dancer Paul Walker and I have joined forces to bring the Kinect-controlled music concept into the world of contemporary dance. Recently we obtained a residency at PACT theatre (centre for emerging artists), where we spent the week developing different ways of implementing my Kinect music control system in a dance context.
This is a video someone posted on YouTube of one of my performances at the EB Games Expo on the Gold Coast over the weekend. I had a ton of fun playing in front of the 15,000 or so fellow nerds and checking out all the latest games! Skyrim looks amazing!
Another demo of some recent mappings I’ve come up with for the Kinect. So far it’s been the most complicated setup to learn, but it’s certainly been the most fun. It uses eight parameters from my hands: six positions and two speed values. Any sample works with this setup, but this one was great because of the bass in the guy’s voice, which helped me get some nice bassline-sounding modulations out of it.
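For the curious, those eight parameters could be derived along these lines. This is only a minimal Python sketch of the idea — the working-volume ranges, the speed cap, and the function names are my assumptions for illustration, not what Kinectar actually uses:

```python
# Hypothetical sketch: derive 8 control parameters from two hand positions —
# 6 position values (x, y, z per hand) scaled to the MIDI CC range 0-127,
# plus 2 per-hand speed values computed between consecutive frames.
import math

def scale(value, lo, hi):
    """Clamp value into [lo, hi] and map it linearly onto 0-127."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

def hand_params(prev, curr, dt):
    """prev/curr: dicts of 'left'/'right' -> (x, y, z) in metres; dt in seconds."""
    params = []
    for hand in ("left", "right"):
        x, y, z = curr[hand]
        # Assumed working volume: ~1 m wide and high, 0.5-2.5 m from the sensor.
        params += [scale(x, -0.5, 0.5), scale(y, -0.5, 0.5), scale(z, 0.5, 2.5)]
    for hand in ("left", "right"):
        speed = math.dist(prev[hand], curr[hand]) / dt  # metres per second
        params.append(scale(speed, 0.0, 3.0))  # cap assumed at 3 m/s
    return params  # 6 positions + 2 speeds = 8 CC-ready values
```

Each of the eight values lands in 0-127, so it can be sent straight out as a MIDI control change to modulate filter cutoff, sample position, or whatever else is mapped in the DAW.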
A live performance I did on June 1st @ Microsoft’s REMIX11 conference using my custom Kinect/Ableton/Max setup. I’ve moved on from just controlling a bassline, and now have it set up with a whole assortment of virtual instruments, which I loop live to create entire tracks. The setup:
- Ableton Live
- NI Massive
- Max/MSP (making useful MIDI data out of the hand coordinates from OSCeleton)
- Behringer FCB1010 (controlling Ableton, i.e. looping, switching the Kinect’s instrument, etc.)
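To give a feel for the Max/MSP step in that chain — turning raw hand coordinates into playable MIDI — here’s a minimal Python sketch of one way it could work. The scale choice, note range, and coordinate range are my assumptions for illustration, not a copy of the actual Max patch:

```python
# Hypothetical sketch: quantise a hand-height value (as OSCeleton reports
# joint coordinates) onto a minor pentatonic scale, so that any gesture
# lands on a musically usable MIDI note.
MINOR_PENT = [0, 3, 5, 7, 10]  # semitone offsets of a minor pentatonic scale

def y_to_note(y, root=48, octaves=2, lo=-0.5, hi=0.5):
    """Map hand height y (metres, assumed range lo..hi) to a MIDI note number."""
    y = max(lo, min(hi, y))
    steps = len(MINOR_PENT) * octaves          # total playable scale degrees
    idx = min(int((y - lo) / (hi - lo) * steps), steps - 1)
    octave, degree = divmod(idx, len(MINOR_PENT))
    return root + 12 * octave + MINOR_PENT[degree]
```

Quantising to a scale like this is what makes free-form hand movement sound intentional: the performer controls phrasing and register, while wrong notes are impossible by construction.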
Toying around with live vocal sounds performed by a friend, Thom. I fed his voice through a chain of effects (delays, reverb, granular processing) that I manipulated at the same time. Some nice moments in there.