Brady Gaster is a Christian dad who lives near Seattle, Washington. At work, he and his amazing colleagues work together to make it fun for .NET developers to party in the cloud. At home, he tinkers with MIDI hardware and makes loud music amidst a hurricane of wires.
Update: I've uploaded the code I wrote for this demonstration project to GitHub, into my KinectControlledNetduino public repository. I also forgot to mention that the Kinect code makes use of the excellent gesturing engine created by David Catuhe, which you can read about on his blog or download from CodePlex.
I'll put the code up here tomorrow, time and energy permitting, but for the time being the title says it all. There's a Kinect, it controls a WPF app, that app sends messages to an HTTP server running on a Netduino, which is connected to a servo.
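To give a feel for the Netduino half of that pipeline, here's a minimal sketch of a socket-based HTTP listener that pulls an angle out of the request URL and drives a hobby servo via PWM. The pin, port, route, and query-string name are my assumptions for illustration, not necessarily what the real project uses; it assumes the SecretLabs Netduino firmware libraries.

```csharp
// Hypothetical sketch: tiny HTTP listener on a Netduino that parses an
// angle from a request like "GET /servo?angle=90" and positions a servo.
// Pin choice (D9), port 80, and the query parameter name are assumptions.
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using SecretLabs.NETMF.Hardware;
using SecretLabs.NETMF.Hardware.Netduino;

public class ServoServer
{
    public static void Main()
    {
        // Servo signal wire on digital pin 9 (assumed wiring).
        PWM servo = new PWM(Pins.GPIO_PIN_D9);

        Socket listener = new Socket(AddressFamily.InterNetwork,
                                     SocketType.Stream, ProtocolType.Tcp);
        listener.Bind(new IPEndPoint(IPAddress.Any, 80));
        listener.Listen(1);

        while (true)
        {
            using (Socket client = listener.Accept())
            {
                byte[] buffer = new byte[512];
                int read = client.Receive(buffer);
                string request = new string(Encoding.UTF8.GetChars(buffer));

                // Look for the assumed "angle=" query parameter.
                int index = request.IndexOf("angle=");
                if (index >= 0)
                {
                    int end = request.IndexOf(' ', index);
                    string value = request.Substring(index + 6, end - (index + 6));
                    int angle = int.Parse(value);

                    // Map 0-180 degrees onto a 1000-2000 microsecond pulse
                    // within a standard 20 ms servo frame.
                    uint duration = (uint)(1000 + (angle * 1000 / 180));
                    servo.SetPulse(20000, duration);
                }

                client.Send(Encoding.UTF8.GetBytes("HTTP/1.1 200 OK\r\n\r\n"));
            }
        }
    }
}
```

The single-connection `Listen(1)` backlog is plenty here, since only one WPF client ever talks to the board.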
Using hand gestures like "SwipeRight" or "SwipeLeft," a user can literally wave at the Kinect to tell it how to tell the Netduino server how to angle the servo. Pretty neat, and quite a bit easier than I'd expected. I'll post the code ASAP, but for now here's a video demonstrating how it works.
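On the WPF side, the idea is simple: when the gesture engine reports a swipe, nudge a tracked angle and send it to the Netduino over HTTP. Here's a rough sketch of what that handler could look like; the gesture names, step size, board address, and URL format are all assumptions for illustration.

```csharp
// Hypothetical sketch: translate detected swipe gestures into HTTP GETs
// that tell the Netduino where to point the servo. The board's address,
// the "/servo?angle=" route, and the 30-degree step are assumptions.
using System;
using System.Net;

public class ServoGestureClient
{
    private const string NetduinoUrl = "http://192.168.1.100/servo?angle="; // assumed address
    private int _angle = 90; // start centered

    // Wire this up to the gesture engine's detection event.
    public void OnGestureDetected(string gesture)
    {
        if (gesture == "SwipeRight")
            _angle = Math.Min(180, _angle + 30);
        else if (gesture == "SwipeLeft")
            _angle = Math.Max(0, _angle - 30);
        else
            return; // ignore gestures we don't handle

        using (var client = new WebClient())
        {
            // Fire-and-forget GET; the Netduino parses the angle parameter.
            client.DownloadString(NetduinoUrl + _angle);
        }
    }
}
```

Clamping the angle on the client keeps the Netduino's job dead simple: it just obeys whatever angle arrives.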