"Limbniz"
Uses an Asus Xtion Pro with OpenNI2 + NiTE 2.2. A simple toy prototype developed to demonstrate some other stuff: it controls a bytebeat sound function with three parameters, using directional movements of the hand to increment/decrement them.

The detector here is simple: it looks for a "long enough" "straight" movement, i.e. it decides which of the six cardinal directions - left/right/up/down/in/out - the movement vector mostly points to (if its magnitude is big enough). It does this comparison between the last two NiTE hand tracker frames in the X-Y-Z space perceived by the sensor, and adjusts the parameter values when a movement is recognized. In this case it's also set to be quite sensitive: a 5 mm movement threshold per frame, at least two consecutive frames roughly towards the same cardinal direction, and no cooldown period. It could be better, but it works surprisingly well. The sensor and code could handle a huge number of skeleton joints the same way, but I haven't yet had time to experiment with that; making it comfortable might be a bit tricky, but probably funny.

The screen recording ended up poor in quality, sorry. It's much more fun with crisp sound from earphones and when you let the sound evolve long enough. The parameter values are pretty arbitrary. YouTube complains about brackets in the description, so using SHR to mean bitwise shift right, the function played is: t SHR t^3 * a * 2 + (t SHR b & 16) * 2^(t SHR 2) / 4 & c * t (and b is useless), blasted at a glorious 8000 Hz sample rate.
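The per-frame detection described above can be sketched roughly like this in C. This is a minimal sketch, not the actual prototype code: the type and function names are mine, and which sign of which axis maps to left/right/up/down/in/out depends on the sensor's coordinate frame, so that mapping is an assumption.

```c
#include <assert.h>
#include <math.h>

/* NiTE reports hand positions as 3-D points in millimeters;
   this struct and the names below are hypothetical. */
typedef struct { float x, y, z; } Vec3;

enum Dir { DIR_NONE, DIR_LEFT, DIR_RIGHT, DIR_UP, DIR_DOWN, DIR_IN, DIR_OUT };

/* Classify the movement between two consecutive tracker frames:
   pick the axis with the largest absolute delta, and require the
   movement to exceed the threshold (5 mm in the prototype). */
enum Dir classify(Vec3 prev, Vec3 cur, float threshold) {
    float dx = cur.x - prev.x, dy = cur.y - prev.y, dz = cur.z - prev.z;
    float ax = fabsf(dx), ay = fabsf(dy), az = fabsf(dz);
    if (ax < threshold && ay < threshold && az < threshold)
        return DIR_NONE;                       /* movement too small */
    if (ax >= ay && ax >= az) return dx > 0 ? DIR_RIGHT : DIR_LEFT;
    if (ay >= ax && ay >= az) return dy > 0 ? DIR_UP : DIR_DOWN;
    return dz > 0 ? DIR_OUT : DIR_IN;
}

/* Debounce: fire only after the same direction has been seen on
   `needed` consecutive frames (two in the prototype). There is no
   cooldown, so it keeps firing while the movement continues. */
enum Dir update(enum Dir *last, int *streak, enum Dir d, int needed) {
    if (d != *last) { *last = d; *streak = 1; }
    else            { ++*streak; }
    return (d != DIR_NONE && *streak >= needed) ? d : DIR_NONE;
}
```

In use, you would call `classify` on each new hand-tracker frame against the previous one, feed the result through `update`, and bump the matching parameter whenever it returns something other than `DIR_NONE`.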
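Since brackets are allowed here, the formula can also be written as a plain C expression, with SHR as `>>` and `^` meaning XOR as usual in bytebeat, grouping per normal C operator precedence. The function name and wrapper are mine, just to make the one-liner testable:

```c
/* One sample of the formula from the description. Note that t >> t is
   formally undefined in C once t reaches the type's bit width; on
   common hardware the shift count wraps, which is part of the usual
   bytebeat flavor. The result is truncated to an unsigned 8-bit
   sample, as bytebeat expects. */
unsigned char bytebeat(unsigned t, unsigned a, unsigned b, unsigned c) {
    return t>>t^3*a*2+(t>>b&16)*2^(t>>2)/4&c*t;
}
```

Looping `t` upward and writing each sample to stdout, then piping into something like ALSA's `aplay -r 8000 -f U8`, plays it at the 8000 Hz rate mentioned above.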