Modified the pre-built Skeleton Viewer sample in the Kinect SDK to demonstrate control over individual points of articulation by randomly changing the color and size of the head joint.
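A rough sketch of the idea (in Python rather than the SkeletonViewer sample's C#, and with made-up color/size values, so this is illustrative only): each frame, draw the head joint with a randomly chosen color and radius instead of the default style.

```python
import random

# Hypothetical color palette and size range; the real SkeletonViewer
# sample uses WPF brushes and ellipse sizes, not these values.
COLORS = ["red", "green", "blue", "yellow"]

def head_style(rng=random):
    """Pick a random (color, radius) pair for drawing the head joint."""
    color = rng.choice(COLORS)
    radius = rng.uniform(5.0, 20.0)  # pixels, chosen arbitrarily
    return color, radius

color, radius = head_style()
print(color, radius)
```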
I am now able to compare the positions of points of articulation: the left arm becomes thicker when the left hand is raised above the left shoulder. While admittedly mundane, this ability will potentially be the linchpin for large portions of my interface. Being able to detect and compare the positions of joints in the arm will let me give people special commands that are enabled by holding an arm a certain way.
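The comparison itself is just a check of two joints' Y coordinates. Here is a minimal sketch of that logic in Python with hypothetical joint data (the actual Kinect SDK exposes joints through its C# skeleton API, not a dict like this); in Kinect skeleton space, larger Y means higher.

```python
def left_hand_raised(joints):
    """Return True when the left hand is above the left shoulder.

    `joints` maps joint names to (x, y, z) tuples; this layout is an
    assumption for the sketch, not the SDK's data structure.
    """
    return joints["HandLeft"][1] > joints["ShoulderLeft"][1]

def arm_thickness(joints, normal=3.0, thick=9.0):
    """Pick a draw thickness for the left arm based on the gesture."""
    return thick if left_hand_raised(joints) else normal

# Example frame: hand at y=0.6 is above shoulder at y=0.4,
# so the arm is drawn thick.
frame = {"HandLeft": (0.2, 0.6, 1.5), "ShoulderLeft": (0.1, 0.4, 1.5)}
print(arm_thickness(frame))  # 9.0
```

The same pattern generalizes to any "hold your arm a certain way" command: pick the joints involved, compare their coordinates, and toggle a mode when the condition holds.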
Ideally, there will be pre-built gesture support (that I haven't yet investigated), but presuming there isn't (or presuming that the gestures I want to use aren't supported), I now know I can build them myself. It probably isn't necessary to say that's a big deal.

