Abstraction Layer
- Oct 19, 2012
- 0 Comments
After running into additional issues while trying to integrate the Kinect (jnect) with Looking Glass, I have spent the past week building several classes to bridge jnect and Looking Glass. These new classes gather data from the Kinect in a format that Looking Glass can use. I was able to run a test of this abstraction layer and confirm that data is being gathered from the Kinect. The next step is to use this data in a Looking Glass event.
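To give a rough idea of what such a bridge might look like, here is a minimal sketch. None of these names come from the actual project or from jnect's API; KinectSkeletonSource, Joint, and JointId are hypothetical stand-ins for whatever the real abstraction layer classes are, and the sketch only illustrates the general idea of caching Kinect joint data in a plain-Java form that Looking Glass code can poll.

```java
// Hypothetical sketch only: these class names are stand-ins, not the
// project's real abstraction-layer classes or jnect's API.
import java.util.EnumMap;
import java.util.Map;

enum JointId { HEAD, RIGHT_HAND, LEFT_HAND }

/** Immutable snapshot of a single tracked joint in Kinect skeleton space. */
final class Joint {
    final double x, y, z;
    Joint(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

/**
 * Bridges the Kinect and Looking Glass: receives skeleton updates
 * (e.g. from a jnect callback) and caches the latest joint positions
 * so that Looking Glass events can read them on demand.
 */
class KinectSkeletonSource {
    private final Map<JointId, Joint> latest = new EnumMap<>(JointId.class);

    /** Called whenever new joint data arrives from the Kinect. */
    synchronized void update(JointId id, double x, double y, double z) {
        latest.put(id, new Joint(x, y, z));
    }

    /** Returns the most recent position of a joint, or null if not yet tracked. */
    synchronized Joint get(JointId id) {
        return latest.get(id);
    }
}
```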
I am currently working on a Right Hand Above Head Event, which will cause an object to move up when the user's hand rises above their head and move back down when the hand drops below it. I am also working on a Right Hand Move Event that will cause an object to move in response to the user's hand movement: the object should move to x, y, z coordinates that correspond to the hand's x, y, z coordinates. Once these events are functional, the next step is to map the user's moving body parts to a Looking Glass character's body parts, so that the character moves on screen to reflect the user's movements.
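As a rough illustration of the checks these two events involve, here is a small sketch built on the hypothetical KinectSkeletonSource from the previous snippet. The real Looking Glass event classes will look different; this only shows the "hand above head" comparison and the hand-position lookup that the move event would feed to an object.

```java
// Hypothetical sketch only: RightHandEvents is not a real Looking Glass class.
class RightHandEvents {
    private final KinectSkeletonSource skeleton;

    RightHandEvents(KinectSkeletonSource skeleton) { this.skeleton = skeleton; }

    /** Right Hand Above Head: true when the right hand is higher than the head. */
    boolean isRightHandAboveHead() {
        Joint hand = skeleton.get(JointId.RIGHT_HAND);
        Joint head = skeleton.get(JointId.HEAD);
        return hand != null && head != null && hand.y > head.y;
    }

    /**
     * Right Hand Move: report the hand's current position so a listener can
     * move an object to matching x, y, z coordinates in the scene.
     */
    Joint rightHandPosition() {
        return skeleton.get(JointId.RIGHT_HAND);
    }
}
```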