Earlier in the week I was able to make an object in Looking Glass move in response to my right hand movements. My initial approach was not a good long-term implementation, but it at least provided a proof of concept that items in Looking Glass can move in response to Kinect data. Today, using a better implementation, I was able to cause a character's right and left hands to move in response to a user's right and left hands. The arms of the character were distorted, but it was a good step. Next week I will continue to work on these body movements and on applying a transformation to the Kinect joint information so that the character's movements look normal (without distortion).
As I have been running into technical challenges between the Kinect and Looking Glass, I have been giving a lot of thought to movements and events that will be useful in Looking Glass therapy games. I first explored Kinect gestures. A gesture in this case is a specific movement that can be detected. A gesture is not a continuous movement; it is a movement that has either happened or not. For example, when a user moves their right hand above their head, an object in Looking Glass moves up. When the user moves their right hand below their head, the object moves down. From a therapy standpoint, the goal for stroke survivors or those with cerebral palsy is often to build strength and increase range of motion. Events such as Right Hand Above Head could be used in a Looking Glass game to address both of these goals. Various events could be defined and used for those with varying abilities. For example, a user with limited range of motion could play a game with a Right Hand Above Hip event, where just raising their hand above their hip would trigger the object in Looking Glass to go up, and lowering their hand would cause the object to go back down. Right Hand Above Spine, Right Hand Above Shoulder, and finally Right Hand Above Head could then be used for those with varying abilities. This concept of gesture movements could also apply to left hand up and down movements, as well as side-to-side movements with either arm (where various gestures could require a certain amount of movement to make an object move side to side).
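The gesture idea above can be sketched roughly as follows. This is just a minimal illustration, not the actual Kinect SDK or Looking Glass API: the joint names, the upward-positive Y convention, and the sample coordinate values are all assumptions made for the example. The point is that a single comparison against a configurable reference joint (hip, spine, shoulder, or head) gives a family of gestures graded by difficulty.

```java
// Hypothetical sketch of graded "Right Hand Above <Joint>" gestures.
// Joint names, coordinates, and the upward-positive Y axis are assumptions,
// not the real Kinect SDK API.
public class GestureSketch {

    // Reference joints, ordered from easiest to hardest to raise a hand above.
    enum ReferenceJoint { HIP, SPINE, SHOULDER, HEAD }

    /**
     * A gesture either has or has not happened: the hand is above
     * the chosen reference joint, or it is not.
     */
    static boolean rightHandAbove(double rightHandY, double referenceY) {
        return rightHandY > referenceY;
    }

    public static void main(String[] args) {
        // Simulated joint heights in meters (made-up values).
        double hipY = 0.9;
        double headY = 1.7;
        double rightHandY = 1.5;

        // A user with limited range of motion triggers the easier gesture...
        System.out.println(rightHandAbove(rightHandY, hipY));
        // ...but not yet the harder one.
        System.out.println(rightHandAbove(rightHandY, headY));
    }
}
```

In a game, the triggered gesture would fire an event that moves the object up, and its release (hand back below the joint) would move the object back down; choosing a different `ReferenceJoint` tunes the same game to a different ability level.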
Although gesture-based events seem to provide some utility, what might be better is having an object in a game that tracks continuously with the user's hand movements (or possibly elbow movements, depending on the desired exercise). In a game environment, an object could move up and down in direct correspondence with the user's up and down hand movements. In order to accommodate for, and also gently increase, range of motion, a factor could be applied to the user's movements. For example, someone with limited range of motion could choose a factor of 5, so that their upward hand movement multiplied by 5 would be reflected by the object on screen. This would cause a relatively large upward movement in the object for a small upward movement of the user's hand. As the user's range of motion increased, a smaller factor could be applied.
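The continuous-tracking idea with a range-of-motion factor might look something like this. Again, this is only a sketch under assumptions: the resting hand height, the object's baseline position, and the coordinate convention are all invented for the example; the real mapping would use live Kinect joint data.

```java
// Hypothetical sketch of continuous hand tracking with a range-of-motion
// factor. All joint values and the upward-positive Y axis are assumptions.
public class ScaledTracking {

    /**
     * Maps a hand position to an on-screen object position: the hand's
     * displacement from its resting height is multiplied by the chosen
     * factor, so a small hand movement can produce a large object movement.
     */
    static double objectY(double restHandY, double currentHandY,
                          double factor, double objectBaseY) {
        return objectBaseY + factor * (currentHandY - restHandY);
    }

    public static void main(String[] args) {
        double restHandY = 1.0;  // hand height at rest (made-up value)
        double objectBase = 0.0; // object's starting height in the scene

        // A user with limited range raises their hand 0.5 m; with a factor
        // of 5 the object rises 2.5 m, rewarding a small movement with a
        // large visible effect.
        System.out.println(objectY(restHandY, 1.5, 5.0, objectBase)); // 2.5

        // As range of motion improves, a smaller factor makes the same
        // hand movement produce a smaller object movement.
        System.out.println(objectY(restHandY, 1.5, 2.0, objectBase)); // 1.0
    }
}
```

Lowering the factor over time is what would "gently increase" range of motion: to keep getting the same on-screen effect, the user has to move farther.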
This week I will continue to work on moving objects and characters in Looking Glass with the Kinect, and will be thinking in terms of how these movements can be used in therapy games.