
Hi Guys


Please have a look at http://www.youtube.com/watch?v=0P2XJcEPkFU. It is a short promotional video of my application for the Intel Perceptual Computing Challenge 2013. The application fully utilizes the hand-tracking and voice-control capabilities of Creative's Interactive Gesture Camera.

The application is built on the Intel-sponsored Havok 2012 PC XS technology. The hand skeletal tracking library is used to obtain fingertip points and the palm normal; I found it far better than the standard PerC SDK gesture->QueryNodeData(). These data are the inputs to my physical model of the hand (skeletal inverse kinematics, rigid-body dynamics driving, constraints, etc.).
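To give an idea of the inverse-kinematics step, here is a minimal sketch of the general technique (not the actual Havok/PerC code, and all names are hypothetical): analytic two-bone IK in the palm plane, recovering joint angles for one finger chain from a tracked fingertip point.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical two-bone analytic IK sketch for a single finger chain.
// Given a fingertip target (tx, ty) in the palm's local plane and the two
// phalanx lengths l1, l2, recover the base-joint angle and the middle-joint
// bend that place the fingertip on the target.
struct FingerPose {
    double base; // absolute angle of the proximal phalanx
    double bend; // rotation of the distal phalanx relative to the proximal one
};

FingerPose solveTwoBoneIK(double tx, double ty, double l1, double l2) {
    const double kPi = std::acos(-1.0);
    double d = std::sqrt(tx * tx + ty * ty);
    // Clamp the target distance into the reachable annulus so acos stays
    // in its domain even for noisy camera samples.
    const double dMax = l1 + l2;
    const double dMin = std::fabs(l1 - l2) + 1e-9;
    if (d > dMax) d = dMax;
    if (d < dMin) d = dMin;
    // Law of cosines: interior angle at the middle joint.
    double cosMid = (l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2);
    double mid = std::acos(cosMid);
    // Base angle: direction to the target minus the interior triangle angle.
    double cosInner = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d);
    double base = std::atan2(ty, tx) - std::acos(cosInner);
    return {base, kPi - mid};
}
```

In the real application the solved pose would then drive the rigid bodies of the hand through constraints rather than being applied directly.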

The main goal of my application is to show not only full immersion into the physical world, but interaction with autonomous AI objects as well (following a finger, helibots landing on the hand, dynamic avoidance of the hand while following prescribed agent behavior; a character can jump onto, surf, and walk on the virtual hand, going from pinky to thumb, etc.). The virtual hand in the physical world drives changes in the AI/animation system as well.
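The "following a finger" behavior can be sketched with a simple steering rule. This is a hypothetical illustration of the kind of agent logic involved (names and constants are mine, not from the application): kinematic seek with an arrival radius, so the agent slows down and settles on the tracked fingertip instead of orbiting it.

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

// One simulation step of a "follow the fingertip" agent: move toward the
// target at full speed when far away, and slow down proportionally once
// inside slowRadius so the agent comes to rest on the target.
Vec2 followStep(Vec2 pos, Vec2 target, double maxSpeed,
                double slowRadius, double dt) {
    double dx = target.x - pos.x, dy = target.y - pos.y;
    double dist = std::sqrt(dx * dx + dy * dy);
    if (dist < 1e-9) return pos; // already on the fingertip
    double speed = maxSpeed * std::fmin(1.0, dist / slowRadius);
    pos.x += dx / dist * speed * dt;
    pos.y += dy / dist * speed * dt;
    return pos;
}
```

Called once per frame with the fingertip position from the tracker, this converges smoothly even when the hand moves.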

I hope you like it.

Ivan

