Yes, there is a function to change the hand parameters. See the hskl.h header file:
void hsklSetHandMeasurements ( hskl_tracker tracker, float width, float length ); /* Width, length in meters. Defaults to 0.08 and 0.19 */
Alternatively, in the inline-header C++ convenience wrapper hsklu.h (which connects hskl to the library that provides the camera/depth data), this call is passed through to the corresponding C interface via the method:
hskl::Tracker::SetHandMeasurements(float width, float length)
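As a rough sketch only: how the tracker handle is created/initialized is not covered in this post, so the snippet below simply takes an already-initialized hskl_tracker, and the measurement values are just example numbers.

#include "hskl.h"   // C interface; hsklu.h wraps this in hskl::Tracker

// Sketch: shrink the hand model slightly from the defaults of
// 0.08 m (width) and 0.19 m (length), e.g. for a smaller user's hand.
void shrink_hand_model(hskl_tracker tracker)
{
    hsklSetHandMeasurements(tracker, 0.07f, 0.17f);  /* width, length in meters */
}

// Equivalent call through the hsklu.h C++ convenience wrapper:
//   tracker.SetHandMeasurements(0.07f, 0.17f);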
The viewer program lets the user manually adjust the hand size using WASD keys to scale the underlying hand model.
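The viewer's exact key handling is not included in this post, but conceptually it amounts to nudging the two measurements and re-submitting them through the same call. A sketch of that idea (the key bindings and step size here are made up for illustration, not taken from the viewer source):

#include "hskl.h"

// Assumes a valid hskl_tracker and that the current width/length are kept
// as application state; defaults match the library's 0.08 m x 0.19 m.
struct HandSize { float width = 0.08f; float length = 0.19f; };

void on_key(char key, HandSize& size, hskl_tracker tracker)
{
    const float step = 0.005f;                 // 5 mm per key press (arbitrary)
    switch (key)
    {
        case 'w': size.length += step; break;  // longer hand
        case 's': size.length -= step; break;  // shorter hand
        case 'd': size.width  += step; break;  // wider hand
        case 'a': size.width  -= step; break;  // narrower hand
        default: return;
    }
    hsklSetHandMeasurements(tracker, size.width, size.length);
}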
The API/library only exposes these two scale factors (overall length and width); it does not provide access to modify individual finger geometry, tweak the allowable joint ranges, etc. Yes, the underlying technology is based on fitting a 3D articulated rigid-body model to a point cloud, although rather than being shipped as a separate data file, the model is embedded inside the DLL.
This hand skeleton (hskl) library was provided as an experimental release and is not part of the main SDK. It was intended for those interested in early access to full motion-capture hand pose estimation. The forum link that leads to the download page describes the intended audience: "...At this time, we invite highly-skilled 3D graphics and interaction C/C++ developers to download this library and explore interaction possibilities..." The documentation (page 2) is quite up front about the limits of the tracking technology: "The output of tracking library will not always match the pose of the user’s hand".
The hand rendered in the provided samples is the same model used for tracking. The download does not include any higher-quality visual demos where a skinned mesh model is rendered using the hand pose provided by the tracking library. However, if one has the art content, it should not be difficult to use the hskl skeleton pose to drive it.
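For instance, a plain linear-blend-skinning pass can consume the per-bone transforms. The sketch below is generic and not tied to the hskl API: the current bone matrices are assumed to have already been read from the tracker and converted to 4x4 form by the caller, and the inverse bind-pose matrices come from the skinned mesh asset itself.

#include <array>
#include <vector>

struct Vec3 { float x, y, z; };
struct Mat4 { float m[16]; };                       // column-major 4x4

static Vec3 transform_point(const Mat4& a, const Vec3& p)
{
    return { a.m[0]*p.x + a.m[4]*p.y + a.m[8]*p.z  + a.m[12],
             a.m[1]*p.x + a.m[5]*p.y + a.m[9]*p.z  + a.m[13],
             a.m[2]*p.x + a.m[6]*p.y + a.m[10]*p.z + a.m[14] };
}

static Mat4 multiply(const Mat4& a, const Mat4& b)  // returns a * b
{
    Mat4 r{};
    for (int c = 0; c < 4; ++c)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r.m[c*4 + row] += a.m[k*4 + row] * b.m[c*4 + k];
    return r;
}

struct SkinnedVertex {
    Vec3 bind_position;                             // position in the bind pose
    std::array<int, 4>   bone;                      // up to four influencing bones
    std::array<float, 4> weight;                    // weights, summing to 1
};

// bone_world: current bone transforms driven by the tracked pose (obtained
// however the library exposes them); inv_bind: inverse bind-pose transforms
// taken from the mesh asset.
std::vector<Vec3> skin_mesh(const std::vector<SkinnedVertex>& verts,
                            const std::vector<Mat4>& bone_world,
                            const std::vector<Mat4>& inv_bind)
{
    std::vector<Vec3> out(verts.size());
    for (size_t i = 0; i < verts.size(); ++i) {
        Vec3 acc{0.0f, 0.0f, 0.0f};
        for (int j = 0; j < 4; ++j) {
            const float w = verts[i].weight[j];
            if (w <= 0.0f) continue;
            const int   b    = verts[i].bone[j];
            const Mat4  skin = multiply(bone_world[b], inv_bind[b]);
            const Vec3  p    = transform_point(skin, verts[i].bind_position);
            acc.x += w * p.x;  acc.y += w * p.y;  acc.z += w * p.z;
        }
        out[i] = acc;
    }
    return out;
}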