Hands support in Body Tracking
Please include at least the hands in the tracked body joints list (even Kinect SDK 1.x tracked hands), as well as the hand state (open or closed) for each hand. This would provide a good means of interacting with virtual objects.
This feature has been added to the body tracking SDK in v0.9.4.
You can find the download for this version here: https://docs.microsoft.com/en-us/azure/Kinect-dk/body-sdk-download
Akshay Sudheer commented
This topic cannot be marked as Complete. The hand state classifiers (open/close/pinch) have not yet been added in the current SDK.
Sorry, but the hand states are still missing in the Body Tracking SDK 0.9.4. Isn't it too early to close this feature request?
Toby Sharp (MSFT) commented
It would be good to separate this out into three different suggestions that people can vote on:
- A hand state classifier (e.g. open/closed/lasso)
- Improvements to tracking the "mitten" hand joints added in 0.9.4 (hand, finger tip, thumb tip)
- Fully articulated hand tracking
Rob Jellinghaus commented
Thank you for adding hand joint support in 0.9.4!
[Feature] Add hand joints support. The SDK will provide information for three additional joints for each hand: HAND, HANDTIP, THUMB.
[Feature] Add prediction confidence level for each detected joint.
Can you say whether you are now working on finger pose (e.g. the previous Kinect's "open" / "closed" / "pointing" finger recognition)?
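As a rough illustration of what the new mitten joints already enable: a coarse pointing direction can be derived from the wrist and hand-tip positions alone. This is plain Python with hypothetical coordinates, not SDK code; in the real SDK these positions would come from the skeleton's `K4ABT_JOINT_WRIST_*` and `K4ABT_JOINT_HANDTIP_*` entries.

```python
import math

def sub(a, b):
    """Component-wise difference of two 3-vectors."""
    return tuple(x - y for x, y in zip(a, b))

def norm(v):
    """Scale a 3-vector to unit length."""
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def hand_direction(wrist, handtip):
    """Unit vector from wrist to hand tip: a coarse pointing direction."""
    return norm(sub(handtip, wrist))

# Hypothetical joint positions in millimeters (depth-camera space).
print(hand_direction((0.0, 0.0, 800.0), (0.0, -150.0, 800.0)))  # → (0.0, -1.0, 0.0)
```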
Giacomo Inches commented
This is a feature that cannot be missing from the new Kinect, as it would empower many accessibility use cases that are currently too difficult to address without proper hand and finger tracking, e.g. sign language reproduction, virtual presence, ...
Looking forward to getting hands added to the mix. It would be a game changer! Also, I really need the open/closed hand state for all the interactions in my product.
Agree. Please add at least the hand position to the solve for better character posing. After that, basic hand poses (open/close/soft open/point). Finally, fingers. A 1-3-5 finger solve would be OK, but it would be great to get all fingers.
Muadzir Aziz commented
Yes, agreed. Hand tracking alone would suffice, but it would be even better if all fingers could be articulated.
Agree with Chris, I hope to see the same hand gesture functionality as in the Kinect 2 SDK.
Rob Jellinghaus commented
Even minimal hand pose tracking (open / pointing / closed), as in Kinect V2, can be used for very meaningful interactions. Limited hand pose tracking is still useful, and significantly easier to recognize than full articulation.
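For what it's worth, even without an SDK classifier, the mitten joints admit a crude open/closed guess: closing the fist pulls the hand tip toward the wrist. A minimal sketch under that assumption; the `open_length_mm` calibration value and the 0.6 ratio are made up for illustration, and a heuristic like this would be far less robust than a trained classifier:

```python
import math

def hand_state(wrist, handtip, open_length_mm, closed_ratio=0.6):
    """Guess open vs. closed from the wrist-to-hand-tip distance,
    relative to a calibrated open-hand length (both hypothetical)."""
    d = math.dist(wrist, handtip)
    return "closed" if d < closed_ratio * open_length_mm else "open"

# Hypothetical joint positions in millimeters.
print(hand_state((0.0, 0.0, 800.0), (0.0, -150.0, 800.0), open_length_mm=180.0))  # → open
print(hand_state((0.0, 0.0, 800.0), (0.0, -90.0, 800.0), open_length_mm=180.0))   # → closed
```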
David Kelly commented
I may be wrong, but I believe the basic hand tracking (with thumbs) helped enable the forearm twist orientations supported in Kinect V2. I haven't seen the same desirable twisting in the K4A joint orientations, and it would be great to get it back. Finger tracking etc. is likely out of scope, but having an avatar display with palms / forearms correctly aligned really builds user confidence in the tracking.
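To illustrate the forearm-twist point: given a thumb joint, a roll angle can be estimated by projecting the thumb direction onto the plane perpendicular to the forearm axis and comparing it with a reference direction. A sketch with made-up coordinates, not SDK code:

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def norm(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def forearm_twist_deg(elbow, wrist, thumb, reference):
    """Angle of the thumb around the forearm axis, measured against a
    reference direction; both are projected into the plane
    perpendicular to the forearm before comparison."""
    axis = norm(sub(wrist, elbow))

    def project(v):
        # Remove the component of v along the forearm axis, then normalize.
        d = sum(a * b for a, b in zip(v, axis))
        return norm(tuple(a - d * b for a, b in zip(v, axis)))

    t = project(sub(thumb, wrist))
    r = project(reference)
    cos_a = max(-1.0, min(1.0, sum(a * b for a, b in zip(t, r))))
    return math.degrees(math.acos(cos_a))

# Thumb sticking out sideways relative to an "up" reference: ≈ 90 degrees.
print(forearm_twist_deg((0, 0, 0), (0, 0, 100), (1, 0, 100), (0, 1, 0)))
```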
Jasper Brekelmans commented
Right now tracking stops at the wrist; for many scenarios it would be highly beneficial to get an estimate of the hand rotation.
Tracking the finger base/tip would allow basic hand rotations to be calculated, just as in older Kinect SDKs (and competitors).
A thumb estimate (as in Kinect v2) would allow forearm roll/twist estimation as well, although that may be quite noisy and may need more smoothing than the rest of the arm joints.
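A sketch of the kind of calculation being suggested: from wrist, hand-tip, and thumb positions, an orthonormal hand basis (one possible rotation convention) can be built with cross products. Plain Python with hypothetical coordinates, not SDK code:

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def norm(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def hand_basis(wrist, handtip, thumb):
    """Orthonormal basis for the hand: x along the fingers, z normal to
    the palm plane (sign is a convention), y completing the frame."""
    x = norm(sub(handtip, wrist))
    t = sub(thumb, wrist)   # thumb direction, roughly in the palm plane
    z = norm(cross(x, t))   # palm normal
    y = cross(z, x)         # already unit length: z and x are unit and orthogonal
    return x, y, z

# Hypothetical coordinates: fingers along +Y, thumb toward +X.
print(hand_basis((0, 0, 0), (0, 2, 0), (1, 1, 0)))
```

Note the basis is undefined when the thumb is collinear with the finger direction, and, as noted above, a noisy thumb estimate may need extra smoothing.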