Classifier for hand states
Please provide a classifier for hand states (at least 'open' and 'closed') for each hand; other hand and finger poses could be added as well. A hand state classifier would provide a good means of interacting with virtual objects and UI elements.
This suggestion refers to the unfinished part of this (now closed) feature request: https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/38154784-hands-support-in-body-tracking
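As a rough illustration of what such a classifier could look like, here is a minimal C++ sketch of a heuristic that labels a hand as open or closed from three per-hand joint positions (wrist, palm, hand tip), such as those exposed by the Body Tracking SDK's joint set (K4ABT_JOINT_WRIST_*, K4ABT_JOINT_HAND_*, K4ABT_JOINT_HANDTIP_*). The joint mapping and the ratio threshold below are assumptions for illustration only, not part of the SDK and not a proposed implementation.

```cpp
// Minimal sketch of a heuristic hand-state classifier. Assumes per-hand joint
// positions (wrist, palm, hand tip) in millimeters, e.g. mapped from the
// Body Tracking joint set (K4ABT_JOINT_WRIST_*, K4ABT_JOINT_HAND_*,
// K4ABT_JOINT_HANDTIP_*). The 1.6 threshold is illustrative, not calibrated.
#include <cmath>
#include <cstdio>

struct Point3 { float x, y, z; };            // joint position in millimeters

enum class HandState { Open, Closed, Unknown };

static float Distance(const Point3& a, const Point3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// When the fingers curl into a fist, the hand tip moves back toward the wrist,
// so the tip-to-wrist distance shrinks relative to the palm-to-wrist distance.
HandState ClassifyHand(const Point3& wrist, const Point3& palm, const Point3& tip) {
    const float palmLen = Distance(wrist, palm);
    if (palmLen < 1.0f) return HandState::Unknown;   // degenerate / low-confidence joints
    const float ratio = Distance(wrist, tip) / palmLen;
    return ratio > 1.6f ? HandState::Open : HandState::Closed;  // illustrative threshold
}

int main() {
    // Example: an extended hand (tip well beyond the palm) vs. a curled fist.
    Point3 wrist{0, 0, 800}, palm{0, 80, 800};
    Point3 openTip{0, 180, 800}, closedTip{0, 95, 790};
    std::printf("open pose -> %s\n",
                ClassifyHand(wrist, palm, openTip) == HandState::Open ? "open" : "closed");
    std::printf("fist pose -> %s\n",
                ClassifyHand(wrist, palm, closedTip) == HandState::Open ? "open" : "closed");
}
```

In practice a learned classifier over all hand joints would be far more robust than a single distance ratio, which is presumably why this is requested as an SDK feature rather than left to applications.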

Thank you for the interest! If more customers vote for this issue, we will consider adding support.
5 comments
-
Huntley GAO commented
This would be quite a useful feature for supporting many other gestures. I'm wondering whether it's already in your backlog; is there a timeline I can expect? Many thanks.
-
Rocky commented
I am looking forward to this feature in Azure Kinect.
-
MaxPower commented
It would be great to leverage the HoloLens 2 hand tracking API in the Azure Kinect SDK to identify individual/mixed finger combinations and add hand gesture recognition (pinch, spread, swipe, tap, double tap, push, circle, wave, thumbs up, okay, and more).
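On the gesture side, a pinch can be approximated from just two joints. The following is a hypothetical C++ sketch, not an SDK API: it assumes thumb-tip and hand-tip positions in millimeters (e.g. K4ABT_JOINT_THUMB_* and K4ABT_JOINT_HANDTIP_* from the Body Tracking joint set) and an illustrative 30 mm threshold.

```cpp
// Hypothetical pinch detector: reports a pinch when thumb tip and hand tip
// come close together. Joint mapping and the 30 mm threshold are assumptions
// for illustration, not part of the current Body Tracking SDK.
#include <cmath>

bool IsPinching(float tx, float ty, float tz,    // thumb tip (mm)
                float hx, float hy, float hz) {  // hand/finger tip (mm)
    const float dx = tx - hx, dy = ty - hy, dz = tz - hz;
    return std::sqrt(dx * dx + dy * dy + dz * dz) < 30.0f;  // fingertips nearly touching
}
```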
-
Blair commented
This would be a great feature to have!
-
KKlouzal commented
The ability to optionally enable or disable such a feature in the SDK, so that unnecessary processing can be avoided, would be appropriate.