Fully articulated hand tracking and classifiers for hand states.
Please provide a classifier for hand states (at least 'open' and 'closed') for each hand. Other hand and finger poses could of course be added as well. A hand state classifier would provide a good means of interacting with virtual objects and UI elements.
This suggestion refers to the unfinished part of this (now closed) feature request: https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/38154784-hands-support-in-body-tracking
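In the meantime, a rough open/closed estimate can be derived from the joints the Body Tracking SDK already reports (wrist, hand, hand tip). The sketch below is a minimal heuristic, not a shipped classifier; the thresholds and the scale normalization are assumptions that would need tuning per setup.

```cpp
// Sketch: coarse open/closed heuristic from wrist, hand, and hand-tip joint
// positions (e.g. copied out of the body tracking skeleton, in millimetres).
// The thresholds below are assumptions, not values from any shipped classifier.
#include <cmath>

struct Float3 { float x, y, z; };

enum class HandState { Unknown, Open, Closed };

static float Distance(const Float3& a, const Float3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// When the hand closes, the hand-tip joint collapses towards the wrist, so the
// tip-to-wrist distance shrinks relative to the wrist-to-hand length.
HandState ClassifyHand(const Float3& wrist, const Float3& hand,
                       const Float3& handTip) {
    const float palmLength = Distance(wrist, hand);      // scale reference
    if (palmLength < 1.0f) return HandState::Unknown;    // degenerate tracking
    const float extension = Distance(wrist, handTip) / palmLength;
    if (extension > 2.2f) return HandState::Open;        // assumed threshold
    if (extension < 1.6f) return HandState::Closed;      // assumed threshold
    return HandState::Unknown;                           // ambiguous band
}
```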
Thank you for the interest! If more customers vote for this issue, we will consider adding support.
Neo Chen commented
We are interested in finding a way for people to control UI elements on a web page.
The movement needs to work like using a hand as a mouse on screen, so a person can use their hand to select, click, and slide on the screen.
Our main goal is to replace the touch screen with remote hand control that behaves like a touch screen.
Our web application looks like this:
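As a workaround for this kind of hand-as-mouse control, a tracked hand joint can be mapped into screen coordinates through a calibrated interaction box, with a "click" triggered on an open-to-closed hand-state transition. The sketch below is illustrative only; the box extents, screen resolution, and the HandToScreen name are assumptions.

```cpp
// Sketch: map a tracked hand joint (camera space, mm) to 2D screen coordinates
// for cursor-style interaction. The interaction box and screen size below are
// assumed values that would be calibrated per installation.
#include <algorithm>

struct Float3 { float x, y, z; };
struct Cursor { int x, y; };

Cursor HandToScreen(const Float3& hand) {
    const float left = -400.0f, right = 400.0f;   // assumed x extent of the box
    const float top = -300.0f, bottom = 300.0f;   // assumed y extent of the box
    const int screenW = 1920, screenH = 1080;     // assumed screen resolution

    const float u = std::clamp((hand.x - left) / (right - left), 0.0f, 1.0f);
    const float v = std::clamp((hand.y - top) / (bottom - top), 0.0f, 1.0f);
    return Cursor{ static_cast<int>(u * (screenW - 1)),
                   static_cast<int>(v * (screenH - 1)) };
}
```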
David Morasz commented
Look, Microsoft already had something like this for Kinect v2, which they then never released... https://www.microsoft.com/en-us/research/project/fully-articulated-hand-tracking/
but it looks like nowadays it is state mandated that every software solution should be a DNN, and anything not involving one is thrown into a deep dark pit and should never be brought up again... </opinionated-section>
Akshay Sudheer commented
It would be very useful if classifiers were added for hand states. Looking forward to seeing them in a future update.
This would be extremely useful.
Please comment here as well, if you like: https://github.com/microsoft/Azure-Kinect-Sensor-SDK/issues/1053
Jeremiah Baxter commented
Just open and closed hand states, with a value that indicates how confident you are in the state, would be incredible for user interaction.
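A minimal sketch of what such a state-plus-confidence result could look like, assuming a per-frame "openness" score (for example from a heuristic like the one above) smoothed over time. The type names, thresholds, and smoothing factor are all assumptions, not an existing SDK API.

```cpp
// Sketch: per-hand state plus confidence, produced by exponentially smoothing a
// per-frame openness score in [0,1] (1 = fully open, 0 = fully closed).
enum class HandState { Unknown, Open, Closed };

struct HandStateResult {
    HandState state;
    float confidence;   // 0..1, higher = more certain
};

class HandStateFilter {
public:
    HandStateResult Update(float rawOpenScore) {
        smoothed_ = alpha_ * rawOpenScore + (1.0f - alpha_) * smoothed_;
        if (smoothed_ > 0.65f) return { HandState::Open, smoothed_ };
        if (smoothed_ < 0.35f) return { HandState::Closed, 1.0f - smoothed_ };
        return { HandState::Unknown, 0.5f };  // undecided band
    }
private:
    float alpha_ = 0.3f;     // smoothing factor (assumed)
    float smoothed_ = 0.5f;  // start undecided
};
```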
Marco Giovanelli commented
Please provide fully articulated hand tracking.
This would be helpful for accessibility use cases (e.g. sign language, virtual presence, ...).
This suggestion is related to the closed feature request: https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/38154784-hands-support-in-body-tracking
Full hand gestures would be very handy. Hand tracking/recognition without the rest of the body being in frame would be great as well.
Rob Jellinghaus commented
...Up to 65 votes and it's now the highest "need feedback" issue. Seriously, how many more votes do you need? There were ~30 back in November and now there are double that. Do you need 100+?
Rob Jellinghaus commented
...How *many* more votes? :-) Up to 52 votes and now the 5th highest ranked issue of all.
Huntley GAO commented
This is quite a useful feature for supporting many other gestures. I'm wondering whether it's already in your backlog; is there any timeline I can expect? Many thanks.
I am looking forward to this feature in Azure Kinect.
It would be great to leverage the HoloLens 2 hand tracking API in the Azure Kinect SDK to identify individual/mixed finger combinations and add hand gesture recognition (pinch, spread, swipe, tap, double tap, push, circle, wave, thumbs up, okay, and more).
This would be a great feature to have!
The ability to optionally enable or disable such a feature in the SDK, to avoid unnecessary processing, would be appropriate.
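For reference, even without per-finger joints, a very coarse pinch heuristic is possible from the thumb and hand-tip joints that body tracking reports today. The sketch below is an illustration only (the threshold and function name are assumptions) and is far weaker than HoloLens 2 style hand tracking; a real gesture API would presumably be opt-in, as suggested above.

```cpp
// Sketch: coarse pinch heuristic from thumb and hand-tip joint positions (mm).
// The hand tip only approximates the fingertips, and the distance threshold is
// an assumed value, so this is a rough stand-in for true per-finger tracking.
#include <cmath>

struct Float3 { float x, y, z; };

bool IsPinching(const Float3& thumbTip, const Float3& handTip,
                float thresholdMm = 40.0f /* assumed */) {
    const float dx = thumbTip.x - handTip.x;
    const float dy = thumbTip.y - handTip.y;
    const float dz = thumbTip.z - handTip.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) < thresholdMm;
}
```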