Azure Kinect DK
Azure Kinect DK is a developer kit and PC peripheral with advanced artificial intelligence (AI) sensors for sophisticated computer vision and speech models. It combines a best-in-class depth sensor and a spatial microphone array with a video camera and orientation sensor—all in one small device with multiple modes, options, and SDKs.
-
Please allow for macOS support (should be a minimal port from Linux).
I have been trying the Linux install instructions on macOS.
It seems that most steps in CMake work, but the DepthEngine is Windows/Linux only.
Would be amazing if you could release the DepthEngine for macOS.
The rest should be achievable by the community, via pull requests, etc.
A huge number of our community (openFrameworks) would make good use of macOS support.
Thank you!
Theo
253 votes
-
ARM Support for Linux
Support the Azure Kinect Sensor SDK for Linux on ARM on NVIDIA Jetson Nano or similar embedded dev platforms.
164 votes
Hello Kinect users! We released the alpha version of release 1.4, which includes ARM support. This release covers only the Sensor SDK on ARM boards, but the Body Tracking SDK is coming soon. You can see more details at https://github.com/microsoft/Azure-Kinect-Sensor-SDK/issues/1093
-
Python support
Let developers use Python on Windows and Linux to build machine learning applications based on Azure Kinect more easily.
118 votes -
UWP Support
Enable UWP applications to get access to the Azure Kinect Sensor data.
68 votes -
Please provide a Visual Inertial Odometry (VIO) method to get position and orientation of the camera
For vision applications (robotics, mobile tracking), it is very important to provide a VIO (visual inertial odometry) method to get the position and orientation of the camera.
All the hardware components are already embedded in the Azure Kinect (IMU, depth camera, color camera); it would be a shame not to use them.
The competitors already provide such a feature in the SDKs of their depth cameras (Intel RealSense, ZED stereo camera), and so do many AR mobile development kits (ARKit, ARCore, ...).
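As a sketch of what such a method would consume, here is a minimal loop, using only public sensor SDK calls, that pulls the time-stamped depth and IMU streams a VIO pipeline needs. The vio_update_* hooks are hypothetical stand-ins for the fusion step being requested:

```cpp
#include <k4a/k4a.h>

int main()
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device)))
        return 1;

    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    k4a_device_start_cameras(device, &config);
    k4a_device_start_imu(device); // the IMU can only run while the cameras do

    for (int i = 0; i < 300; i++)
    {
        // IMU samples arrive far faster (~1.6 kHz) than frames, so drain
        // the queue non-blockingly between captures.
        k4a_imu_sample_t imu;
        while (k4a_device_get_imu_sample(device, &imu, 0) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            // vio_update_imu(imu.gyro_sample, imu.acc_sample,
            //                imu.gyro_timestamp_usec); // hypothetical
        }

        k4a_capture_t capture = NULL;
        if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            k4a_image_t depth = k4a_capture_get_depth_image(capture);
            if (depth != NULL)
            {
                // vio_update_frame(depth,
                //     k4a_image_get_device_timestamp_usec(depth)); // hypothetical
                k4a_image_release(depth);
            }
            k4a_capture_release(capture);
        }
    }

    k4a_device_stop_imu(device);
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```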
Do you plan to develop such a feature?
49 votes
-
Azure Kinect DK Unreal 4 Support
Support for Unreal 4 would be greatly appreciated, even if only as a cursory pass. Unity is easy and all, but it shouldn't be the only engine to get your attention.
I have gotten body tracking to work in UE4 myself, but I'm like an amateur trying to scuba for the first time in an underwater cave. The conversions needed to drive a virtual body turn out to be beyond me. On the flip side, as long as I'm not asking for direct rotations, my little plugin works fairly well.
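For anyone attempting the same, here is a minimal sketch (against the Body Tracking SDK, not Unreal itself) of reading the per-joint orientation quaternions that a UE4 plugin would feed into bone transforms. Remapping these global-frame quaternions into each bone's local space is exactly the conversion work described above:

```cpp
#include <k4abt.h>
#include <cstdio>

// Assumes `device` is already opened and cameras started with `config`,
// and `capture` holds a depth capture (see the sensor SDK samples).
void print_joint_rotations(k4a_device_t device,
                           k4a_device_configuration_t config,
                           k4a_capture_t capture)
{
    k4a_calibration_t calibration;
    k4a_device_get_calibration(device, config.depth_mode,
                               config.color_resolution, &calibration);

    k4abt_tracker_t tracker = NULL;
    k4abt_tracker_create(&calibration, K4ABT_TRACKER_CONFIG_DEFAULT, &tracker);

    k4abt_tracker_enqueue_capture(tracker, capture, K4A_WAIT_INFINITE);
    k4abt_frame_t frame = NULL;
    if (k4abt_tracker_pop_result(tracker, &frame, K4A_WAIT_INFINITE) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        for (uint32_t i = 0; i < k4abt_frame_get_num_bodies(frame); i++)
        {
            k4abt_skeleton_t skeleton;
            k4abt_frame_get_body_skeleton(frame, i, &skeleton);
            // Each joint carries an absolute orientation quaternion; a UE4
            // plugin would convert these into bone-space rotations.
            k4a_quaternion_t q = skeleton.joints[K4ABT_JOINT_ELBOW_LEFT].orientation;
            printf("elbow_left quat: w=%f x=%f y=%f z=%f\n",
                   q.wxyz.w, q.wxyz.x, q.wxyz.y, q.wxyz.z);
        }
        k4abt_frame_release(frame);
    }
    k4abt_tracker_destroy(tracker);
}
```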
31 votes
We are looking for more community feedback to prioritize this feature request.
We also encourage you to take advantage of our open source sensor SDK to implement new features. -
No microphone access available in API recorder (k4arecorder).
The API recorder cannot record video and audio simultaneously from this camera. Whether another program can separately access the microphones through DirectShow or WASAPI is irrelevant: if the audio is not natively synced with the video through the API recorder, it doesn't help, especially with multiple cameras, where the desynchronization only compounds. Worse, the relevant code for synchronous microphone access exists across platforms in the viewer, but is not mirrored in the recorder. What's the point if we can't even record audio and video together?
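To illustrate the complaint: with the current SDK, audio has to come from a second, unrelated API, and alignment can only ever be approximate. A minimal sketch, where audio_read_block() is a hypothetical stand-in for a WASAPI/ALSA capture path:

```cpp
#include <k4a/k4a.h>
#include <cstdint>
#include <cstddef>

// Hypothetical external audio backend -- exactly the separate,
// unsynchronized path this request wants to avoid.
struct AudioBlock { const int16_t *samples; size_t count; uint64_t host_ts_usec; };
AudioBlock audio_read_block() { return {nullptr, 0, 0}; } // stub for illustration

void record_pair(k4a_device_t device)
{
    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4a_image_t color = k4a_capture_get_color_image(capture);
        if (color != NULL)
        {
            // Video frames carry device timestamps...
            uint64_t video_ts = k4a_image_get_device_timestamp_usec(color);
            // ...but audio from a separate API only carries host time, so
            // the two clocks drift and every alignment is an approximation.
            AudioBlock block = audio_read_block();
            // write_mkv_tracks(video_ts, block); // hypothetical muxing step
            (void)video_ts;
            k4a_image_release(color);
        }
        k4a_capture_release(capture);
    }
}
```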
29 votes
We are looking for more community feedback to prioritize this feature request.
We also encourage you to take advantage of our open source sensor SDK to implement new features.
-
IMU example or API
Please provide an SDK API, or at least an example, showing how to get the device rotation and change in position from the raw IMU data coming from the device.
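In the meantime, a minimal sketch of where the raw data comes from and what a naive orientation estimate looks like. Note this only integrates the gyroscope, so it drifts; a usable estimate needs fusion with the accelerometer (complementary or Kalman filter):

```cpp
#include <k4a/k4a.h>
#include <cstdio>

int main()
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device))) return 1;

    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED; // IMU requires running cameras
    k4a_device_start_cameras(device, &config);
    k4a_device_start_imu(device);

    double roll = 0, pitch = 0, yaw = 0; // radians
    uint64_t last_ts = 0;
    for (int i = 0; i < 5000; i++)
    {
        k4a_imu_sample_t s;
        if (k4a_device_get_imu_sample(device, &s, 100) != K4A_WAIT_RESULT_SUCCEEDED)
            continue;
        if (last_ts != 0)
        {
            double dt = (s.gyro_timestamp_usec - last_ts) * 1e-6; // seconds
            roll  += s.gyro_sample.xyz.x * dt; // gyro is in rad/s
            pitch += s.gyro_sample.xyz.y * dt;
            yaw   += s.gyro_sample.xyz.z * dt;
        }
        last_ts = s.gyro_timestamp_usec;
    }
    printf("roll=%f pitch=%f yaw=%f\n", roll, pitch, yaw);

    k4a_device_stop_imu(device);
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```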
22 votes
Thank you for your question! It would be great if you could share how the IMU data will be used.
Thanks! -
Support Azure Spatial Anchors
Given how these two features were marketed together, and conceptually work together in an obvious way, it's very surprising that the Azure Kinect DK doesn't work with Azure Spatial Anchors.
We were counting on this for our product when the technologies were first announced. Disappointing to see spatial anchors only work on phones.
22 votes
We are looking for more community feedback to prioritize this feature request.
-
Adjustable MJPG compression
MJPG is the fastest format for our work; the problem is the file size. As stated in the SDK (k4atypes.h):
"Each MJPG encoded image in a stream may be of differing size depending on the compression efficiency."
For now, though, we can't adjust the compression factor; it would be very helpful to add a simple method to make the MJPG compression adjustable.
16 votes
We are looking for more community feedback to prioritize this feature request.
We also encourage you to take advantage of our open source sensor SDK to implement new features. -
Azure Kinect Android Support
There are a number of fitness products that could leverage body tracking, but they run on Android-based devices today and would have to re-platform to leverage Kinect.
13 votes
Thank you for the interest! If more customers vote for this issue, we will consider adding support.
-
Kinect data on GPU with `gpu::GpuMat`, `ogl::Buffer`, `ogl::Texture2D`, or OpenGL handles
Add access to Kinect data via cv::gpu::GpuMat, ogl::Buffer, cv::ogl::Texture2D, and/or OpenGL handles to keep data on the GPU and greatly reduce GPU<->CPU memory churn for image processing, computation, etc. As far as I understand the codebase, almost all work in this SDK is done on the CPU (the depth engine is not part of this SDK codebase).
For the same reasons the depth engine is written for the GPU, our apps need to use the GPU. Developers on everything from tiny industrial platforms up to workstations need the Kinect data to stay on the GPU to scale across the variety of computing platforms. Forcing download…
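As a concrete example of the churn (a sketch against OpenCV's CUDA module; modern OpenCV names it cv::cuda::GpuMat rather than cv::gpu::GpuMat):

```cpp
#include <k4a/k4a.h>
#include <opencv2/core.hpp>
#include <opencv2/core/cuda.hpp>

// The round-trip this request wants to eliminate: the SDK hands back a
// CPU buffer, which must then be copied up to the GPU every frame.
cv::cuda::GpuMat upload_depth(k4a_capture_t capture)
{
    k4a_image_t depth = k4a_capture_get_depth_image(capture);
    cv::Mat cpu(k4a_image_get_height_pixels(depth),
                k4a_image_get_width_pixels(depth),
                CV_16UC1,
                k4a_image_get_buffer(depth)); // wraps the CPU buffer, no copy yet

    cv::cuda::GpuMat gpu;
    gpu.upload(cpu); // explicit CPU -> GPU copy, every frame
    k4a_image_release(depth);
    return gpu;
}
```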
11 votes -
Websocket sample code
Provide a code sample that performs websocket streaming of the video stream. A similar sample shipped with the very old Kinect SDK and was a good kickstart for mixing Kinect and the web.
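A minimal sketch of the device side, assuming the camera was started with K4A_IMAGE_FORMAT_COLOR_MJPG (each MJPG image is a self-contained JPEG, so it can be pushed to browsers without re-encoding). websocket_broadcast() is a hypothetical hook for whatever websocket library the sample would use:

```cpp
#include <k4a/k4a.h>
#include <cstdint>
#include <cstddef>

// Hypothetical server hook -- any websocket library could sit behind it.
void websocket_broadcast(const uint8_t *jpeg, size_t size)
{
    (void)jpeg; (void)size; // hand off to the websocket library here
}

void stream_one_frame(k4a_device_t device)
{
    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4a_image_t color = k4a_capture_get_color_image(capture);
        if (color != NULL)
        {
            // An MJPG image buffer is already a complete JPEG frame.
            websocket_broadcast(k4a_image_get_buffer(color),
                                k4a_image_get_size(color));
            k4a_image_release(color);
        }
        k4a_capture_release(capture);
    }
}
```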
10 votes
We are looking for more community feedback to prioritize this feature request.
We also encourage you to take advantage of our open source sensor SDK to implement new features. -
HoloLens Unity3D support
Stream the live colored mesh created by the Azure Kinect device to a Unity3D HoloLens application.
10 votes
We are looking for more community feedback to prioritize this feature request.
We also encourage you to take advantage of our open source sensor SDK to implement new features. -
Replay recordings directly to sensor
Currently the playback API supports direct access to captures. For integration-test purposes, it would be beneficial if recordings could be replayed "directly" to the sensor, i.e., as if the recording were coming straight from the sensor, much like KStudioClient and KStudioPlayback for the Kinect v2.
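For contrast, here is what the playback API offers today (a minimal sketch): captures come from a k4a_playback_t, a different handle type than k4a_device_t, which is why code written against the live device needs a second code path:

```cpp
#include <k4arecord/playback.h>
#include <k4a/k4a.h>

void replay(const char *path)
{
    k4a_playback_t playback = NULL;
    if (K4A_FAILED(k4a_playback_open(path, &playback)))
        return;

    k4a_capture_t capture = NULL;
    while (k4a_playback_get_next_capture(playback, &capture) == K4A_STREAM_RESULT_SUCCEEDED)
    {
        // process_capture(capture); // hypothetical: ideally the same code
        //                           // path as the live-sensor loop
        k4a_capture_release(capture);
    }
    k4a_playback_close(playback);
}
```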
7 votes
Thank you for the interest! If more customers vote for this issue, we will consider adding support.
-
Add Audio Direction of Arrival / Sound Source Localization to the Sensor SDK
Similar to what was already suggested here, but prematurely declined: https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/38328904-add-sound-source-localization-tracking-separatio
The Microsoft Speech Devices SDK does not provide the low-level access to audio direction of arrival (DoA) required for sound source localization. It would be great if the Azure Kinect Sensor SDK could provide this beam information, as the old Kinect SDKs (v1 and v2) did. It should be surfaced as a continuous stream of the direction of audio arrival in the horizontal plane (0-360 degrees).
This information would open up a ton of useful applications for the microphone array in conjunction with body tracking, e.g., for interactive…
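To make the request concrete, a purely hypothetical API shape, modeled on how the IMU stream is polled; nothing like this exists in the sensor SDK today:

```cpp
#include <cstdint>

// Hypothetical: a sound-source-localization sample, polled the same way
// k4a_device_get_imu_sample() is. None of this exists in the current SDK.
typedef struct
{
    float angle_deg;         // direction of arrival in the horizontal plane, 0-360
    float confidence;        // 0.0 - 1.0
    uint64_t timestamp_usec; // device timestamp, comparable to image/IMU stamps
} k4a_audio_doa_sample_t;

// k4a_wait_result_t k4a_device_get_doa_sample(k4a_device_t device,
//                                             k4a_audio_doa_sample_t *sample,
//                                             int32_t timeout_in_ms); // hypothetical
```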
5 votes -
Please: implement a scanning app like the one for the Kinect v2, as well as a way to extract 8-bit grayscale maps from the 16-bit PNG files
I've posted this on the Developer Forum and was advised to post here:
I wish there were a high-bandwidth pipe between Customer Voice and the development teams at Azure Kinect.
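For the second half of the request, a minimal sketch (using OpenCV) that loads a 16-bit depth PNG and rescales it into an 8-bit grayscale map:

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>

int main(int argc, char **argv)
{
    if (argc < 3) return 1;
    // k4arecorder depth PNGs are 16-bit single-channel (millimeters).
    cv::Mat depth16 = cv::imread(argv[1], cv::IMREAD_UNCHANGED); // CV_16UC1
    if (depth16.empty()) return 1;

    cv::Mat gray8;
    // Stretch the recorded range to 0-255; alternatively use a fixed scale,
    // e.g. depth16.convertTo(gray8, CV_8UC1, 255.0 / 4000.0) for 0-4 m.
    cv::normalize(depth16, gray8, 0, 255, cv::NORM_MINMAX, CV_8UC1);
    cv::imwrite(argv[2], gray8);
    return 0;
}
```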
More on the (logical) request here:
https://github.com/microsoft/Azure-Kinect-Sensor-SDK/issues/782#issue-496770279
5 votes
Thank you for the interest! If more customers vote for this issue, we will consider adding support.
-
Please allow access to phase/amplitude and raw data for the Azure Kinect depth image
I can't find a way to access the raw data of the depth image (amplitude/phase) for the Azure Kinect. I know Microsoft provides a depth engine with the Azure Kinect Sensor SDK; however, for my research I need access to the raw data of the depth map. Please provide a way to access the raw data.
3 votes -
Debian Stretch and Buster
Can I please request support on Debian Stretch and Debian Buster?
3 votes
Thank you for the interest! If more customers vote for this issue, we will consider adding support.
-
Spatial Capture
We have customers who would like us to be able to capture very high-fidelity 3D scans of architecture / real estate.
This would be used to preview / sell the spaces to customers who are unable to physically visit the locations.
3 votes
This should be possible with the Azure Kinect DK.
Please let us know if there are specific features of the SDKs or HW which are needed for you to build the solution.