Stream Analytics
Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.
-
Take input from CosmosDB
Currently, Stream Analytics accepts input data from Blob storage. Support for reading data from DocumentDB should be added.
166 votes -
Time series database sink
It would be great to be able to stream the data into a time series database. Currently Azure does not have support for time series, but it would be nice to see support for KairosDB or OpenTSDB.
48 votes · under review · Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
We see more and more demand for output to time series databases so reopening this item.
Let us know which database you are using. -
Allow REST Endpoints as reference data input
Add the ability to define REST endpoints as reference data inputs, including refresh-rate settings similar to those for SQL Database reference data.
41 votes -
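Until this lands, the usual workaround is to poll the REST endpoint yourself on a schedule and land the result in Blob storage on the `{date}/{time}` path pattern that ASA reference data inputs already poll. A minimal sketch, assuming the common `yyyy-MM-dd` / `HH-mm` token formats; the URL, pattern, and `upload_to_blob` helper are placeholders, not part of any ASA API:

```python
from datetime import datetime, timezone

def reference_blob_path(now: datetime,
                        pattern: str = "ref/{date}/{time}/lookup.json") -> str:
    """Render the blob path ASA would poll for a given refresh tick,
    using yyyy-MM-dd for {date} and HH-mm for {time}."""
    return pattern.format(
        date=now.strftime("%Y-%m-%d"),
        time=now.strftime("%H-%M"),
    )

# At each refresh interval you would do roughly:
#   body = urllib.request.urlopen("https://example.test/lookup").read()
#   upload_to_blob(container,
#                  reference_blob_path(datetime.now(timezone.utc)), body)
# (upload_to_blob stands in for your storage SDK of choice.)

print(reference_blob_path(datetime(2020, 1, 2, 13, 5, tzinfo=timezone.utc)))
# → ref/2020-01-02/13-05/lookup.json
```

A timer-triggered Azure Function is a natural home for this loop, which gives you the refresh-rate setting the request asks for.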
Output to ADLS gen 2
Support output to ADLS Gen 2 in addition to the existing output options.
22 votes -
Support SQL DB stored procedures as ASA Output
Allow stored procedures to be called natively from Stream Analytics. This way, we don't need to build custom solutions to trigger stored procedures when ASA generates output.
16 votes -
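The "custom solution" the request wants to avoid typically looks like a small consumer (for example, an Azure Function fed by ASA output) that calls the stored procedure itself. A minimal sketch of that glue code; the procedure name `dbo.UpsertReading` and the columns are illustrative only:

```python
def build_exec_statement(proc: str, params: dict) -> tuple[str, list]:
    """Build a parameterised T-SQL EXEC call from an ASA output row."""
    placeholders = ", ".join(f"@{name} = ?" for name in params)
    return f"EXEC {proc} {placeholders}", list(params.values())

row = {"DeviceId": "dev-42", "AvgTemp": 21.7}  # one row of ASA output
sql, args = build_exec_statement("dbo.UpsertReading", row)

# With pyodbc you would then run:
#   cursor.execute(sql, args)
#   connection.commit()

print(sql)   # EXEC dbo.UpsertReading @DeviceId = ?, @AvgTemp = ?
print(args)  # ['dev-42', 21.7]
```

Native support would remove this extra hop and its deployment cost, which is exactly the point of the request.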
Invoke Functions in Batch & Parallel
The current Functions output invokes the function one event at a time and guarantees ordering. In our scenario, ordering is not important, and we would like the output to support:
- Batching, invoking the function once per batch (an array as input)
- Invoking functions in parallel for non-ordering scenarios (perhaps as a configuration option on the output)
12 votes -
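The behaviour being requested can be approximated downstream today. A minimal sketch, assuming ordering does not matter: take a whole batch (a list) and fan out invocations in parallel; `handle_event` stands in for the user's Azure Function body:

```python
from concurrent.futures import ThreadPoolExecutor

def handle_event(event: dict) -> str:
    """Placeholder for the per-event work the Azure Function would do."""
    return f"processed {event['id']}"

def invoke_batch_parallel(events: list, max_workers: int = 8) -> list:
    """Invoke the handler for each event concurrently. Events may finish
    in any order internally, but map() returns results in input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(handle_event, events))

results = invoke_batch_parallel([{"id": 1}, {"id": 2}, {"id": 3}])
print(results)  # ['processed 1', 'processed 2', 'processed 3']
```

Having ASA offer this as an output configuration, as the request suggests, would avoid each team rebuilding this fan-out layer.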
Output to PostgreSQL
Native output to Azure Database for PostgreSQL
6 votes -
Job Diagram
The Job Diagram holds real promise as an intuitive visual view of job execution and metrics, especially during development. In its current form, it's anything but that. When I select an output, it seems to default to showing out-of-order events only.
3 votes -
Output to Azure Postgres
Similar to how Azure SQL is supported, it would be great to have an output connector for Azure Postgres.
3 votes -
Input from Cosmos DB changefeed
Support direct input from Cosmos DB changefeed.
3 votes -
Foresee event grid output for Azure Stream Analytics
Azure Event Grid really helps in building distributed, event-driven applications on Azure. Having out-of-the-box support to publish selected events from Azure Stream Analytics to Azure Event Grid would be awesome!
To achieve this today, I have to deploy (and pay for) an Azure Function that takes the message and calls the actual Azure Event Grid API... It would be so much easier to have this out of the box, and it would benefit future out-of-the-box Event Grid subscribers.
3 votes -
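The workaround described above amounts to code that wraps each ASA output record in the Event Grid event schema and POSTs it to the topic endpoint. A minimal sketch of that shim; the event type, subject, endpoint, and key are placeholders, while the payload field names follow the Event Grid event schema:

```python
import json
import uuid
from datetime import datetime, timezone

def to_event_grid_event(record: dict, event_type: str, subject: str) -> dict:
    """Wrap one ASA output record in the Event Grid event schema."""
    return {
        "id": str(uuid.uuid4()),
        "eventType": event_type,
        "subject": subject,
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": record,
        "dataVersion": "1.0",
    }

event = to_event_grid_event({"deviceId": "dev-42", "alert": "overheat"},
                            "contoso.asa.alert", "asa/jobs/myjob")
body = json.dumps([event]).encode()  # Event Grid expects a JSON array

# The actual publish from the Azure Function would be roughly:
#   req = urllib.request.Request(
#       "https://<topic>.<region>-1.eventgrid.azure.net/api/events",
#       data=body, method="POST",
#       headers={"aeg-sas-key": "<key>",
#                "Content-Type": "application/json"})
#   urllib.request.urlopen(req)
```

Out-of-the-box support would fold this whole shim into the ASA output configuration.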
Add support for compressed output
I have historic data imported through Spark in Avro format to our Gen 1 Data Lake, and I am now using ASA to stream the live data.
Spark defaults to using Snappy to compress Avro files, whereas ASA doesn't compress at all. This has resulted in a 10-fold increase in storage size and a slowdown in access speed. Can you add support for compressing the outputs in ASA?
3 votes -
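The size gap described above is easy to reproduce. Snappy itself is not in the Python standard library, so in this quick illustration gzip stands in purely to show the effect of compressing repetitive telemetry-style records:

```python
import gzip
import json

# Generate repetitive telemetry-style records, similar in shape to
# what a streaming job writes out line by line.
records = [{"deviceId": f"dev-{i % 10}", "temp": 20.0 + (i % 5)}
           for i in range(5000)]
raw = "\n".join(json.dumps(r) for r in records).encode()
packed = gzip.compress(raw)

ratio = len(raw) / len(packed)
print(f"raw: {len(raw)} bytes, gzip: {len(packed)} bytes, ratio: {ratio:.1f}x")
```

Repetitive records like these compress many times over, which is why uncompressed ASA output next to Snappy-compressed Spark output produces the roughly 10-fold storage difference the post reports.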
Reference data
Allow TIMESTAMP BY OVER with reference data
3 votes -
Monitor overall SU usage over time, not just per individual SA job.
Spinning up a new Stream Analytics job can happen at any time, and sometimes you risk reaching your SU limit (200 by default). It happened to me... I am missing the ability to monitor how many SUs I am currently using. Something similar to the SU% utilization metric that is available per SA job, but at the subscription level.
1 vote -
Ability to see the events
In the output boxes, one should be able to extract the top N events to see what data was produced.
1 vote -
Support IoTHub.* in Azure Stream Analytics Tools for Visual Studio
When authoring a query in the portal, I use the IoTHub object to get at items like IoTHub.EnqueuedTime (discussed here: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-inputs#stream-data-from-iot-hub). These do not work in the ASA tools for Visual Studio.
When you stream data from an IoT Hub, you have access to the following metadata fields in your Stream Analytics query:
- EventProcessedUtcTime: The date and time that the event was processed.
- EventEnqueuedUtcTime: The date and time that the event was received by the IoT Hub.
- PartitionId: The zero-based partition ID for the input adapter.
- IoTHub.MessageId: An ID that's used to correlate two-way communication in…
1 vote -
How to check for data loss in Stream Analytics
◾️Configuration
In the pipeline Azure IoT Hub → Azure Stream Analytics → Azure Data Lake Storage, the Device-to-Cloud messages from Azure IoT Hub are accumulated in Table Storage and Data Lake Storage.
◾️Problem
When runtime errors occur in Stream Analytics in this configuration, I can see the details of the error in the activity log, but I don't know how to determine whether the error caused data loss.
◾️Request
Rather than just comparing the number of input and output records of Stream Analytics, I would like to compare the contents of the data output from the IoT Hub with the contents of the data…
1 vote -
Duplicate Outputs
If I want to insert data into a SQL table from different queries, the query editor doesn't allow the same output to be used in multiple queries, even though the destination is the same. This forces duplication of outputs.
1 vote -
SQL connection Failure
When output fails because SQL Server is unavailable, ASA has no reconnection logic to connect back once SQL Server is up again.
1 vote