It would be nice to use a table in an Azure SQL DB as reference data. (349 votes)
We are looking forward to releasing this feature soon. Please reply to this thread if you are interested in a private preview program.
At present, Stream Analytics throws a memory exception when there are not enough streaming units. It should automatically scale based on the streaming unit requirement. (194 votes)
Currently Stream Analytics accepts data from Blob storage. Support for reading data from DocumentDB should be added. (140 votes)
It would be great to have the possibility to create a dataset in Power BI Desktop and use it as an output in ASA.
Currently there are PowerShell and REST APIs for creating outputs, but I don't believe they support creating a Power BI output; you can only do that via the management portal. I need to be able to do this programmatically, including the authentication with Power BI during setup. (95 votes)
We are working on this now and hope to wrap it up by mid-2018.
For example, when we want to duplicate data from one event hub to another (say, from a production to a development event hub), the only option is to copy the event data payload; there is no option to copy the event data properties as well.
See http://stackoverflow.com/questions/35590255/stream-analytics-and-azure-eventhub for a similar issue others have had. (72 votes)
We are starting work on writing custom message metadata.
For reading metadata from the incoming message, this is already possible using the GetMetadataPropertyValue function. More information is available here: https://msdn.microsoft.com/azure/stream-analytics/reference/getmetadatapropertyvalue
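As a minimal sketch of reading metadata with that function (the input alias `ehInput`, the output alias `output`, the user property `SensorType`, and the payload column `Temperature` are all placeholder names, not part of the original request):

```sql
-- Sketch: surface Event Hub metadata alongside payload columns.
-- 'EventId' is a system property; '[User].[SensorType]' is a
-- hypothetical application-set property on the incoming message.
SELECT
    GetMetadataPropertyValue(ehInput, 'EventId') AS EventId,
    GetMetadataPropertyValue(ehInput, '[User].[SensorType]') AS SensorType,
    Temperature
INTO [output]
FROM ehInput
```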
Currently we are not able to create Azure Data Lake (or Power BI) outputs via ARM templates, because ADL and Power BI currently require interactive authentication.
For our continuous delivery pipeline we would like you to enable this. (69 votes)
Last week at Ignite we announced the preview of Managed identities for Azure resources (formerly MSI) based authentication for egress to Azure Data Lake Storage Gen 1. This will enable this scenario.
More info, and a link to join the preview, in this blog post: https://azure.microsoft.com/en-us/blog/eight-new-features-in-azure-stream-analytics/
Allow calling a REST endpoint from my query to invoke custom code. (66 votes)
I would like to push my data from on-premises SQL -> Stream Analytics + ML model -> output of the ML model back to an on-premises SQL table -> host Power BI on the corporate network. This way, any PII/HBI data can stay on-premises while other numeric data flows via the cloud and back. If the channel from the cloud to on-premises SQL could be secured and easily established, it would greatly help. (52 votes)
At this time, we don’t plan to implement this feature, notably due to latency requirements with our sources and sinks.
Please keep the feedback coming, we are listening!
Add HDInsight (HBase) as an output for a job. HBase provides random, real-time read/write access. (51 votes)
Stream Analytics is great, but it's a bit much to pay for a full unit-hour when you have very low processing requirements. I've got a couple of demo and lab environments I'm working with now and have to keep stopping the jobs, as each costs about $80/month to run.
The simplest fix would be a smaller unit size (a "shared" unit?) to make it easier to get started with projects. (46 votes)
We will consider this request.
Azure Operational Insights provides a very fast search and indexing mechanism for data. It's a perfect visualization experience on top of the data we are sending from EventHub -> Stream Analytics. (43 votes)
At this time, we are not planning to add a direct connection to Elastic Search. However, as a workaround, you can output data to blob storage or SQL tables, and index them with Elastic Search. More information here: https://docs.microsoft.com/en-us/azure/search/search-semi-structured-data
Please keep the feedback coming. We are listening!
Add a COALESCE function so that NULLs can be converted in place to some default value. (41 votes)
We have started working on it; it will be available by the end of 2018.
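Until a COALESCE function is available, the same effect can be had with a CASE expression. A minimal sketch, where `[input]`, `[output]`, `DeviceId`, and `Temperature` are placeholder names and `0.0` is an assumed default:

```sql
-- Workaround sketch: replace NULLs in place with a default value
-- using CASE, pending a built-in COALESCE.
SELECT
    DeviceId,
    CASE WHEN Temperature IS NULL THEN 0.0 ELSE Temperature END AS Temperature
INTO [output]
FROM [input]
```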
I need to insert the timestamp as a column in an Azure Table with the following .NET format ("yyyy-MM-dd-HH-mm"). I thought the CONCAT and DATENAME/DATEPART functions would help me, and I ended up with this, but it is not producing the output I need:
CONCAT(DATENAME(yyyy,System.Timestamp),'-',CAST(DATEPART(mm,System.Timestamp) AS NVARCHAR(MAX)),'-',DATENAME(dd,System.Timestamp),'-',DATENAME(HH,System.Timestamp),'-',DATENAME(mi,System.Timestamp)),
"2015-10-12T05:17:37.807Z" is formatted as '2015-10-12-5-17', but I am expecting "2015-10-12-05-17". (40 votes)
An option to partition the output blobs according to GROUP BY statements (or something similar) would be really useful. For example, grouping by customer_id would result in prefixed blob CSVs for each customer. Something like that would be greatly appreciated! (39 votes)
This feature is now available in (private) preview. See more info here to register for the preview: https://azure.microsoft.com/en-us/blog/4-new-features-now-available-in-azure-stream-analytics/
Would be great to support the Parquet file format for output from Stream Analytics. This format seems to be trending up in popularity within the Spark and Hadoop ecosystem. (39 votes)
We want incoming data for multiple of our customers to arrive through one event hub and then be output to a table per customer. The tables don't need to be created by Stream Analytics; we can do that from code when the customer creates an account with us. I just need to be able to direct the output to the correct table from my Stream Analytics query. (39 votes)
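A common workaround while output names cannot be chosen dynamically is to define one output per customer table and route with filtered SELECT statements in a single job. A sketch, where `ehInput`, the output aliases, and the `CustomerId` values are all placeholder names:

```sql
-- Routing sketch: each pre-created table is registered as its own
-- job output, and a filtered SELECT directs each customer's events.
SELECT * INTO [contosoTable]  FROM ehInput WHERE CustomerId = 'contoso'
SELECT * INTO [fabrikamTable] FROM ehInput WHERE CustomerId = 'fabrikam'
```

This scales poorly past a handful of customers, since each new customer means editing the job, which is exactly why dynamic routing is being requested.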
Currently there is an output type of Service Bus Topic. It would be great if we could control which fields from the stream go into the values of Properties. Then clients subscribed to that topic could use a SqlFilter to listen only for the data they need.
Our telemetry data comes with 20 fields, of which DeviceID and DataPointID should be put into Properties. (38 votes)
We hope to make this available by Fall 2018.
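To illustrate the consumer side of the request above: once DeviceID and DataPointID are promoted to message properties, a subscription rule's SqlFilter expression could select just one device's data. A sketch, where the property names match the request and the literal values are hypothetical:

```sql
-- SqlFilter expression on a Service Bus topic subscription,
-- assuming DeviceID and DataPointID were written as message
-- properties by the Stream Analytics output.
DeviceID = 'device-42' AND DataPointID = 17
```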