Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. SQL Managed Instance Output

    Please investigate supporting SQL Managed Instance (as opposed to single DB) as an output for Stream Analytics.

    10 votes · 0 comments
  2. Please restore the "Output Last Processed Time" metric in the job diagram

    I check "Output Last Processed Time" every day in the job diagram, so I would like the "Output Last Processed Time" metric to be restored there.

    10 votes · 0 comments

    Hi, thanks for your comments.
    We are making some changes to the job diagram, and unfortunately had to remove this information for now. We’re investigating how to add it back.
    What is your usage of Output Last Processed Time? If it's to monitor whether your streaming pipeline is behaving as expected, you may want to use the "output watermark delay". When this number goes up, it means your pipeline is getting delayed. More info here: https://azure.microsoft.com/en-us/blog/new-metric-in-azure-stream-analytics-tracks-latency-of-your-streaming-pipeline/
    Thanks,
    JS

  3. Fix the CI/CD NuGet package so that it works in VSTS

    Please fix the CI/CD NuGet package. It currently breaks in VSTS, failing to load the Newtonsoft.Json package.

    10 votes · 3 comments
  4. Azure Stream Analytics Data Lake Store output file name

    It would be nice to add the ability to provide a file name prefix to the output file for ADLS. If I have multiple ASA jobs pushing data to the same folder, I cannot determine the contents of the file unless I open it. The file name that is generated is of no use.

    9 votes · 0 comments
  5. Allow Creation of Azure Storage outputs with Shared Access Signatures

    Currently creating an output to an Azure Storage account requires the account name and key.

    It would be really useful to be able to create those outputs using a SAS URL/Token to avoid unneeded sharing of keys either when they are recycled or when creating a new Stream Analytics output.

    9 votes · 1 comment
  6. High Resource Error Message

    The message logged during job execution on high resource utilization shouldn't have level "Informational" but "Critical", since high utilization will lead to delayed or failed execution.

    9 votes · 0 comments
  7. Input from Cosmos DB changefeed

    Support direct input from Cosmos DB changefeed.

    9 votes · 0 comments · Inputs and Outputs
  8. Add KQL support

    KQL is such a powerful language for working with time-series data. In my opinion it is a very clean and transparent language for defining queries. It would be great to have KQL-like support for Stream Analytics on the edge and in the cloud.

    9 votes · 0 comments · Query Language
  9. Convert function is missing for hex values

    A Convert function for hex values is missing. I want to write something like CONVERT(INT, CONVERT(VARBINARY, SUBSTRING(payload, 3, 2), 2)) AS Humidity.
    Aggregate functions should also work for cases like AVG(Humidity) AS [AverageHumidity].
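    The conversion being asked for is simple in principle. As a sketch of the intended semantics (plain Python, not ASA syntax; the payload value is hypothetical), extract a run of hex characters from the payload string and convert it to an integer:

```python
# Sketch of the requested conversion semantics (illustrative, not ASA syntax).
# Hypothetical payload: a hex-encoded sensor frame, e.g. "0A2F00", where
# characters 3-4 (1-based, as in SUBSTRING) carry the humidity byte.
def hex_field_to_int(payload: str, start: int, length: int) -> int:
    """Extract `length` hex chars starting at 1-based `start`, convert to int."""
    chunk = payload[start - 1 : start - 1 + length]
    return int(chunk, 16)

humidity = hex_field_to_int("0A2F00", 3, 2)  # chars "2F" -> 47
```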

    9 votes · 0 comments
  10. Ability to use more than one source for reference data

    It would be nice to be able to use "reference" data from more than one source. I'm aware that the possibility of using a SQL table alongside a Storage table is under consideration, but this is different.

    Usage scenario for an IoT application: you have inbound sensor data that you need to process. You have reference data (such as thresholds that are different for each sensor) and you have customer configurations that you need to check against the incoming data.
    Some of that data might be stored in a SQL table, while other data lives in Table storage or is updated in Blob storage.
    In…

    9 votes · 0 comments
  11. Scaling up streaming units should be transparent and not require the job to be stopped

    Currently adjusting the streaming units requires the job to be stopped within the portal. It would be very beneficial if the scale could be adjusted on the fly and not require a job stop/start.

    8 votes · 0 comments
  12. Input from Azure Data Lake Storage

    Data Lake has been added as an output, which is great, but it should be added as an input as well. E.g., we continuously save data to Data Lake Storage and want to stream that data to Power BI.

    8 votes · 2 comments
  13. R instead of SQL for Azure Stream Analytics queries

    Microsoft R is a must for queries of stream analytics!

    8 votes · 0 comments
  14. Automated Query Testing

    Testing ASA queries requires manual interaction through the portal as described in https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-test-query

    It would be nice to be able to create automated tests.

    i.e., an API that accepts test input and returns the resulting output.
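    No such public API exists today, but the shape of what is being asked for can be sketched locally: factor the query's transformation logic into a pure function over a list of events, then assert on its output in an ordinary unit test. All names below are hypothetical, and the function is a stand-in for an ASA query, not ASA itself:

```python
# Hypothetical local test harness: model an ASA query as a pure function
# over a list of input events, then assert on the output.
def run_query(events):
    """Stand-in for an ASA query: keep readings with temperature above 30."""
    return [e for e in events if e["temperature"] > 30]

def test_high_temperature_filter():
    events = [
        {"deviceId": "d1", "temperature": 25},
        {"deviceId": "d2", "temperature": 35},
    ]
    output = run_query(events)
    assert [e["deviceId"] for e in output] == ["d2"]

test_high_temperature_filter()
```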

    8 votes · 0 comments
  15. Define a serializer/deserializer for input reference data from Blob storage

    Possibility to define (in some language) a serializer/deserializer for proprietary input file formats for reference data in Blob storage.

    As it is today, I had to write a cloud service that parses the data and sends it to an event hub. If Stream Analytics could accept any file format this way, provided that I define a parser for my custom data format, then I could skip that step.

    7 votes · under review · 0 comments
  16. date

    Enable dates to be provided in local time rather than UTC. This is useful for outputs to Power BI, to create reports that reflect the local time the data was captured rather than UTC.
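    Today this conversion has to happen downstream of ASA. A minimal sketch of the workaround, using Python's standard library (the target offset is a hypothetical example; a real pipeline would look the zone up in a tz database):

```python
from datetime import datetime, timezone, timedelta

# ASA emits UTC timestamps; convert to a local zone before reporting.
# Fixed +10:00 offset used purely for illustration (e.g. AEST).
LOCAL = timezone(timedelta(hours=10))

def utc_to_local(iso_utc: str) -> str:
    """Convert an ISO-8601 UTC timestamp string to local time."""
    dt = datetime.fromisoformat(iso_utc.replace("Z", "+00:00"))
    return dt.astimezone(LOCAL).isoformat()

utc_to_local("2019-05-01T02:30:00Z")  # -> "2019-05-01T12:30:00+10:00"
```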

    7 votes · 1 comment
  17. running aggregates

    I'd like to be able to sum counts from all records since a fixed time in the past, rather than use fixed windows.

    This would be like a sliding window, but with a fixed beginning, and an end that follows the last sample.

    Input stream:

    sesID      TIME
    session1   time 1
    session2   time 2
    session2   time 3

    The first step would be to create a subquery that uses LAG to find where the sesID changes, and then store that time.

    That time then could be used as the start of the window, let's call it fixedStartWindow(StartTime, last(time)) and allow group…
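    The behavior being described can be sketched outside ASA as a running count that resets whenever the session id changes (the same boundary a LAG comparison would detect). Plain Python, purely illustrative:

```python
# Sketch of the requested "fixed-start window": a running count that restarts
# whenever the session id changes (mimicking LAG-based boundary detection).
def running_counts(events):
    """events: list of (session_id, time) tuples, in arrival order.
    Returns (session_id, time, running_count) per event, where the count
    restarts at 1 each time the session id differs from the previous event."""
    out, prev, count = [], None, 0
    for ses, t in events:
        count = 1 if ses != prev else count + 1  # LAG(ses) != ses -> reset
        out.append((ses, t, count))
        prev = ses
    return out

running_counts([("session1", 1), ("session2", 2), ("session2", 3)])
# -> [("session1", 1, 1), ("session2", 2, 1), ("session2", 3, 2)]
```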

    7 votes · 1 comment
  18. Allow IoT Hub Feedback Endpoint to be defined as Input

    When configuring an IoT Hub as an ASA input, the IoT Hub endpoint that is used for the data stream is the device-to-cloud endpoint. Given that the delivery feedback endpoint /messages/devicebound/feedback is an AMQP endpoint, it would be nice to be able to process these messages using ASA. The suggestion is that the Portal UI allows a user to specify either the primary device-to-cloud endpoint or the feedback endpoint when creating an IoT Hub input.

    7 votes · 0 comments
  19. Add support for compressed output

    I have historical data imported through Spark in Avro format to our Gen 1 Data Lake. I am now using ASA to stream the live data.
    Spark defaults to using snappy to compress Avro files, whereas ASA doesn't compress at all. This has resulted in a 10-fold increase in storage size and a slowdown in access speed. Could you add support for compressing the outputs in ASA?

    7 votes · 0 comments · Inputs and Outputs
