Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Please restore the "Output Last Processed Time" metric in the job diagram

    I check "Output Last Processed Time" every day in the job diagram, so I would like the "Output Last Processed Time" metric to be restored there.

    10 votes  ·  0 comments

    Hi, thanks for your comments.
    We are making some changes to the job diagram, and unfortunately had to remove this information for now. We’re investigating how to add it back.
    What is your usage of Output Last Processed Time? If it's to monitor whether your streaming pipeline is acting as expected, you may want to use the "output watermark delay". When this number goes up, it means your pipeline is getting delayed. More info here: https://azure.microsoft.com/en-us/blog/new-metric-in-azure-stream-analytics-tracks-latency-of-your-streaming-pipeline/
    Thanks,
    JS

  2. Fix the CI/CD NuGet package so that it works in VSTS

    Please fix the CI/CD NuGet package. It currently breaks in VSTS, failing to load the Newtonsoft.Json package.

    10 votes  ·  3 comments
  3. Azure Stream Analytics Data Lake Store output file name

    It would be nice to add the ability to provide a file name prefix to the output file for ADLS. If I have multiple ASA jobs pushing data to the same folder, I cannot determine the contents of the file unless I open it. The file name that is generated is of no use.

    9 votes  ·  0 comments
  4. Allow Creation of Azure Storage outputs with Shared Access Signatures

    Currently, creating an output to an Azure Storage account requires the account name and key.

    It would be really useful to be able to create those outputs using a SAS URL/token, to avoid unnecessary sharing of keys, either when they are recycled or when creating a new Stream Analytics output.

    9 votes  ·  1 comment
  5. High Resource Error Message

    The message emitted during job execution on high resource utilization shouldn't have the level "Informational" but "Critical", since high utilization will lead to delayed or failed execution.

    9 votes  ·  0 comments
  6. Input from Cosmos DB changefeed

    Support direct input from Cosmos DB changefeed.

    9 votes  ·  0 comments  ·  Inputs and Outputs
  7. Convert function is missing for Hex values.

    A Convert function for hex values is missing. I want to write something like Convert(INT, Convert(VARBINARY, SUBSTRING(payload, 3, 2), 2)) AS Humidity.
    Aggregate functions should then also work on the result, e.g. AVG(Humidity) AS [AverageHumidity].
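
    Until such a conversion exists, one possible workaround is a JavaScript user-defined function called from the query. Below is a minimal sketch, assuming a UDF registered on the job under the hypothetical name udf.hexToInt:

        -- Workaround sketch: udf.hexToInt is a hypothetical JavaScript UDF
        -- registered on the job that parses a hex string into an integer.
        SELECT
            udf.hexToInt(SUBSTRING(payload, 3, 2)) AS Humidity
        INTO output
        FROM input

        -- Once the value is numeric, aggregates work over a window as usual:
        SELECT AVG(udf.hexToInt(SUBSTRING(payload, 3, 2))) AS [AverageHumidity]
        FROM input
        GROUP BY TumblingWindow(second, 30)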

    9 votes  ·  0 comments
  8. Ability to use more than one source for reference data

    It would be nice to be able to use "reference" data from more than one source. I'm aware that the possibility of using a SQL table and a Storage table is under consideration, but this is different.

    Usage scenario for an IoT application: you have inbound sensor data that you need to process, reference data (such as thresholds that differ per sensor), and customer configurations that you need to check against the incoming data.
    Some of that data might be stored in a SQL table, while other data lives in Storage tables or is updated in Blob storage.
    In…
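
    For illustration, here is a sketch of what the requested capability might look like in a query, with hypothetical input names: sensorStream as the data stream, and refThresholds and refCustomerConfig as two separate reference inputs (e.g. one SQL table and one blob):

        -- Hypothetical sketch of joining one stream against two reference
        -- inputs; all names are illustrative, not an existing ASA feature.
        SELECT s.deviceId, s.temperature, t.maxTemp, c.alertEmail
        FROM sensorStream s
        JOIN refThresholds t ON s.deviceId = t.deviceId
        JOIN refCustomerConfig c ON s.customerId = c.customerId
        WHERE s.temperature > t.maxTemp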

    9 votes  ·  0 comments
  9. Scaling up streaming units should be transparent and not require the job to be stopped

    Currently, adjusting the streaming units requires the job to be stopped in the portal. It would be very beneficial if the scale could be adjusted on the fly, without requiring a job stop/start.

    8 votes  ·  0 comments
  10. Input from Azure Data Lake Storage

    Data Lake is now added as an output, which is great. But it should be added as an input as well. E.g., we continuously save data to Data Lake Storage and want to stream it to Power BI.

    8 votes  ·  2 comments
  11. R instead of SQL for Azure Stream Analytics queries

    Microsoft R support is a must for Stream Analytics queries!

    8 votes  ·  0 comments
  12. Automated Query Testing

    Testing ASA queries requires manual interaction through the portal, as described in https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-test-query

    It would be nice to be able to create automated tests.

    I.e., an API that accepts test input and returns the resulting output.

    8 votes  ·  0 comments
  13. Define a serializer/deserializer for input reference data from blob storage

    The possibility to define (in some language) a serializer/deserializer for proprietary input file formats in reference blob data.

    As it is today, I had to write a cloud service that parses the data and sends it to an event hub. If Stream Analytics could accept any file format this way, provided that I somehow define a parser for my special data format, then I could skip that step.

    7 votes  ·  under review  ·  0 comments
  14. Dates in local time rather than UTC

    Enable dates to be provided in local date-time rather than UTC. This is useful for outputs to Power BI, to create reports that reflect the local time the data was captured rather than UTC.
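
    Until native time-zone support exists, a common workaround is to shift the UTC timestamp by a fixed offset with DATEADD. A minimal sketch follows; the +2 hour offset and the output name are illustrative, and a fixed offset does not handle daylight saving time:

        -- Workaround sketch: derive a fixed-offset local time from UTC.
        SELECT
            System.Timestamp AS EventTimeUtc,
            DATEADD(hour, 2, System.Timestamp) AS EventTimeLocal
        INTO powerbioutput
        FROM input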

    7 votes  ·  1 comment
  15. Running aggregates

    I'd like to be able to sum counts from all records since a fixed time in the past, rather than use fixed windows.

    This would be like a sliding window, but with a fixed beginning, and an end that follows the last sample.

    Input stream:

    sesID      TIME    count
    session1   time 1
    session2   time 2
    session2   time 3
    The first step would be to create a subquery that uses LAG to find where the sesID changes, and then store that time.

    That time then could be used as the start of the window, let's call it fixedStartWindow(StartTime, last(time)) and allow group…
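
    A minimal sketch of that first step, assuming an input named inputStream with a sesID column; ASA's LAG requires a LIMIT DURATION bound, set to one hour here purely for illustration:

        -- Detect rows where sesID changes from the previous event; those
        -- timestamps would become the fixed starts of the proposed windows.
        WITH sessionStarts AS (
            SELECT
                sesID,
                System.Timestamp AS eventTime,
                LAG(sesID) OVER (LIMIT DURATION(hour, 1)) AS prevSesID
            FROM inputStream
        )
        SELECT sesID, eventTime AS sessionStartTime
        FROM sessionStarts
        WHERE prevSesID IS NULL OR prevSesID <> sesID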

    7 votes  ·  1 comment
  16. Allow IoT Hub Feedback Endpoint to be defined as Input

    When configuring an IoT Hub as an ASA input, the endpoint used for the data stream is the device-to-cloud endpoint. Given that the delivery feedback endpoint /messages/devicebound/feedback is an AMQP endpoint, it would be nice to be able to process these messages using ASA as well. The suggestion is that the portal UI allow a user to specify either the primary device-to-cloud endpoint or the feedback endpoint when creating an IoT Hub input.

    7 votes  ·  0 comments
  17. 7 votes  ·  0 comments
  18. Way better feedback is needed when there is a problem

    I have an issue right now: no matter what I do, the job starts and immediately goes IDLE. There is zero feedback anywhere. The query passes and all connection tests pass. The logs show zero issues and that it's running; however, it doesn't do anything at all.

    7 votes  ·  planned  ·  0 comments
  19. Search array elements

    Ability to search through array elements.

    My JSON data, as it arrives in the event hub, is as follows:

    {
      "customerid": "ifslabs",
      "deviceid": "car1",
      "starttime": "2015-08-21 08:19:40.362983",
      "eventtime": "2015-08-21 08:19:55.045514",
      "SensorData": [
        {"SensorId": "gps", "SensorType": "latitude", "SensorValue": "1.1"},
        {"SensorId": "gps", "SensorType": "longitude", "SensorValue": "9.9"},
        {"SensorId": "Fuel System Stat", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Calc Load Value", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Coolant Temp", "SensorType": "F", "SensorValue": "NORESPONSE"},
        {"SensorId": "Engine RPM", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Vehicle Speed", "SensorType": "MPH", "SensorValue": "NORESPONSE"},
        {"SensorId": "Intake Air Temp", "SensorType": "F", "SensorValue": "NORESPONSE"}
      ]
    }

    Using the Stream Analytics function GetArrayElement I can get the array elements of SensorData flattened, but I want to search through the array and filter on a specific SensorType, for example:

    SELECT inputeventhub.starttime AS starttime,
           inputeventhub.eventtime AS eventtime,
           FilterArrayElement(inputeventhub.SensorData, SensorType, 'latitude') AS latitude,
           FilterArrayElement(inputeventhub.SensorData, SensorType, 'longitude') AS longitude
    INTO outputcarlocation
    FROM inputeventhub

    Using a new function like FilterArrayElement(array,…
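
    With today's language, the closest workaround is the existing GetArrayElements function combined with CROSS APPLY, which flattens the array so a WHERE clause can filter on SensorType. A sketch follows; note that pivoting latitude and longitude back onto a single row would still require a self-join, which is why a built-in FilterArrayElement would be simpler:

        -- Workaround sketch using the existing GetArrayElements + CROSS APPLY:
        SELECT
            i.starttime,
            i.eventtime,
            sensor.ArrayValue.SensorValue AS latitude
        INTO outputcarlocation
        FROM inputeventhub i
        CROSS APPLY GetArrayElements(i.SensorData) AS sensor
        WHERE sensor.ArrayValue.SensorType = 'latitude'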

    7 votes  ·  under review  ·  0 comments
  20. Define a specific partition of an event hub as input to a Stream Analytics query

    Allow a specific partition of an event hub to be defined as the input to a Stream Analytics query.
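
    A partial workaround today: event hub inputs expose a PartitionId column, so a query can restrict itself to one partition, although the job still reads from all partitions. A minimal sketch with illustrative input and output names:

        -- Filter the stream down to a single event hub partition.
        SELECT *
        INTO output
        FROM inputeventhub
        WHERE PartitionId = 2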

    7 votes  ·  under review  ·  0 comments
