Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historic records or reference data to derive business insights quickly.

  1. There should be Boolean support between Azure Stream Analytics and the Machine Learning Services

    The Stream Analytics job doesn't support Boolean variables as input (they are treated as strings), so I had to transform them into viable inputs in ML Studio. It's not a blocking problem, but it took me a while to understand that my machine learning service was expecting a Boolean and that the stream job wasn't able to provide it.

    4 votes  ·  0 comments
  2. improve json integration

    I have this:
    row = { "a-x": [ { "b": [ {"c": "here is the stuff i care of"} ]}]}

    So I should be able to do something like:

    ... CROSS APPLY GetArrayElements( row["a-x"][0]["b"][0]["c"]) as WhatIDoCare
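    Until syntax like that exists, a workaround may be possible with ASA's nested-data functions GetRecordPropertyValue and GetArrayElement. This is only a sketch: the input alias is made up, and whether the top-level record can be passed directly may depend on the input shape.

    ```sql
    -- Hypothetical sketch: dig "c" out of { "a-x": [ { "b": [ {"c": ...} ] } ] }.
    -- GetRecordPropertyValue is needed because "a-x" contains a dash and
    -- cannot be referenced with dot notation.
    SELECT
        GetRecordPropertyValue(
            GetArrayElement(
                GetRecordPropertyValue(
                    GetArrayElement(GetRecordPropertyValue(i, 'a-x'), 0),
                    'b'),
                0),
            'c') AS WhatIDoCare
    FROM [input] i
    ```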

    4 votes  ·  1 comment
  3. Log changes to scale units in Operation Logs

    When someone changes the SUs (streaming units) of an ASA job, the operation should show up in the Operations log.

    Something along the lines of:
    "Job X was changed from 1 to 2 scale units at time Y"

    3 votes  ·  under review  ·  0 comments
  4. Give a decent error message instead of "Stream Analytics job has validation errors: The given key was not present in the dictionary."

    Any time there is some kind of dictionary, you need to tell us WHICH dictionary and WHAT the key was. Otherwise, the error is completely useless; you might as well say, "Something went wrong".

    I've gone over all of my field names from input, query, and output. They all match. So far ASA has been pretty good with errors, but this one is terrible.

    4 votes  ·  0 comments
  5. Add ability to route offending messages to storage

    In addition to drop and retry, it would be good to have the ability to route failed events to blob storage. Also, there seems to be no way to correlate an error with the message(s) that caused it.

    4 votes  ·  0 comments
  6. Add iot hub device name to Stream Analytics Query Language

    From https://msdn.microsoft.com/en-us/library/azure/mt573293.aspx: "In Azure Stream Analytics, all data stream events have a timestamp associated with them."

    But if I look at messages Sent to the IotHub (as input for Stream Analytics):

    06/22/16 17:39:13> Device: [TestOne], Data:[{"devicename":"TestOne","time":"2016-06-22T17:39:02.1517322+02:00","temperature":22.0,"lumen":512}]

    The device name is also available there.

    But that name is not accessible in Stream Analytics. As shown above, I have to put the name of the device inside my telemetry...

    Is it possible to make the device name available in the Stream Analytics Query Language by default?
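    For what it's worth, I believe ASA's IoT Hub input now documents a built-in IoTHub metadata record that carries the device identity. A sketch (the input name and payload columns follow the example above):

    ```sql
    SELECT
        IoTHub.ConnectionDeviceId AS devicename,  -- device identity from the hub, not the payload
        temperature,
        lumen
    FROM [IotHubInput]
    ```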

    4 votes  ·  0 comments
  7. Autoformat Query in Query Window

    Have a button that auto-formats the query in the query window, like in an IDE.

    4 votes  ·  under review  ·  0 comments
  8. “Output contains multiple rows …” warning

    I'm not totally sure I'm right here; please let me know if this should go anywhere else. I posted a question on Stackoverflow yesterday (https://stackoverflow.com/questions/45979008/azure-stream-analytics-output-contains-multiple-rows-warning) and was redirected here.

    We're using a Stream Analytics component in Azure to send data (log messages from different web apps) to a table storage account. The messages are retrieved from an Event Hub, but I think this doesn't matter here.

    Within the Stream Analytics component we defined an output for the table storage account including partition and row key settings. As of now the partition key will be the name of the…

    4 votes  ·  2 comments
  9. Need to have output back to IoT Hub

    Need to have output to IoT Hub

    4 votes  ·  1 comment
  10. Please add a function that changes the time zone of activityLog.

    The JSON in activityLog shows times in UTC, but I want to see them in JST (Japan Standard Time).
    So, please add a function that can change the time zone of activityLog.

    4 votes  ·  0 comments  ·  Advanced Analytics
  11. some form of error handling in cast

    If a cast to float/bigint does not succeed, I want to know about it somehow in my code and act on it.

    Right now it fails the whole query.
    Some kind of TRY_CAST operator with a default value, for example.
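    For reference, I believe ASA's query language now documents TRY_CAST, which returns NULL instead of failing the job when a conversion doesn't succeed. A sketch (the input and column names are made up):

    ```sql
    SELECT
        deviceId,
        TRY_CAST(temperature AS float) AS temperatureValue  -- NULL on failure, no job error
    FROM [IotHubInput]
    WHERE TRY_CAST(temperature AS float) IS NOT NULL        -- keep only convertible readings
    ```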

    4 votes  ·  under review  ·  0 comments
  12. no event for reference

    Hi,

    I want to count the events that appear in a period of time for a device reference list, using:

        SELECT
            t2.id,
            count(t1.[TimeStamp]) AS nbEvents
        FROM
            [IotHubInput] t1 TIMESTAMP BY [TimeStamp]
            JOIN [DevicesList] t2 ON t2.id = t1.id
        GROUP BY
            t2.id,
            TumblingWindow(hour, 5)

    This gives me counts only for devices present in the stream. I would like a result of 0 for devices that appear only in the reference list.
    I can't put the device list first or use a join option that returns all rows from t2 (as in SQL Server).

    4 votes  ·  1 comment
  13. My streaming analytics job says FAILED

    ... and I am able to start it again. But I want to know why it failed, so it won't happen again in the future. It was down for half a day or so, because I didn't notice it. I can't find any information as to why it failed, on either the new or the old portal.

    4 votes  ·  3 comments
  14. Job Diagram

    The Job Diagram holds real promise as a visual, intuitive view of job execution and metrics, especially during development. In its current form it's anything but that. When I select an output, it seems to default to out-of-order events only.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  15. Integration with deployed ML.NET models

    It would be very neat and useful if I could deploy my custom ML.NET model to an Azure Function or elsewhere, and call it directly from Azure Stream Analytics. As nice as Azure Machine Learning Service is, it forces me to learn the Python ecosystem.

    3 votes  ·  0 comments
  16. Failed to Get Framework Assemblies Local Path During Pushing Edge Package??

    I am getting the above error while deploying ASA at edge. Any idea?

    3 votes  ·  0 comments
  17. Compression types as multiple option

    Right now, we can set the compression type of an ASA job input only as a single option: we can tell the job to treat all events as gzip, deflate, or none. In real scenarios, the input events may arrive in various formats.
    It would be great to have this setting as multiple options.
    Right now, to achieve this you have to create another ASA job with different settings, and then both jobs emit a lot of warnings about failed deserialization.

    3 votes  ·  0 comments
  18. Allow definition of fields outside of stream.

    By definition this data is a stream; adding headers to each row severely limits applicability, especially in IoT scenarios where device bandwidth may be constrained. We're having to use an intermediary to reformat data before putting it on another event hub just so Stream Analytics can use it, which introduces latency.

    3 votes  ·  0 comments
  19. Issue in renew-authorisation for stream analytics output - for datalake store

    I am trying to set up (deploy) a Stream Analytics job with Data Lake Store as output using Terraform. Terraform internally calls the Azure Stream Analytics API (through the Azure Go SDK) to set the Data Lake Store output, which expects a refresh token.

    As per the API documentation, it is recommended to put a dummy string value here when creating the data source and then go to the Azure Portal to authenticate the data source, which will update this property with a valid refresh token. Hence, I passed a "dummy" string as the refresh token, and I could see the datalake…

    3 votes  ·  0 comments
  20. Allow Functions like CAST after TIMESTAMP BY

    My incoming timestamps have to be converted to ISO 8601, but only columns (not functions) are allowed after the TIMESTAMP BY statement.
    Please allow all kinds of functions there in order to provide a conversion. I think some folks here also needed a Unix datetime converter :)
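    As far as I can tell, ASA later relaxed this restriction: TIMESTAMP BY accepts expressions, so a Unix-epoch column can be converted inline. A sketch (the input and column names are assumptions):

    ```sql
    SELECT *
    FROM [EventHubInput]
    TIMESTAMP BY DATEADD(second, unixTime, '1970-01-01T00:00:00Z')  -- epoch seconds -> datetime
    ```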

    3 votes  ·  0 comments