Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Allow datatype precision

    I'd like a Stream Analytics job to be able to insert events into a SQL table using an NVARCHAR(n) column instead of NVARCHAR(MAX). This would allow these columns to be used in an index.

    Also, mapping an SA 'FLOAT' to a SQL 'DECIMAL(p,s)' should be possible.

    Ideally, I'd like to be able to set these data type precisions in a CREATE TABLE statement of an SA query. Any event with a defined property that exceeds the defined precision/length should be truncated to fit. For example:

    event:
    { "ArticleNumber" : "ABCDEFGH12345678" }

    CREATE TABLE article
    ( ArticleNumber NVARCHAR(12)
    );

    5 votes  ·  under review  ·  0 comments
  2. Allow union operator with a wildcard

    I want to merge two input streams into one, so that I have a single stream with all my data. For this I want to do:
    SELECT * FROM FirstStream UNION SELECT * FROM Secondstream.
    I do not know all the column names in my case, so I need this to work dynamically.
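    For context, a sketch of the workaround when the wildcard is not accepted, assuming both streams share a known schema (the column names below are hypothetical, not from the original request):

    ```sql
    -- Workaround sketch (hypothetical column names): list the shared
    -- columns explicitly on both sides instead of using SELECT *.
    SELECT DeviceId, EventTime, Temperature FROM FirstStream
    UNION
    SELECT DeviceId, EventTime, Temperature FROM Secondstream
    ```

    The request is precisely to avoid having to maintain this explicit column list by hand.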

    4 votes  ·  1 comment
  3. There should be Boolean support between Azure Stream Analytics and the Machine Learning Services

    The Stream Analytics job doesn't support Boolean variables as input (they are treated as strings). I had to transform them into viable inputs in ML Studio. It's not a blocking problem, but it took me a while to understand that my Machine Learning service was expecting a Boolean and that the stream job wasn't able to provide it.

    4 votes  ·  0 comments
  4. Azure Key Vault as reference data input

    I need to apply some values to my Stream Analytics query that are considered "secret" by our customer. It would be great if I didn't have to store them in plaintext but could instead feed them into my query from Azure Key Vault.

    4 votes  ·  0 comments  ·  Inputs and Outputs
  5. Please make it faster to restart, pathetically slow

    Please make it faster to restart, pathetically slow

    4 votes  ·  0 comments
  6. Use TIMESTAMP BY clause to define the {date}/{time} tokens in the output sink to datalake

    I'm not sure if it does this already and I'm doing something wrong. I need to ingest historical data, but I want it to go into the {date}/{time} folder structure in my data lake. I need to use the datetime column in the message rather than the enqueued time or processing time, because the data is historical, from a few years back. I would have thought it would use the TIMESTAMP BY column for this, but evidently not.

    4 votes  ·  0 comments
  7. Improve JSON integration

    I have this:
    row = { "a-x": [ { "b": [ {"c": "here is the stuff i care of"} ]}]}

    So I should be able to do something like:

    ... CROSS APPLY GetArrayElements( row["a-x"][0]["b"][0]["c"]) as WhatIDoCare

    4 votes  ·  1 comment
  8. Log changes to scale units in Operation Logs

    When someone changes the scale units (SUs) of an ASA job, the operation should show up in the Operations log.

    Something along the lines of:
    "Job X was changed from 1 to 2 scale units at time Y"

    3 votes  ·  under review  ·  0 comments
  9. Give a decent error message instead of "Stream Analytics job has validation errors: The given key was not present in the dictionary."

    Any time there is some kind of dictionary, you need to tell us WHICH dictionary and WHAT the key was. Otherwise, the error is completely useless. You might as well say, "Something went wrong".

    I've gone over all of my field names from input, query and output. They all match. So far, ASA has been pretty good with errors but this is terrible.

    4 votes  ·  0 comments
  10. Add ability to route offending messages to storage

    In addition to drop and retry, it would be good to have the ability to route failed events to blob storage. Also, there seems to be no way to correlate an error with the message(s) that caused it.

    4 votes  ·  0 comments
  11. Add support for compressed output

    I have historical data imported through Spark in Avro format to our Gen 1 Data Lake. I am now using ASA to stream the live data.
    Spark defaults to using Snappy to compress Avro files, whereas ASA doesn't compress at all. This has resulted in a tenfold increase in storage size and a slowdown in access speed. Can you add support for compressing the outputs in ASA?

    4 votes  ·  0 comments  ·  Inputs and Outputs
  12. Add iot hub device name to Stream Analytics Query Language

    From https://msdn.microsoft.com/en-us/library/azure/mt573293.aspx: "In Azure Stream Analytics, all data stream events have a timestamp associated with them."

    But if I look at messages Sent to the IotHub (as input for Stream Analytics):

    06/22/16 17:39:13> Device: [TestOne], Data:[{"devicename":"TestOne","time":"2016-06-22T17:39:02.1517322+02:00","temperature":22.0,"lumen":512}]

    The device name is also available there.

    But that name is not accessible in Stream Analytics. As shown above, I have to put the name of the device inside my telemetry instead...

    Is it possible to make the device name available in the Stream Analytics Query Language by default?
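    A hedged sketch of what this could look like if ASA exposed the IoT Hub metadata on each event (the IoTHub.ConnectionDeviceId property name and the [IotHubInput] alias are assumptions, not confirmed syntax):

    ```sql
    -- Sketch: read the sending device's identity from IoT Hub metadata
    -- rather than duplicating it inside the telemetry payload.
    SELECT
        e.IoTHub.ConnectionDeviceId AS DeviceName,
        e.temperature,
        e.lumen
    FROM [IotHubInput] e
    ```

    This would remove the need to embed "devicename" in every message, as in the sample above.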

    4 votes  ·  0 comments
  13. Allow Azure Data Explorer (ADX) as an output

    We use Stream Analytics for alerting in combination with Azure Functions, and Azure Data Explorer is our central data store. Life would be easier if we could output our alerts directly to Azure Data Explorer (ADX). This would also save money, as for now we are putting an additional Event Hub in place to mitigate the issue.

    4 votes  ·  0 comments  ·  Inputs and Outputs
  14. Autoformat Query in Query Window

    Have a button that will auto format the query in the query window like in an IDE.

    4 votes  ·  under review  ·  0 comments
  15. Rename Existing Stream Analytics Job

    I would like to be able to rename an existing Stream Analytics job. Without this, we have to delete the existing job and create a new one, copying everything over, and then we lose the ability to start "from last stopped".

    The obvious workaround is to make a note of when the job was stopped and start the new one from that time.

    But ideally, changing the name would be much simpler and less prone to error.

    4 votes  ·  0 comments
  16. “Output contains multiple rows …” warning

    I'm not totally sure I'm right here; please let me know if this should go anywhere else. I posted a question on Stackoverflow yesterday (https://stackoverflow.com/questions/45979008/azure-stream-analytics-output-contains-multiple-rows-warning) and was redirected here.

    We're using a Stream Analytics component in Azure to send data (log messages from different web apps) to a table storage account. The messages are retrieved from an Event Hub, but I think this doesn't matter here.

    Within the Stream Analytics component we defined an output for the table storage account including partition and row key settings. As of now the partition key will be the name of the…

    4 votes  ·  2 comments
  17. Need to have output back to IoT Hub

    Need to have output to IoT Hub

    4 votes  ·  1 comment
  18. some form of error handling in cast

    If a cast to float/bigint does not succeed, I want to know about it somehow in my code and act on it.

    Right now it fails the whole query.
    Some kind of TRYCAST operator with a default value, for example.
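    A sketch of the requested behavior (TRY_CAST here is the proposed operator, not confirmed ASA syntax, and the stream and column names are hypothetical):

    ```sql
    -- Sketch: TRY_CAST would return NULL instead of failing the whole
    -- job when the conversion cannot succeed; COALESCE then supplies
    -- the caller's default value.
    SELECT
        COALESCE(TRY_CAST(e.reading AS float), 0.0) AS reading
    FROM [Input] e
    ```

    The key point is that a single malformed event should degrade to a default rather than stop the query.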

    4 votes  ·  under review  ·  0 comments
  19. no event for reference

    Hi,

    I am trying to count the events that appear in a period of time for a device reference list, using:

    SELECT
        t2.id,
        count(t1.[TimeStamp]) AS nbEvents,
        FirstTime
    FROM
        [IotHubInput] t1 TIMESTAMP BY [TimeStamp]
        JOIN [DevicesList] t2 ON t2.id = t1.id
    GROUP BY
        t2.id,
        TumblingWindow(hour, 5)

    This gives me a count only for devices that are present in the stream. I would like a result of 0 for devices that are in the reference list but have no events.
    I can't put the device list first or use a join option that keeps all rows from t2 (like a LEFT JOIN in SQL Server).

    4 votes  ·  1 comment
  20. My streaming analytics job says FAILED

    ... and I am able to start it again... But I want to know why it failed, so it won't happen again in the future. It was down for half a day or so, because I didn't notice it. I can't find any information as to why it failed, neither on the new portal nor the old one.

    4 votes  ·  3 comments