Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

How can we improve Microsoft Azure Stream Analytics?

You've used all your votes and won't be able to post a new idea, but you can still search and comment on existing ideas.

There are two ways to get more votes:

  • When an admin closes an idea you've voted on, you'll get your votes back from that idea.
  • You can remove your votes from an open idea you support.

To see ideas you have already voted on, select the "My feedback" filter and then "My open ideas".

Enter your idea and we'll search to see if someone has already suggested it.

If a similar idea already exists, you can support and comment on it.

If it doesn't exist, you can post your idea so others can support it.

  • Hot ideas
  • Top ideas
  • New ideas
  • My feedback
  1. R instead of SQL for Azure Stream Analytics queries

Microsoft R is a must for Stream Analytics queries!

    8 votes · 0 comments
  2. Define a serializer/deserializer for input reference data from blob storage

    Possibility to define (in some language) a serializer/deserializer for proprietary input file formats in reference blob data.

    As it is today, I had to write a cloud service that parses the data and sends it to an event hub, but if Stream Analytics could accept any file format this way, provided that I somehow define a parser for my special data format, then I could skip that step.

    7 votes · under review · 0 comments
  3. I'm unable to get my query output saved to an Azure SQL table

    I am unable to get a SQL DB output to work with my analytics job. My job is storing Application Insights export data into a SQL table. The error I'm receiving is that the SQL initialization connection string is invalid. All of the inputs I'm prompted for are correct, and I'm unable to see what connection string the portal is generating for the analytics job. I need assistance determining why the job cannot store output into my table.

    Error details:

    Error:
    - Format of the initialization string does not conform to specification starting at index 87.

    Message:
    Azure SQL Server…

    7 votes · 0 comments
  4. Azure Stream Analytics Data Lake Store output file name

    It would be nice to add the ability to provide a file name prefix to the output file for ADLS. If I have multiple ASA jobs pushing data to the same folder, I cannot determine the contents of the file unless I open it. The file name that is generated is of no use.

    7 votes · 0 comments
  5. Running aggregates

    I'd like to be able to sum counts from all records since a fixed time in the past, rather than use fixed windows.

    This would be like a sliding window, but with a fixed beginning, and an end that follows the last sample.

    Input stream:

    sesID     TIME  count
    session1  time  1
    session2  time  2
    session2  time  3

    The first step would be to create a subquery that uses LAG to find where sesID changes, and then store that time.

    That time could then be used as the start of the window; let's call it fixedStartWindow(StartTime, last(time)) and allow group…
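    The LAG step described above can be sketched in today's query language; this is a rough illustration only, where the input/output names and the 1-hour LIMIT DURATION are assumptions, not part of the original post:

    ```sql
    -- Sketch: emit an event whenever sesID changes, capturing the time
    -- the new session began. This timestamp is what the proposed
    -- fixedStartWindow(StartTime, last(time)) construct would anchor on.
    WITH Flagged AS (
        SELECT
            sesID,
            TIME AS EventTime,
            LAG(sesID) OVER (LIMIT DURATION(hour, 1)) AS PrevSesID
        FROM Input
    )
    SELECT
        sesID,
        EventTime AS StartTime
    INTO Output
    FROM Flagged
    WHERE PrevSesID IS NULL OR PrevSesID != sesID
    ```

    The missing piece the idea asks for is the window itself: a window whose start is pinned to that stored timestamp while its end keeps advancing with the last sample.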

    7 votes · 1 comment
  6. Allow IoT Hub Feedback Endpoint to be defined as Input

    When configuring an IoT Hub as an ASA input, the IoT Hub endpoint used for the data stream is the device-to-cloud endpoint. Given that the delivery feedback endpoint /messages/devicebound/feedback is an AMQP endpoint, it would be nice to be able to process these messages using ASA. The suggestion is that the portal UI allow a user to specify either the primary device-to-cloud *or* the feedback endpoint when creating an IoT Hub input.

    7 votes · 0 comments
  7. Search array elements

    Ability to search through array elements.

    My JSON data arrives in the event hub as follows:

    {
      "customerid": "ifslabs",
      "deviceid": "car1",
      "starttime": "2015-08-21 08:19:40.362983",
      "eventtime": "2015-08-21 08:19:55.045514",
      "SensorData": [
        {"SensorId": "gps", "SensorType": "latitude", "SensorValue": "1.1"},
        {"SensorId": "gps", "SensorType": "longitude", "SensorValue": "9.9"},
        {"SensorId": "Fuel System Stat", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Calc Load Value", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Coolant Temp", "SensorType": "F", "SensorValue": "NORESPONSE"},
        {"SensorId": "Engine RPM", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Vehicle Speed", "SensorType": "MPH", "SensorValue": "NORESPONSE"},
        {"SensorId": "Intake Air Temp", "SensorType": "F", "SensorValue": "NORESPONSE"}
      ]
    }

    Using the Stream Analytics function GetArrayElement I can get the elements of SensorData flattened, but I want to search through the array and filter out only a specific SensorType, for example:

    select inputeventhub.starttime as starttime,
        inputeventhub.eventtime as eventtime,
        FilterArrayElement(inputeventhub.SensorData, SensorType, 'latitude') as latitude,
        FilterArrayElement(inputeventhub.SensorData, SensorType, 'longitude') as longitude
    into outputcarlocation
    from inputeventhub

    Using a new function like FilterArrayElement(array,…
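    For what it's worth, something close to this can already be expressed with CROSS APPLY and the built-in GetArrayElements function, which expands the array into one row per element that can then be filtered; a sketch reusing the names from the post above (untested):

    ```sql
    -- Sketch: flatten SensorData and keep only the GPS readings.
    SELECT
        inputeventhub.starttime AS starttime,
        inputeventhub.eventtime AS eventtime,
        Sensor.ArrayValue.SensorType AS SensorType,
        Sensor.ArrayValue.SensorValue AS SensorValue
    INTO
        outputcarlocation
    FROM
        inputeventhub
    CROSS APPLY GetArrayElements(inputeventhub.SensorData) AS Sensor
    WHERE
        Sensor.ArrayValue.SensorType = 'latitude'
        OR Sensor.ArrayValue.SensorType = 'longitude'
    ```

    This emits one row per matching element rather than the single pivoted row the proposed FilterArrayElement would produce, so a join or pivot step would still be needed to get latitude and longitude as columns of one row.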

    7 votes · under review · 0 comments
  8. Define a specific partition of an event hub as input to a Stream Analytics query

    Define a specific partition of an event hub as input to a Stream Analytics query.

    7 votes · under review · 0 comments
  9. Add Query Parameters to the SA job configuration settings.

    Create query parameters (stored in configuration settings) for the SA job so that common values can be controlled without modifying the query itself.

    6 votes · 2 comments
  10. Obfuscation or encryption of query code

    Would it be possible, in a future release of Stream Analytics, to either obfuscate or encrypt the query (or JavaScript or C#) code of SA jobs? We are looking into deploying our SA jobs/models, which we use for predictive maintenance for instance, not only in our own subscriptions but in our customers' subscriptions as well. However, we would not want customers to be able to see the code of our queries, because these contain proprietary logic, rules, or definitions.

    6 votes · 0 comments
  11. Ability to identify reference data from data in the input

    Ability to identify reference data from data in the input, something like a SELECT DISTINCT in SQL
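    As a partial workaround today, DISTINCT-like behavior can be emulated with GROUP BY over a time window; a sketch, where the column name and window size are illustrative assumptions:

    ```sql
    -- Sketch: emit each distinct DeviceId at most once per 10-minute window.
    SELECT
        DeviceId
    INTO
        Output
    FROM
        Input
    GROUP BY
        DeviceId,
        TumblingWindow(minute, 10)
    ```

    This only deduplicates within each window, not across the whole stream, which is why a first-class way to derive reference data from the input would still help.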

    6 votes · under review · 1 comment
  12. Provide API App DataSource

    Allow the output of an API App to be used as the input data stream of an ASA job.

    6 votes · under review · 0 comments
  13. 6 votes · 0 comments
  14. 6 votes · 0 comments
  15. Allow access to AMQP message application-properties and properties

    Currently only the payload of stream messages can be queried and used for filtering and analysis. AMQP (and any other messaging protocol) defines ways to "annotate" message payloads with various headers and properties; this is usually where message-routing information is placed, so Stream Analytics should be able to access it.
    This would also allow serialized or compressed payloads to be processed by Stream Analytics, not only the JSON, Avro, and CSV formats.

    6 votes · 3 comments
  16. ASA to partitioned DocumentDB

    Is the recent update enabling ASA to output to a *partitioned* DocumentDB collection now generally available and production ready?

    6 votes · 1 comment
  17. Delete blob inputs in blob input stream

    If I have a blob input stream, I want the file to be deleted after it is processed by the Stream Analytics job. It's a stream; I do not need to keep all my compressed original data. I want to be able to enable this when I create a new blob input stream.

    6 votes · 0 comments
  18. True "pay by use" pricing for Power-BI Embedded (with stream analytics)

    Azure has become so popular partly due to the fair "pay by use" policy that it claims as one of its principles. Where is this with Power BI Embedded (with Stream Analytics)? I can't even get started with a real product for under $600/month. What if I need to start with 50 customers? In that case, I have to use something other than Power BI, and it will be a pain, at the least, to move to Power BI once I am up to 1000 customers or so... Please stick to the Azure principles and make the "pay to play" game reasonable for…

    6 votes · 0 comments
  19. A new Logic Apps output sink would be nice...

    This would give us the flexibility to do whatever we like (alerting via mail, SMS, ...).

    6 votes · 0 comments
  20. How does one go about getting the client IP address for event data read from an event hub?

    We're sending event data to an event hub, but we want to know the IP address of the client that sent each event. How do we get access to this? In particular, we want to see it in the output of a Stream Analytics query.

    6 votes · 0 comments