
Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, providing highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Early output for windowed grouping

    Aggregates are only emitted after an event is received that falls after the window.

    This means that the current grouping is never output if no new data is received.
    It would be nice to have an option for early output of a group as soon as the first event falling within the window is received, with updated results emitted whenever new data arrives for that group.

    This way I can show the (current) aggregated value for the current day instead of only the previous day.
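
    A minimal sketch of the kind of query this concerns (input, output, and field names are assumed for illustration):

        SELECT deviceId, SUM(reading) AS dailyTotal
        INTO output
        FROM input TIMESTAMP BY eventTime
        GROUP BY deviceId, TumblingWindow(day, 1)

    As described above, the daily total only appears once an event arrives whose timestamp falls past the window; the request is an option to emit a provisional result when the first event of the window arrives and to update it as further events come in.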

    3 votes  ·  0 comments
  2. How does one go about getting the client IP address for event data read from an event hub?

    We're sending event data to an event hub, but we want to know the IP address of the client that sent each event. How do we get access to this? In particular, we want to be able to output it from a query in Stream Analytics.

    14 votes  ·  3 comments
  3. Make double quotes for string values in a where clause show as an error

    where foo = 'Bar'   -- works
    where foo = "Bar"   -- does not work, but the editor does not flag it as an error

    1 vote  ·  0 comments
  4. Allow concatenating properties in AzureTable

    We should be able to build the partition and row keys from composite properties of the stream
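
    A possible workaround today (a sketch only, with assumed field names) is to build the composite value in the query and point the Azure Table output's partition key setting at the computed column:

        SELECT
            CONCAT(region, '-', deviceId) AS compositePartitionKey,  -- assumed fields; pick this column as PartitionKey in the output configuration
            deviceId,
            reading
        INTO tableoutput
        FROM input

    The request is to be able to express this composition directly in the output configuration instead.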

    3 votes  ·  0 comments
  5. Video Analysis CCTV

    Hello, it would be awesome if Azure also had video analytics, so you could connect an IP camera and, for example, detect how many parking spaces are free in a parking lot. It would be great for parking garages, home security, pretty much anything really.

    1 vote  ·  0 comments
  6. Allow IoT Hub Feedback Endpoint to be defined as Input

    When configuring an IoT Hub as an ASA input, the IoT Hub endpoint used for the data stream is the device-to-cloud endpoint. Given that the delivery feedback endpoint (/messages/devicebound/feedback) is an AMQP endpoint, it would be nice to be able to process these messages using ASA as well. The suggestion is that the portal UI allow the user to specify either the primary device-to-cloud endpoint or the feedback endpoint when creating an IoT Hub input.

    7 votes  ·  0 comments
  7. Support datetimeoffset datatype

    I'd like the Stream Analytics query language to support the datetimeoffset data type. This is a must in an IoT world.
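
    For context, a sketch of the current situation (field names assumed): the datetime type carries no offset component, so the offset has to be tracked in a separate column if it matters downstream:

        SELECT
            CAST(payload.localTime AS datetime) AS eventTime,    -- assumed field; the original offset is not preserved in the datetime type
            payload.utcOffsetMinutes AS utcOffsetMinutes         -- assumed extra field carrying the offset separately
        FROM input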

    17 votes  ·  0 comments
  8. SQL Server as an input

    Please provide the ability to use a SQL Server database as a data input.

    1 vote  ·  0 comments
  9. output to different Azure Tables based on time

    When you choose Azure Table as the output sink, you can only set a static table name in the configuration. What about being able to choose a dynamic name like TableName-YYYY-MM? It would be really useful in scenarios where you have to delete old data without going through the pain of deleting Azure Table rows one by one.
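
    A partial workaround today (a sketch only, with assumed field names) is to fold the month into the partition key so that old data can at least be dropped a partition at a time:

        SELECT
            CONCAT(CAST(DATEPART(year, System.Timestamp) AS nvarchar(max)), '-',
                   CAST(DATEPART(month, System.Timestamp) AS nvarchar(max)), '-',
                   deviceId) AS monthlyPartitionKey,   -- assumed field; pick this column as PartitionKey in the table output
            deviceId,
            reading
        INTO tableoutput
        FROM input

    Per-month table names, as requested, would turn the cleanup into a simple table delete instead.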

    21 votes  ·  1 comment
  10. Use an Azure Service Bus Topic as a Stream Analytics input source

    We have a number of services that ingress data into Azure using Service Bus topics; topics are used because the data stream feeds a number of different parallel processing paths. It would be great to be able to use a topic as an input to a Stream Analytics job. (Currently, to achieve the same effect, we have to use an API app web role to receive the data from the topic and then immediately forward each message to a separate event hub, which is then set up as an input to a…

    22 votes  ·  2 comments
  11. Allow definition of fields outside of stream.

    By definition this data is a stream; adding headers to each row severely limits its applicability, especially in IoT scenarios where device bandwidth may be constrained. We're having to use an intermediary to reformat the data before putting it on another event hub just so Stream Analytics can use it, which introduces latency.

    3 votes  ·  0 comments
  12. Integrate Stream Analytics operation logs with Application Insights

    Integrate Stream Analytics operation logs with Azure Application Insights. This would make monitoring and troubleshooting easier.

    6 votes  ·  0 comments
  13. Device Command Queue as output

    I would like the possibility for an ASA job to dynamically send output messages. For example, an ASA job can determine that some command should be sent to some device. If I have thousands of devices, I have no chance of doing this otherwise.

    A few suggestions:

    1. Provide the IoT Hub command queue as an output.
    2. Also provide the possibility to forward messages to subscriptions like commands/device1, where device1 is the ID of the device.

    In general, ASA has a big limitation when dealing with Service Bus queues:
    ASA queries deal only with the message payload. By using a payload property like DeviceId, we could in case 2 (see…

    3 votes  ·  1 comment
  14. Provide a better experience for updating queries without start/stop

    Currently, if I want to edit my query, I need to stop my job, wait for it to go from Stopping… to Stopped, edit the query, hit Start, and wait for Starting… to change to Running.

    Instead, I would like to go to my running job, edit the query, hit Save, and be prompted to choose whether to restart the query now (in which case "now" and the last stop time are the same) or from a time in the past, with the stop and start then happening without me having to wait.

    19 votes  ·  under review  ·  0 comments
  15. Tools for media monitoring

    It would be great to be able to perform media monitoring where the data is collected directly from the source, whether that is a device or a social media site.

    1 vote  ·  1 comment
  16. Why can't I get user details via context.user.accountId?

    1 vote  ·  0 comments
  18. Improve DocumentDb output JSON structured formatting

    It appears the DocumentDB output from Stream Analytics can't create JSON documents more than one level deep.

    Is there any way to describe the JSON output, so we can create a document that looks like this (i.e. where an emitted value is put inside the "payload" object)?

    {
      "resourceId": "2acabbd1-7f0a-4775-851b-2c0db47689ac",
      "payload": {
        "cost": 1
      }
    }
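
    For contrast, a minimal sketch of the flat query that feeds the output today (names are assumed for illustration); every selected column lands as a top-level field, with no way to nest cost under payload:

        SELECT
            resourceId,
            cost
        INTO docdboutput
        FROM input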

    5 votes  ·  0 comments
  19. Increase late arrival tolerance window

    Right now, the late arrival tolerance dropdown allows up to 20 days of tolerance.

    We have a requirement to "catch up" on old events. If for some reason our Event Hub publisher is down, when we start sending old events again, the timestamps may go back to the start of the month.

    It would be great to increase this tolerance level to a full month (31 days), if possible (or longer).

    Is this 20-day limit a restriction of the memory buffer used by Stream Analytics, or would this be a simple change in the GUI?

    26 votes  ·  4 comments
  20. Improve JSON integration

    I have this:
    row = { "a-x": [ { "b": [ { "c": "here is the stuff I care about" } ] } ] }

    So I should be able to do something like:

    ... CROSS APPLY GetArrayElements( row["a-x"][0]["b"][0]["c"] ) AS WhatIDoCare
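
    For reference, what appears to be the closest available form today (a sketch; it assumes row is a record-typed field on the input and uses the built-in GetRecordPropertyValue/GetArrayElement functions) is considerably noisier:

        SELECT
            GetRecordPropertyValue(
                GetArrayElement(
                    GetRecordPropertyValue(
                        GetArrayElement(GetRecordPropertyValue(row, 'a-x'), 0),
                        'b'),
                    0),
                'c') AS WhatIDoCare
        FROM input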

    4 votes  ·  1 comment
