Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback.

Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Stream Analytics cluster stop/start options

    Request stop/start options for Stream Analytics clusters.
    Currently, a Stream Analytics cluster cannot be stopped even while no jobs are running.
    Adding stop/start options would reduce cost and improve convenience.
    Please consider this improvement.

    7 votes
    0 comments  ·  Advanced Analytics
  2. Stream data from Blob storage - two seconds limitation leading to missing blobs in output

    Per the current documentation for blob job inputs (https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-inputs), blobs must be created at least two seconds apart. This is a drawback when using Data Lake Storage Gen2 as input. Please review whether this limitation can be addressed so that no blobs are missed.

    "In scenarios where many blobs are continuously added and Stream Analytics is processing the blobs as they are added, it's possible for some blobs to be skipped in rare cases due to the granularity of the BlobLastModifiedTime. You can mitigate this by uploading blobs at least two…
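Until the limitation is lifted, the mitigation the documentation describes can be applied on the producer side by spacing out uploads; a minimal sketch, where `upload` is a placeholder callable (not an SDK API) standing in for the actual blob upload:

```python
import time

def upload_spaced(blobs, upload, min_gap=2.0):
    """Upload blobs at least `min_gap` seconds apart so successive
    blobs get distinct BlobLastModifiedTime values and are not skipped."""
    last = None
    for blob in blobs:
        if last is not None:
            wait = min_gap - (time.monotonic() - last)
            if wait > 0:
                time.sleep(wait)
        upload(blob)
        last = time.monotonic()
```

This trades ingestion latency for correctness, which is exactly the cost the request asks to remove.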

    1 vote
    0 comments  ·  Inputs and Outputs
  3. Add a "Process data" option to IoT Hub

    Event Hubs offers a "Process data" option on the Event Hub page.

    This makes it possible to view the data in real time.

    Isn't it possible to do the same with IoT Hub?

    1 vote
    0 comments  ·  Inputs and Outputs
  4. VNET

    Can we have a Stream Analytics cluster with a smaller number of SUs? Our Stream Analytics jobs run on a total of 18 SUs, and we need VNET support to secure our environment. When we planned the move to a cluster, we found the minimum is 36 SUs, so our costs would double just for VNET support.
    It would be really helpful if the minimum number of SUs could be reduced.

    1 vote
    0 comments
  5. Request: ability to append a CSV header in Azure Stream Analytics

    When a CSV file without a header is input to ASA, the first record is automatically treated as the header. This is inconvenient for these kinds of files because the first value differs from file to file.
    I would like a feature to append and customize the CSV header.
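As a workaround today, a header line can be prepended to headerless CSV content before it reaches ASA; a minimal sketch (the column names in the usage are hypothetical, and the helper assumes column names need no CSV quoting):

```python
def prepend_header(csv_text, columns):
    """Prepend a header row to headerless CSV content so the CSV
    deserializer no longer consumes the first data record as a header."""
    return ",".join(columns) + "\n" + csv_text

# Hypothetical usage:
# prepend_header("1,98\n2,76\n", ["deviceid", "heartrate"])
# -> "deviceid,heartrate\n1,98\n2,76\n"
```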

    7 votes
    0 comments
  6. Stream Analytics job diagram needs a way to copy and paste the tooltip text

    When I hover over a node of the job diagram, I can see some details (error message, SQL query, etc.), but I'm unable to move the mouse cursor to select the text, because the tooltip follows the cursor and disappears when the cursor leaves the node I was hovering over. There should be a way to copy that text to the clipboard.

    1 vote
    0 comments
  7. 1 vote  ·  0 comments  ·  Query Language
  8. Check whether data loss occurred when a transient error happens

    We understand that data loss should not happen in Stream Analytics when transient errors occur, since all transient errors are retried regardless of the output error handling policy configuration.

    Even so, we would like a way to verify in Stream Analytics whether data loss actually happened.

    Enabling diagnostic logs may surface some data conversion errors, but still cannot confirm whether data loss occurred due to transient errors.

    7 votes
    0 comments
  9. 1 vote  ·  0 comments
  10. azure-streamanalytics-cicd test

    The test option should support comparisons of JSON payloads that contain numbers. Comparisons currently fail for floating-point formats, most likely because of rounding.
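The requested behavior could look like a tolerance-aware structural comparison of the parsed JSON; a sketch of the idea (this is not the azure-streamanalytics-cicd implementation):

```python
import math

def json_close(a, b, rel_tol=1e-9, abs_tol=1e-9):
    """Structural comparison of parsed JSON values that treats numbers
    as equal within a tolerance instead of requiring exact equality."""
    if isinstance(a, bool) or isinstance(b, bool):
        # bools are ints in Python; compare them strictly, not numerically
        return isinstance(a, bool) and isinstance(b, bool) and a == b
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)
    if isinstance(a, dict) and isinstance(b, dict):
        return a.keys() == b.keys() and all(
            json_close(a[k], b[k], rel_tol, abs_tol) for k in a)
    if isinstance(a, list) and isinstance(b, list):
        return len(a) == len(b) and all(
            json_close(x, y, rel_tol, abs_tol) for x, y in zip(a, b))
    return a == b
```

With this, an expected payload of `{"x": 0.3}` matches an actual `{"x": 0.30000000000000004}` instead of failing on the rounding difference.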

    3 votes
    1 comment
  11. Error policy per output

    It should be possible to define an error policy per output. Let's say we have five different outputs: if one of the five is failing, we never get output to the other routes. Today we have to create a new streaming job, which is costly.

    1 vote
    0 comments  ·  Inputs and Outputs
  12. Allow Microsoft.Web/sites type for private endpoints for the cluster

    Currently, in order to use an Azure Function as an output, you can only remove the network access restrictions on the Function App firewall.

    It would be great if a private endpoint type for Azure Functions were supported, so we can connect the jobs to the Azure Functions privately.

    3 votes
    0 comments  ·  Inputs and Outputs
  13. Support managed identities when using Event Hubs as input for a job in a cluster

    It seems that when a job is deployed into a cluster and you set an Event Hubs namespace as input, you can only use keys as the auth type. It would be great to add support for managed identities.

    3 votes
    0 comments  ·  Inputs and Outputs
  14. Reduce impact on memory of reference data

    The documentation currently states: "With the current implementation, each join operation with reference data keeps a copy of the reference data in memory, even if you join with the same reference data multiple times."

    This really should not be the case. We are aiming to process in excess of 100k events per second, where up to 20% of these events could be held in memory for at least half an hour. These events are also joined to a reference data blob that is ~3 MB in size and, later, after the long window, joined again to another reference…

    3 votes
    0 comments  ·  Inputs and Outputs
  15. Provide the ability to write a message to blob storage and send the blob path to a queue

    This would allow me to save large results that may not fit in a queue, but still use an Azure Function to do further processing on the results.

    1 vote
    0 comments  ·  Inputs and Outputs
  16. Allow synchronous stopping/starting of ASA job via az or powershell

    Waiting for an ASA job to stop or start in order to deploy in a CI/CD scenario is far messier than it needs to be. Having to spin-wait and repeatedly query Azure adds unnecessary clutter to DevOps pipelines.

    Please add a -Wait parameter to PowerShell, and --wait to az stream-analytics.
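Until such a flag exists, pipelines typically poll; a minimal sketch of the spin-wait being complained about, where `get_state` is a hypothetical callable wrapping whatever reports the job's state (e.g. a jobState query):

```python
import time

def wait_for_job_state(get_state, desired, timeout=600.0, poll=5.0):
    """Poll get_state() until it returns `desired` or `timeout` elapses.
    Returns True on success, False on timeout."""
    deadline = time.monotonic() + timeout
    while True:
        if get_state() == desired:
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(poll)
```

A built-in -Wait/--wait would replace this boilerplate with a single blocking call.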

    8 votes
    0 comments
  17. Improvements in Reference Input

    My inputs for Stream Analytics:

    Input 1: stream input from IoT Hub
    Input 2: reference data from Azure Blob Storage

    My query:

    SELECT
        din.deviceid AS streamdeviceid,
        din.heartrate AS streamheartrate,
        refin.deviceid AS refdeviceid,
        refin.patientid AS refpatientid,
        din.srcdatetime,
        din.EventProcessedUtcTime
    FROM [asa-in-iot-healthcare-rpm] din TIMESTAMP BY srcdatetime
    LEFT OUTER JOIN [asa-ref-in-patient-device-mapping] refin
        ON refin.deviceid = din.deviceid

    This gives the error:
    The join predicate is not time bounded. JOIN operation between data streams requires specifying max time distances between matching events. Please add DATEDIFF to the JOIN condition. Example: SELECT input1.a, input2.b FROM input1 JOIN input2 ON DATEDIFF(minute, input1, input2) BETWEEN 0 AND 10

    Modified…

    13 votes
    0 comments  ·  Inputs and Outputs
  18. Display scale from main screen

    Minor UI Enhancement

    It would be nice to see a Stream Analytics job's current scale without having to go into the sub-screen.

    1 vote
    0 comments
  19. Which input caused the issue

    ASA ran into a runtime error, and we want to know which input/field caused the problem so that we can fix the source and make sure these values aren't emitted.

    Input deserialization errors are already captured in the logs; only runtime errors (e.g. cast errors) are not.

    6 votes
    0 comments  ·  Inputs and Outputs
  20. Add KQL support

    KQL is such a powerful language for working with time-series data. In my opinion it is a very clean and transparent language for defining queries. It would be great to have KQL-like support for Stream Analytics on the edge and in the cloud.

    9 votes
    0 comments  ·  Query Language
