Update: Microsoft is moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year and will use first-party solutions for customer feedback.

Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historical records or reference data to derive business insights quickly.

  1. Stream Analytics cluster stop/start options

    Request stop/start options for Stream Analytics clusters.
    Currently, a Stream Analytics cluster cannot be stopped even while no jobs are running.
    To reduce cost and improve convenience, cluster stop/start options should be added.
    Please consider this improvement.

    7 votes  ·  0 comments  ·  Advanced Analytics
  2. Request: a feature to append a CSV header on Azure Stream Analytics

    When a CSV file without a header is input to ASA, the first record is automatically treated as the header. This is inconvenient when such files must be used, because the first value is not the same in every file.
    I would like a feature to append and customize the CSV header.
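
    Until such a feature exists, one workaround is to prepend the desired header before the file reaches ASA. A minimal Python sketch, assuming hypothetical column names and file paths:

    # Workaround sketch: prepend a custom header line to a headerless CSV
    # before Stream Analytics ingests it. Column names are hypothetical.
    import csv

    HEADER = ["deviceid", "temperature", "timestamp"]  # assumed schema

    def add_header(src_path, dst_path):
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            writer = csv.writer(dst)
            writer.writerow(HEADER)        # the header row ASA will consume
            for row in csv.reader(src):    # copy the data rows unchanged
                writer.writerow(row)

    add_header("input_no_header.csv", "input_with_header.csv")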

    7 votes  ·  0 comments
  3. Check whether data loss occurred when a transient error happens

    We understand that when transient errors happen, data loss should not occur in Stream Analytics, since all transient errors are retried regardless of the output error handling policy configuration.

    Even so, we would like a way to verify in Stream Analytics whether data loss occurred.

    Enabling diagnostic logs might surface some data conversion errors, but it still cannot confirm whether data loss occurred due to transient errors.

    7 votes  ·  0 comments
  4. Stream data from Blob storage - two seconds limitation leading to missing blobs in output

    As per the current documentation for a job input configured as Blob storage (https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-inputs), blobs must be created at least two seconds apart. This is a drawback when using Data Lake Storage Gen2 as input. Please review and advise whether this limitation can be addressed so that no blobs are missed.

    "In scenarios where many blobs are continuously added and Stream Analytics is processing the blobs as they are added, it's possible for some blobs to be skipped in rare cases due to the granularity of the BlobLastModifiedTime. You can mitigate this by uploading blobs at least two…

    1 vote  ·  0 comments  ·  Inputs and Outputs
  5. IoT Hub should offer "process data" like Event Hubs

    Event Hubs lets you select "process data" on the Event Hub page.

    This makes it possible to view the data in real time.

    Shouldn't the same be possible with IoT Hub?

    1 vote  ·  0 comments  ·  Inputs and Outputs
  6. VNET

    Can we have Stream Analytics clusters with a smaller number of SUs? Our Stream Analytics jobs run on a total of 18 SUs, and we need VNet support to secure our environment. When we planned the move to a cluster, the minimum was 36 SUs, meaning our costs would double just for VNet support.
    It would be really helpful if the minimum number of SUs could be reduced.

    1 vote  ·  0 comments
  7. Stream Analytics job diagram needs a way to copy and paste the tooltip text

    When I hover over a node of the job diagram, I can see some details (error message, SQL query, etc.), but I'm unable to move the mouse cursor to select the text: the tooltip follows the cursor and disappears when the cursor leaves the node I was hovering over. There should be a way to copy that text to the clipboard.

    1 vote  ·  0 comments
  8. 1 vote  ·  0 comments  ·  Query Language
  9. Azure Stream Analytics should handle case-sensitive columns (currently getting InputDeserializerError)

    Azure Stream Analytics currently does not support input with two columns whose names differ only by case.

    // input data
    "shopperId": "1234567",
    "shopperid": "",

    // ASA returns an InputDeserializerError on the input

    {"Source":"eventhubinput","Type":"DataError","DataErrorType":"InputDeserializerError.InvalidData","BriefMessage":"Could not deserialize the input event(s) from resource 'Partition: [0], Offset: [12894685888], SequenceNumber: [104569]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format","Message":"Input Message Id: Partition: [0], Offset: [12894685888], SequenceNumber: [104569] Error: Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index","ExampleEvents

    Ask to the Azure Stream Analytics team:

    Please consider changing Azure Stream Analytics to…

    83 votes  ·  1 comment  ·  Inputs and Outputs
  10. Improvements in Reference Input

    Following are my inputs for Stream Analytics:

    Input 1: Stream input from IoT Hub
    Input 2: Reference data from Azure Blob Storage

    My query:
    SELECT
        din.deviceid AS streamdeviceid,
        din.heartrate AS streamheartrate,
        refin.deviceid AS refdeviceid,
        refin.patientid AS refpatientid,
        din.srcdatetime,
        din.EventProcessedUtcTime
    FROM [asa-in-iot-healthcare-rpm] din TIMESTAMP BY srcdatetime
    LEFT OUTER JOIN [asa-ref-in-patient-device-mapping] refin
        ON refin.deviceid = din.deviceid

    This gives the error:
    The join predicate is not time bounded. JOIN operation between data streams requires specifying max time distances between matching events. Please add DATEDIFF to the JOIN condition. Example: SELECT input1.a, input2.b FROM input1 JOIN input2 ON DATEDIFF(minute, input1, input2) BETWEEN 0 AND 10

    Modified…

    13 votes  ·  0 comments  ·  Inputs and Outputs
  11. azure-streamanalytics-cicd test

    The test option should support comparisons of JSON payloads that contain numbers. Comparisons currently fail for floating-point values, likely because of rounding.
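
    What is being asked for, roughly: number comparisons within a tolerance rather than exact equality. A minimal Python sketch, with an arbitrary tolerance:

    # Sketch of tolerance-aware JSON comparison: numbers are compared with
    # math.isclose instead of exact equality, recursing through objects/arrays.
    import json, math

    def json_equal(a, b, rel_tol=1e-9):
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return math.isclose(a, b, rel_tol=rel_tol)
        if isinstance(a, dict) and isinstance(b, dict):
            return a.keys() == b.keys() and all(json_equal(a[k], b[k], rel_tol) for k in a)
        if isinstance(a, list) and isinstance(b, list):
            return len(a) == len(b) and all(json_equal(x, y, rel_tol) for x, y in zip(a, b))
        return a == b

    print(json_equal(json.loads('{"v": 0.30000000000000004}'),
                     json.loads('{"v": 0.3}')))  # -> True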

    3 votes  ·  1 comment
  12. Configurable Late Arrival Tolerance for IoT Edge

    This is a critical feature. ASA on IoT Edge is not really usable in production without being able to change the default late arrival tolerance of 5 seconds to a larger value.

    98 votes  ·  0 comments
  13. Please add a feature to specify the file name when writing output to Azure Blob Storage

    Currently, Stream Analytics does not support specifying a custom file name when writing output to Azure Blob Storage.
    I would like to specify an output file name with a timestamp, like "YYYYMMDDMMSS.JSON".
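
    For illustration, the requested naming scheme could be generated as in this Python sketch (assuming the pattern was meant to include hours, minutes, and seconds):

    # Sketch: a timestamped blob name of the kind the request describes,
    # e.g. 20210301154500.json.
    from datetime import datetime, timezone

    blob_name = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S") + ".json"
    print(blob_name)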

    18 votes  ·  0 comments  ·  Advanced Analytics
  14. Allow synchronous stopping/starting of ASA job via az or powershell

    Waiting for an ASA job to stop or start in order to deploy in a CI/CD scenario is far messier than it needs to be. Having to spin-wait and repeatedly query Azure adds unnecessary clutter to DevOps pipelines.

    Please add a -Wait parameter to PowerShell, and --wait to az stream-analytics.
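
    For reference, the spin-wait pattern such a flag would replace looks roughly like this Python sketch; get_state is a hypothetical stand-in for whatever SDK or CLI call retrieves the job's state:

    # Sketch of the polling loop a -Wait/--wait option would make unnecessary.
    import time
    from typing import Callable

    def wait_for_state(get_state: Callable[[], str], target: str,
                       timeout_s: float = 600, poll_s: float = 10) -> None:
        """Spin-wait until the job reports `target` (e.g. "Running" or "Stopped")."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if get_state() == target:
                return
            time.sleep(poll_s)  # the repeated querying the request objects to
        raise TimeoutError(f"job did not reach state {target!r} within {timeout_s}s")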

    8 votes  ·  0 comments
  15. Allow Microsoft.Web/sites type for private endpoints for the cluster

    Currently, in order to use an Azure function as output, your only option is to remove the network access restrictions on the function app firewall.

    It would be great if a private endpoint type for Azure Functions were supported, so the jobs can connect to the functions privately.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  16. Increase file size of ASA Parquet file output

    The current settings for Parquet output (min 10k rows, max 2h) generates a huge number of small parquet files (~10MB, written every 5 seconds) if the stream is 1k-50k rows / second. This huge number of small files (17k+ files per day) is very bad for further processing / analysis (e.g. with Spark or Synapse) and currently requires an additional step of file consolidation. Therefore it would be great to either increase the min. number of rows setting or introduce a new min. data size setting (file size might be too complex due to compression, but the data could be…
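
    The consolidation step mentioned above might look like this Python sketch using pyarrow; the directory layout and paths are hypothetical:

    # Sketch of the extra consolidation step the author describes: read the many
    # small Parquet files ASA produced and rewrite them as one larger file.
    # (A production job would batch by size rather than load a whole day at once.)
    from pathlib import Path
    import pyarrow as pa
    import pyarrow.parquet as pq

    small_files = sorted(Path("asa-output/2021/03/01").glob("*.parquet"))
    tables = [pq.read_table(f) for f in small_files]
    pq.write_table(pa.concat_tables(tables), "consolidated-2021-03-01.parquet")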

    11 votes  ·  0 comments  ·  Inputs and Outputs
  17. Error policy per output

    It should be possible to define an error policy per output. Say we have five different outputs: if one of the five is failing, we never get output to the other routes. Today we have to create a new streaming job, which is costly.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  18. Support input from Synapse Analytics table

    We should be able to stream newly added rows directly from Synapse Analytics tables to other destinations. The current solution I use is an export to data lake files, but that is not real time.
    Basically, if you can write to Synapse Analytics, why not be able to read from it?

    16 votes  ·  0 comments  ·  Inputs and Outputs
  19. Support managed identities when using Event Hubs as input and the job is in a cluster

    It seems that when a job is deployed into a cluster and you set an Event Hubs namespace as input, you can only use keys as the auth type. It would be great to add support for managed identities.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  20. Reduce impact on memory of reference data

    The documentation currently states: "With the current implementation, each join operation with reference data keeps a copy of the reference data in memory, even if you join with the same reference data multiple times."

    This really should not be the case. We are aiming to process in excess of 100k events per second where up to 20% of these events could be held in memory for at least half an hour. These events are also joined to a reference data blob that is ~3MB in size and, at a later point post the long window, again joined to another reference…
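
    To put those figures together, a quick back-of-the-envelope calculation in Python (the numbers come from the description above):

    # Rough working set implied by the description: 20% of 100k events/sec
    # held for half an hour, plus a copy of the ~3 MB reference blob per join.
    events_per_sec = 100_000
    held_fraction = 0.20
    window_s = 30 * 60
    print(f"{events_per_sec * held_fraction * window_s:,.0f} events held")  # 36,000,000

    ref_blob_mb = 3
    joins = 2  # joined once, then again after the long window
    print(f"{ref_blob_mb * joins} MB of duplicated reference data")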

    3 votes  ·  0 comments  ·  Inputs and Outputs