Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback.

Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to easily combine streams of data with historic records or reference data to derive business insights easily and quickly.

  1. Add ALWAYS TRUE pattern to MATCH_RECOGNIZE

    Using MATCH_RECOGNIZE to detect a "V"-shaped data change is very convenient; however, it misses the first value, because the "delta" calculation starts only when the data changes.
    A good explanation of ALWAYS TRUE in MATCH_RECOGNIZE can be found at https://www.oracle.com/technetwork/database/bi-datawarehousing/mr-deep-dive-3769287.pdf
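
    The gap can be simulated in plain Python (not ASA SQL); the sketch below illustrates delta-based "V" detection over a simple list of numeric samples, showing that deltas only exist from the second sample onward, which is the gap an ALWAYS TRUE pattern variable would close.

```python
# Illustrative simulation of delta-based "V" detection (plain Python,
# not Stream Analytics SQL). Deltas start at the second sample, which is
# why the first value is skipped without an ALWAYS TRUE pattern variable.

def v_shape_spans(values):
    """Return (start, end) index pairs where the values fall and then rise."""
    deltas = [b - a for a, b in zip(values, values[1:])]  # no delta for row 0
    spans, start = [], None
    for i, d in enumerate(deltas):
        if d < 0 and start is None:
            start = i                      # falling edge begins
        elif d > 0 and start is not None:
            spans.append((start, i + 1))   # rising edge completes the "V"
            start = None
    return spans

print(v_shape_spans([5, 3, 1, 4, 6]))  # [(0, 3)]
```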

    1 vote  ·  0 comments  ·  Query Language
  2. Increase file size of ASA Parquet file output

    The current settings for Parquet output (min 10k rows, max 2h) generate a huge number of small Parquet files (~10 MB, written every 5 seconds) if the stream is 1k-50k rows/second. This huge number of small files (17k+ files per day) is very bad for further processing/analysis (e.g. with Spark or Synapse) and currently requires an additional file-consolidation step. Therefore it would be great to either increase the min. number-of-rows setting or introduce a new min. data-size setting (file size might be too complex due to compression, but the data could be…
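
    The file-count arithmetic above checks out with a quick back-of-the-envelope calculation (the 2,000 rows/second figure below is an assumed rate within the 1k-50k range mentioned):

```python
# Back-of-the-envelope check: a 10k-row minimum at an assumed 2,000
# rows/second flushes a file every 5 seconds, i.e. ~17,280 files per day.
rows_per_sec = 2_000           # assumed stream rate (within 1k-50k rows/s)
min_rows_per_file = 10_000     # current Parquet minimum-rows setting
seconds_per_file = min_rows_per_file / rows_per_sec
files_per_day = 24 * 60 * 60 / seconds_per_file
print(seconds_per_file, int(files_per_day))  # 5.0 17280
```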

    9 votes  ·  0 comments  ·  Inputs and Outputs
  3. 3 votes  ·  0 comments  ·  Inputs and Outputs
  4. Please add a feature to specify the file name when outputting to Azure Blob Storage

    At present, Stream Analytics does not support specifying a custom file name when outputting to Azure Blob Storage.
    I would like to specify the output file name with a timestamp, like "YYYYMMDDMMSS.JSON", when outputting to Azure Blob Storage.
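
    A sketch of what such a naming scheme could produce, assuming the intended pattern is year-month-day-hour-minute-second (the function is hypothetical; ASA does not currently expose such a hook, and blob names come from the path pattern):

```python
# Hypothetical file-naming sketch -- not an ASA API. It only shows what a
# timestamped blob name per the request above could look like.
from datetime import datetime, timezone

def output_blob_name(ts: datetime) -> str:
    # Assumes the intended pattern is year-month-day-hour-minute-second.
    return ts.strftime("%Y%m%d%H%M%S") + ".JSON"

print(output_blob_name(datetime(2024, 1, 31, 9, 30, 0, tzinfo=timezone.utc)))
# 20240131093000.JSON
```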

    11 votes  ·  0 comments  ·  Advanced Analytics
  6. Read application properties (device properties) from the IoT Hub event endpoint in Stream Analytics

    I want to store telemetry in an Azure SQL DB, so I take input from the IoT Hub built-in event endpoint in Stream Analytics (integrated with SQL DB) and want to process the application properties attached to the telemetry by IoT Hub message enrichment. I have researched UDFs, UDAs, and the built-in Stream Analytics functions, but I couldn't even see the application properties in Stream Analytics. So how can I do that? Is this possible?
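
    One likely reason the properties are invisible to body-level functions: in the underlying AMQP message, application properties travel alongside the body rather than inside the JSON payload, as this simulated envelope illustrates (field names here are illustrative, not the actual wire format):

```python
# Simulated event envelope (illustrative field names, not the real AMQP
# layout): the enrichment lives next to the body, not inside it, so logic
# that only sees the JSON body cannot find it.
amqp_message = {
    "body": {"temperature": 21.5},                # what the query sees
    "application_properties": {"site": "osaka"},  # message enrichment here
}
print("site" in amqp_message["body"])  # False
```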

    1 vote  ·  0 comments  ·  Inputs and Outputs
  7. Add Azure Service Bus Queues as Input

    Add Azure Service Bus Queues as Input for Azure Stream Analytics. It would allow us to consider Azure Service Bus more often as a viable solution instead of only Event Hubs.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  8. Support input from Synapse Analytics table

    We should be able to directly stream newly added rows from Synapse Analytics tables to other destinations. The current solution I use goes via an export to data lake files, but this is not a real-time thing…
    Basically, if you are able to write to Synapse Analytics, why not be able to read from it?

    16 votes  ·  0 comments  ·  Inputs and Outputs
  9. Allow ASA to acquire bearer token for authenticating itself to Azure Function protected by App Service Authentication

    A possible output sink for an ASA job is Azure Functions. However, if the Function is protected by App Service Authentication (or any type of OAuth authentication), the flow will not work since ASA doesn't have the functionality to fetch bearer tokens.
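
    What ASA would need to do on the job's behalf is a standard OAuth2 client-credentials exchange; below is a minimal sketch of building that token request (tenant, client, and scope values are placeholders):

```python
# Sketch of the OAuth2 client-credentials token request an ASA job would
# need to perform before calling a protected Function. All identifiers
# below are placeholders.
import urllib.parse

def token_request(tenant: str, client_id: str, client_secret: str, scope: str):
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,  # e.g. the Function app's .default scope
    })
    return url, body  # POST the body; the JSON response carries access_token

url, body = token_request("contoso-tenant", "app-id", "app-secret",
                          "api://example-function/.default")
```

    The returned access token would then go into the `Authorization: Bearer` header of the Function call.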

    6 votes  ·  0 comments  ·  Security
  10. Support Database Schema for SQL Database Sink

    First error: 'could not find table [schema.table]'.
    Updated the table name to schema].[table and got a new error.

    I would like one security user for Stream Analytics to be able to write to any schema in the target database, to remove the need for post-load ETL jobs or triggers that move the data to the required schema.

    Use case: Adventure Works are recording data from tills, including sign-in and sales transactions. The Stream Analytics job can identify the difference between a login and a sale; the login data needs to update the hr schema with an employee login, while sales need to be…

    1 vote  ·  0 comments  ·  Inputs and Outputs
  11. Release of IPs or internal endpoints for Stream Analytics via IoT Hub

    We have a scenario using Stream Analytics with IoT Hub and access to a SQL Server database. With IoT Hub and SQL Server we are able to lock down the SQL Server firewall using Azure internal endpoints. For Stream Analytics, however, we are unable to lock it down, because Stream Analytics does not provide any IP or DNS that we could use to restrict this communication via the firewall.

    We would like to know from Microsoft whether there is any way to restrict these accesses within the Azure framework itself, or whether there is an IP that Stream…

    1 vote  ·  1 comment  ·  Security
  12. There should be Boolean support between Azure Stream Analytics and Event Hubs

    Currently, the Stream Analytics job doesn't support Boolean variables as input (it converts booleans to 0/1).

    According to the documentation, Stream Analytics supports the Boolean data type starting with compatibility level 1.2. I'm not sure which version is used for the Event Hub integration with Stream Analytics.

    Document: https://docs.microsoft.com/en-us/stream-analytics-query/data-types-azure-stream-analytics

    1 vote  ·  0 comments
  13. Service Bus topic output with a session-enabled subscription: setting Session Id as a system property puts the job in a Degraded state

    For a Service Bus topic (as output) with a session-enabled subscription, setting Session Id as a system property puts the job state in Degraded. However, the messages still pass to those subscriptions, which is really confusing. Please let us know the reason for this.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  14. Azure Key Vault as reference data input

    I need to apply some values to my stream analytics query that are considered "secret" by our customer. So it would be great if I don't have to store them in plaintext but could just input them from Azure Key Vault into my query.

    4 votes  ·  0 comments  ·  Inputs and Outputs
  15. Please add a feature to set the path pattern time zone to JST

    Currently, when outputting to Azure Blob Storage/ADLS Gen2, the path pattern time zone is UTC.
    But I want to set this time zone to JST.
    Please add a feature to set the path pattern time zone to JST.
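
    Until such a setting exists, a downstream job has to translate JST wall-clock folders back to the UTC paths ASA actually writes; a minimal stdlib sketch of that mapping (the path layout below is illustrative):

```python
# Workaround sketch: map a JST wall-clock time to the UTC-based path ASA
# actually writes. JST is a fixed UTC+9 offset; the path layout is
# illustrative, not a real ASA path pattern.
from datetime import datetime, timedelta, timezone

JST = timezone(timedelta(hours=9), name="JST")

def utc_path_for_jst(jst_wallclock: datetime) -> str:
    utc = jst_wallclock.replace(tzinfo=JST).astimezone(timezone.utc)
    return utc.strftime("%Y/%m/%d/%H")

# JST midnight lands in the previous day's 15:00 UTC folder:
print(utc_path_for_jst(datetime(2024, 4, 1, 0, 0)))  # 2024/03/31/15
```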

    26 votes  ·  0 comments  ·  Inputs and Outputs
  16. Ability to define parameters for multiple uses

    We have implemented some complex queries with several constants in their criteria/filters. These constants are also used in multiple stream analytics jobs.

    Having a way to define parameters or constants that can be defined once and used multiple times would simplify maintenance.

    This feature was covered by using a UDF, but the issue is that functions in Stream Analytics require at least one argument or they throw a syntax error (the workaround was to declare "any" as an argument and pass any value).

    It would be convenient to be able to define and re-use parameters multiple times.

    1 vote  ·  0 comments
  17. Requesting Managed Service Identity to have RBAC at the directory level

    There are two types of users that need different permissions. They want RBAC control at the directory level, not the container level, to be able to provide more granularity in permissions for different users.

    Each of their directories has ACL-based control that is unique per user and directory. However, ASA only allows the MSI to specify RBAC at the container level, and container-level RBAC needs higher permissions that they do not want each user to have for every directory.

    I think for ADLS Gen2 it is technically possible to use ACLs only for assigning permissions to…

    1 vote  ·  0 comments  ·  Security
  18. 1 vote  ·  0 comments
  19. Better diagnostics

    I can see input, run queries, and test the output, but no data comes into the output. Diagnostics won't show any content, so all that's left is to give up trying to get this to work.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  20. Activity log - not showing actual error

    We had a conversion error, and the Activity log only shows "OutputDataConversionError.TypeConversionError", which doesn't show the real reason.

    I had to enable Diagnostics logs to find the real issue, which showed "Cannot convert from property 'multiplier' of type 'System.String' to column 'multiplier' of type 'System.Double'."

    If the real reason had been in the Activity log, it would have saved me a lot of time.
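
    The failure itself is easy to reproduce in miniature; the sketch below (plain Python, not the ASA runtime) shows the kind of detail the Activity log could surface directly:

```python
# Miniature reproduction of the reported failure: a string payload field
# cannot be converted to a double-typed column. Plain Python, not ASA.
def to_double(value):
    try:
        return float(value)
    except (TypeError, ValueError) as exc:
        # This message is the detail that only Diagnostics logs show today.
        raise ValueError(
            f"Cannot convert from property 'multiplier' of type "
            f"'{type(value).__name__}' to column 'multiplier' of type 'double'"
        ) from exc

print(to_double("1.5"))  # 1.5
```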

    7 votes  ·  0 comments  ·  Inputs and Outputs
