Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historical records or reference data to derive business insights quickly.

  1. Improvements in Reference Input

    Following are my inputs for Stream Analytics:

    Input 1: Stream input from IoT Hub
    Input 2: Reference data from Azure Blob Storage

    My query:

    SELECT
        din.deviceid AS streamdeviceid,
        din.heartrate AS streamheartrate,
        refin.deviceid AS refdeviceid,
        refin.patientid AS refpatientid,
        din.srcdatetime,
        din.EventProcessedUtcTime
    FROM [asa-in-iot-healthcare-rpm] din TIMESTAMP BY srcdatetime
    LEFT OUTER JOIN [asa-ref-in-patient-device-mapping] refin
        ON refin.deviceid = din.deviceid

    Error returned:
    The join predicate is not time bounded. JOIN operation between data streams requires specifying max time distances between matching events. Please add DATEDIFF to the JOIN condition. Example: SELECT input1.a, input2.b FROM input1 JOIN input2 ON DATEDIFF(minute, input1, input2) BETWEEN 0 AND 10

    Modified…

    (A sketch of a time-bounded join along these lines is shown after this item.)

    1 vote · 0 comments · Inputs and Outputs
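
    A minimal sketch of the time-bounded join the error message asks for, reusing the aliases from the query above; the one-minute bound is an arbitrary assumption. Note that a join against a true reference-data input does not need a DATEDIFF bound, so this error generally indicates the second input is configured as a stream rather than as reference data.

    SELECT
        din.deviceid AS streamdeviceid,
        refin.patientid AS refpatientid
    FROM [asa-in-iot-healthcare-rpm] din TIMESTAMP BY srcdatetime
    LEFT OUTER JOIN [asa-ref-in-patient-device-mapping] refin
        ON refin.deviceid = din.deviceid
        AND DATEDIFF(minute, din, refin) BETWEEN 0 AND 1
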
  2. Display scale from main screen

    Minor UI enhancement.

    It would be nice to see a Stream Analytics job's current scale without having to go into the sub-screen.

    1 vote · 0 comments
  3. Which input caused the issue

    ASA ran into a runtime error and we want to know which input/field caused the problem, so that we can fix the source and make sure these values aren't emitted.

    Input deserialization errors are already captured in the logs; only runtime errors (e.g. cast errors) are not captured in the logs.

    6 votes · 0 comments · Inputs and Outputs
  4. Add KQL support

    KQL is such a powerful language for working with time-series data. In my opinion it is a very clean and transparent language for defining queries. It would be great to have KQL-like support for Stream Analytics on the edge and in the cloud.

    9 votes · 0 comments · Query Language
  5. 4 votes · 0 comments · Inputs and Outputs
  6. deviceConnect

    I am getting an error like the following when processing data from IoT Hub in a Stream Analytics job:

    properties_s {"deviceId":"D-0000001","protocol":"Mqtt","authType":null,"maskedIpAddress":"40.113.245.***","statusCode":"404"}

    1 vote · 0 comments · Inputs and Outputs
  7. Add ALWAYS TRUE pattern to MATCH_RECOGNIZE

    Using MATCH_RECOGNIZE to detect a "V"-shaped data change is very convenient; however, it misses the first value because the "delta" calculation starts only when the data changes.
    A good explanation of ALWAYS TRUE in MATCH_RECOGNIZE can be found at https://www.oracle.com/technetwork/database/bi-datawarehousing/mr-deep-dive-3769287.pdf
    (An illustrative sketch of the requested behaviour follows this item.)

    1 vote · 0 comments · Query Language
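
    For illustration only, a rough sketch of what the requested behaviour might look like, borrowing the general ASA MATCH_RECOGNIZE shape plus the Oracle/SQL-standard convention that a pattern variable with no DEFINE entry is always true (A below). This is not valid ASA syntax today, and every name (telemetry, ts, deviceid, reading) is invented:

    SELECT *
    INTO output
    FROM telemetry TIMESTAMP BY ts
    MATCH_RECOGNIZE (
        LIMIT DURATION (minute, 15)
        PARTITION BY deviceid
        MEASURES
            First(A.reading) AS vFirst,   -- the first value of the "V", anchored by the always-true variable A
            Last(Up.reading) AS vLast
        AFTER MATCH SKIP TO NEXT EVENT
        PATTERN (A Down+ Up+)             -- A has no DEFINE entry, so it would match any row (ALWAYS TRUE)
        DEFINE
            Down AS Down.reading < 100,   -- placeholder thresholds; a delta-based test would replace these
            Up AS Up.reading > 100
    ) AS T
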
  8. Increase file size of ASA Parquet file output

    The current settings for Parquet output (min 10k rows, max 2h) generates a huge number of small parquet files (~10MB, written every 5 seconds) if the stream is 1k-50k rows / second. This huge number of small files (17k+ files per day) is very bad for further processing / analysis (e.g. with Spark or Synapse) and currently requires an additional step of file consolidation. Therefore it would be great to either increase the min. number of rows setting or introduce a new min. data size setting (file size might be too complex due to compression, but the data could be…

    6 votes · 0 comments · Inputs and Outputs
  9. 3 votes · 0 comments · Inputs and Outputs
  10. Please add a feature to specify the file name when output goes to Azure Blob Storage

    Currently, Stream Analytics does not support specifying a custom file name when writing output to Azure Blob Storage.
    I would like to specify the output file name with a timestamp, such as "YYYYMMDDMMSS.JSON". (A partial-workaround sketch follows this item.)

    11 votes · 0 comments · Advanced Analytics
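
    As a partial-workaround sketch only (it does not rename the files themselves): the blob output's Path pattern setting accepts {date} and {time} tokens, so output files can at least be grouped into timestamped folders. Assuming a prefix of logs/, the output configuration might look like:

    Path pattern:  logs/{date}/{time}
    Date format:   YYYY/MM/DD
    Time format:   HH
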
  11. Please add a feature to specify the file name when output goes to Azure Blob Storage

    Currently, Stream Analytics does not support specifying a custom file name when writing output to Azure Blob Storage.
    I would like to specify the output file name with a timestamp, such as "YYYYMMDDMMSS.JSON".

    0 votes · 0 comments · Advanced Analytics
  12. Read application properties (device properties) from the IoT Hub Events endpoint in Stream Analytics

    I want to store telemetry in an Azure SQL DB, so I take input from the IoT Hub built-in Events endpoint in Stream Analytics (integrated with SQL DB) and want to process the application properties attached to the telemetry by IoT Hub message enrichment. I have researched UDFs, UDAs, and the built-in Stream Analytics functions a lot, but couldn't even see the application properties in Stream Analytics. How can I do that? Is this possible? (A query sketch follows this item.)

    1 vote · 0 comments · Inputs and Outputs
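
    For reference, a minimal sketch of the query shape usually suggested for reading user/application properties, based on the documented GetMetadataPropertyValue function. The input alias (iothubInput), output alias (sqlOutput), and the property and column names are assumptions, and whether properties added by IoT Hub message enrichment surface under [User] in a given setup would need to be verified:

    SELECT
        deviceid,
        heartrate,
        GetMetadataPropertyValue(iothubInput, '[User].[patientid]') AS patientid
    INTO sqlOutput
    FROM iothubInput
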
  13. Add Azure Service Bus Queues as Input

    Add Azure Service Bus Queues as Input for Azure Stream Analytics. It would allow us to consider Azure Service Bus more often as a viable solution instead of only Event Hubs.

    3 votes · 0 comments · Inputs and Outputs
  14. Support input from Synapse Analytics table

    We should be able to directly stream newly added rows from Synapse Analytics tables to other destinations. The current solution I use is via an export to data lake files, but that is not real time...
    Basically, if you are able to write to Synapse Analytics, why not be able to read from it?

    16 votes · 0 comments · Inputs and Outputs
  15. Allow ASA to acquire bearer token for authenticating itself to Azure Function protected by App Service Authentication

    A possible output sink for an ASA job is Azure Functions. However, if the Function is protected by App Service Authentication (or any type of OAuth authentication), the flow will not work since ASA doesn't have the functionality to fetch bearer tokens.

    3 votes · 0 comments · Security
  16. Support Database Schema for SQL Database Sink

    First error: 'could not find table [schema.table]'.
    I updated the table name to schema].[table and got a new error.

    I would like one security user for Stream Analytics to be able to write to any schema in the target database, removing the need for post-load ETL jobs or triggers to move the data to the required schema.

    Use case: Adventure Works is recording data from tills, including sign-in and sales transactions. The Stream Analytics job can identify the difference between a login and a sale; the login data needs to update the hr schema with an employee login, while sales need to be…

    1 vote · 0 comments · Inputs and Outputs
  17. Provide IPs or internal endpoints so Stream Analytics (with IoT Hub) can be allowed through a firewall

    We have a scenario where Stream Analytics and IoT Hub access a SQL Server database. For IoT Hub, we are able to lock down the SQL Server firewall using Azure internal endpoints. For Stream Analytics, however, we cannot lock it down, because the service does not expose any IP or DNS name that we could use to restrict this communication in the firewall.

    We would like to know from Microsoft whether there is any way to restrict these accesses within Azure itself, or if there is an IP that Stream…

    1 vote · 0 comments · Security
  18. There should be Boolean support between Azure Stream Analytics and the Event Hub

    Currently, the Stream Analytics job doesn't support Boolean variables as input (it converts Boolean to 0/1).

    According to the documentation, Stream Analytics supports the Boolean data type from compatibility level 1.2 onwards. I'm not sure which version is used when Event Hub data is displayed via Stream Analytics.

    Documentation: https://docs.microsoft.com/en-us/stream-analytics-query/data-types-azure-stream-analytics

    1 vote · 0 comments
  19. Setting Session Id as a system property for a Service Bus topic output with a session-enabled subscription puts the job in a Degraded state

    For a Service Bus topic used as output with a session-enabled subscription, when Session Id is set as a system property the job state is set to Degraded. The messages are still delivered to those subscriptions, but this is really confusing. Please let us know the reason for this.

    3 votes · 0 comments · Inputs and Outputs
  20. Azure Key Vault as reference data input

    I need to apply some values to my Stream Analytics query that are considered "secret" by our customer. It would be great if I didn't have to store them in plaintext but could instead feed them from Azure Key Vault into my query.

    4 votes · 0 comments · Inputs and Outputs