Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly.

  1. Which input caused the issue

    ASA ran into a runtime error, and we want to know which input/field caused the problem so that we can fix the source and make sure these values aren't emitted.

    The input deserialization errors are already captured in the logs; only runtime errors (e.g., cast errors) are not.
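
    As a stopgap, a defensive cast can keep such values from failing the job. A minimal sketch in Stream Analytics SQL, assuming a job input named input and an illustrative field multiplier:

        -- TRY_CAST returns NULL instead of raising a runtime error when a
        -- value cannot be converted; all names here are illustrative.
        SELECT
            deviceId,
            TRY_CAST(multiplier AS float) AS multiplier
        INTO output
        FROM input
        WHERE TRY_CAST(multiplier AS float) IS NOT NULL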

    4 votes · 0 comments · Inputs and Outputs
  2. 4 votes · 0 comments · Inputs and Outputs
  3. deviceConnect

    properties_s {"deviceId":"D-0000001","protocol":"Mqtt","authType":null,"maskedIpAddress":"40.113.245.***","statusCode":"404"}

    I am getting an error like this when processing data from IoT Hub in a Stream Analytics job.

    1 vote · 0 comments · Inputs and Outputs
  4. Increase file size of ASA Parquet file output

    The current settings for Parquet output (min 10k rows, max 2h) generate a huge number of small Parquet files (~10 MB, written every 5 seconds) if the stream is 1k–50k rows/second. This huge number of small files (17k+ files per day) is very bad for further processing/analysis (e.g. with Spark or Synapse) and currently requires an additional step of file consolidation. Therefore it would be great to either increase the min. number-of-rows setting or introduce a new min. data-size setting (file size might be too complex due to compression, but the data could be…
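
    The consolidation step mentioned above could be a simple rewrite pass, sketched here in Spark SQL (paths and the table name are hypothetical):

        -- Read the many small Parquet files ASA wrote and rewrite them
        -- as fewer, larger files. All paths are placeholders.
        CREATE TABLE consolidated_events
        USING PARQUET
        LOCATION 'abfss://data@myaccount.dfs.core.windows.net/consolidated/'
        AS SELECT * FROM parquet.`abfss://data@myaccount.dfs.core.windows.net/asa-output/`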

    6 votes · 0 comments · Inputs and Outputs
  5. 3 votes · 0 comments · Inputs and Outputs
  6. Read Application Properties (Device Properties) from the IoT Hub event endpoint in Stream Analytics

    I want to store telemetry in Azure SQL DB, so I take input from the IoT Hub built-in Events endpoint in Stream Analytics (integrated with SQL DB) and want to process the application properties attached to the telemetry by IoT Hub message enrichment. I have researched UDFs, UDAs, and the built-in Stream Analytics functions a lot, but couldn't even see the application properties in Stream Analytics. How can I do that? Is this possible?
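
    Stream Analytics does expose event metadata through the GetMetadataPropertyValue function; whether IoT Hub enrichment properties surface there may depend on the input configuration, so the following is only a sketch (input and property names are illustrative):

        -- Read a user/application property from the event metadata.
        -- 'iothubInput' and 'plantId' are illustrative names.
        SELECT
            deviceId,
            temperature,
            GetMetadataPropertyValue(iothubInput, '[User].[plantId]') AS plantId
        INTO sqlOutput
        FROM iothubInput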

    1 vote · 0 comments · Inputs and Outputs
  7. Add Azure Service Bus Queues as Input

    Add Azure Service Bus Queues as Input for Azure Stream Analytics. It would allow us to consider Azure Service Bus more often as a viable solution instead of only Event Hubs.

    3 votes · 0 comments · Inputs and Outputs
  8. Support input from Synapse Analytics table

    We should be able to stream newly added rows from Synapse Analytics tables directly to other destinations. The current solution I use is an export to data lake files, but that is not real time...
    Basically, if you are able to write to Synapse Analytics, why not be able to read from it?

    16 votes · 0 comments · Inputs and Outputs
  9. Support Database Schema for SQL Database Sink

    First error: 'could not find table [schema.table]'. Updated the table to schema].[table and got a new error.

    I would like one security user for Stream Analytics to be able to write to any schema in the target database, removing the need for post-load ETL jobs or triggers to move the data to the required schema.

    Use case: Adventure Works are recording data from tills, including sign-in and sales transactions. The Stream Analytics job can identify the difference between a login and a sale; the login data needs to update the hr schema with an employee login, while sales need to be…
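
    A sketch of that use case in Stream Analytics SQL, assuming two SQL outputs each bound to a table in a different schema (all names are illustrative):

        -- Route till events to two SQL outputs; each output alias would
        -- be configured against a table in a different schema.
        SELECT tillId, employeeId, eventTime
        INTO hrLoginsOutput      -- e.g. mapped to [hr].[logins]
        FROM tillEvents
        WHERE eventType = 'login'

        SELECT tillId, amount, eventTime
        INTO salesOutput         -- e.g. mapped to [sales].[transactions]
        FROM tillEvents
        WHERE eventType = 'sale'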

    1 vote · 0 comments · Inputs and Outputs
  10. Service Bus topic output with a session-enabled subscription: setting Session Id as a system property sets the state to Degraded

    For a Service Bus topic (as output) with a session-enabled subscription, setting Session Id as a system property causes the job state to show as Degraded. However, the messages are still passed to those subscriptions, which is really confusing. Please let us know the reason for this.

    3 votes · 0 comments · Inputs and Outputs
  11. Azure Key Vault as reference data input

    I need to apply some values to my stream analytics query that are considered "secret" by our customer. So it would be great if I don't have to store them in plaintext but could just input them from Azure Key Vault into my query.

    4 votes · 0 comments · Inputs and Outputs
  12. Please add a feature to set the path pattern time zone to JST.

    Currently, when outputting to Azure Blob Storage/ADLS Gen2, the path pattern time zone is UTC, but I want to set this time zone to JST. Please add a feature that can set the path pattern time zone to JST.

    23 votes · 0 comments · Inputs and Outputs
  13. Better diagnostics

    I can see the input, run queries, and test the output, but no data comes into the output. Diagnostics won't show any content, so all that's left is to give up trying to get this to work.

    1 vote · 0 comments · Inputs and Outputs
  14. Activity log - not showing actual error

    We had a conversion error, and the Activity log only shows "OutputDataConversionError.TypeConversionError", which doesn't show the real reason.

    I had to enable diagnostic logs to find the real issue, which showed "Cannot convert from property 'multiplier' of type 'System.String' to column 'multiplier' of type 'System.Double'."

    If the real reason had been in the Activity log, it would have saved me a lot of time.

    4 votes · 0 comments · Inputs and Outputs
  15. More advanced input path patterns for date and time.

    Currently, streaming inputs only support basic file path patterns for date and time (yyyy/dd/mm, hh, etc.). Unfortunately, many blob input prefixes do not fit that layout. For instance, native Azure services like to write prefixes as follows: y={Year}/m={month}/d={day}/h={hour}/m={minute}.

    It would be good to have finer-grained control of the input pattern for time, similar to the way input patterns can be handled in Data Factory; see the sketch below.
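
    For illustration, the requested input pattern could borrow the {datetime:...} tokens that ASA blob output custom path patterns already support (their use on inputs is hypothetical):

        Supported today (input):  {date}/{time}
        Requested (hypothetical): y={datetime:yyyy}/m={datetime:MM}/d={datetime:dd}/h={datetime:HH}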

    2 votes · 0 comments · Inputs and Outputs
  16. Field names are different at runtime

    At runtime the field names are different from the field names at testing, because when you test the query all the field names are lowercased. This is very confusing and causes problems. The field names when testing should be the same as when running the ASA job.

    1 vote · 0 comments · Inputs and Outputs
  17. Stream Analytics for IoT Edge removes custom message properties / doesn't allow setting message properties

    Stream Analytics for IoT Edge seems to strip out custom IoT message properties set by other modules, and I can't work out any way in Stream Analytics Edge to set these properties again.

    I see that for cloud-hosted stream jobs you can configure these properties on the output, but I can't work out how to do this for Edge jobs: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs?#custom-metadata-properties-for-output

    I've opened an issue on the GitHub IoT Edge repo with more details: https://github.com/Azure/iotedge/issues/2846

    3 votes · 0 comments · Inputs and Outputs
  18. Azure Stream Analytics to handle case-sensitive columns (currently getting InputDeserializerError)

    Azure Stream Analytics currently does not support input with two columns whose names differ only by case.

    // input data
             "shopperId": "1234567",
             "shopperid": "",

    // ASA returns an InputDeserializerError for the input

    {"Source":"eventhubinput","Type":"DataError","DataErrorType":"InputDeserializerError.InvalidData","BriefMessage":"Could not deserialize the input event(s) from resource 'Partition: [0], Offset: [12894685888], SequenceNumber: [104569]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format","Message":"Input Message Id: Partition: [0], Offset: [12894685888], SequenceNumber: [104569] Error: Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index","ExampleEvents…

    Ask to the Azure Stream Analytics team:

    Please consider changing Azure Stream Analytics to…

    80 votes · 1 comment · Inputs and Outputs
  19. Allow Azure Data Explorer (ADX) as an output

    We use Stream Analytics for alerting in combination with Azure Functions, and Azure Data Explorer is our central data store. Life would be easier if we could output our alerts directly to Azure Data Explorer (ADX). This would also save money, as for now we are putting an additional Event Hub in place to mitigate the issue.

    4 votes · 0 comments · Inputs and Outputs
  20. Sample data time-range selection

    The date displayed in the "Select time range" dialog is in US date format. My browser and OS are set to NZ format, the international norm of dd-mm-yyyy. Please change the date display to match.

    Also, the time input is rather finicky; I suggest an improvement on this one too. It would be good to be able to specify the time in UTC as well as local time.

    1 vote · 0 comments · Inputs and Outputs