Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream processing service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historical records or reference data to derive business insights quickly.

  1. Add Azure Service Bus Queues as Input

    Add Azure Service Bus Queues as Input for Azure Stream Analytics. It would allow us to consider Azure Service Bus more often as a viable solution instead of only Event Hubs.

    3 votes  ·  0 comments  ·  Inputs and Outputs
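
    Until native support exists, one workaround is a small forwarder that drains the queue and republishes each message to an Event Hub the ASA job already reads. A minimal sketch using the @azure/service-bus and @azure/event-hubs SDKs; the connection strings, queue, and hub names are placeholders:

        import { ServiceBusClient } from "@azure/service-bus";
        import { EventHubProducerClient } from "@azure/event-hubs";

        // Placeholder connection details -- substitute your own.
        const sbClient = new ServiceBusClient("<service-bus-connection-string>");
        const producer = new EventHubProducerClient("<event-hubs-connection-string>", "<event-hub-name>");

        const receiver = sbClient.createReceiver("<queue-name>");
        receiver.subscribe({
          async processMessage(message) {
            // Republish the Service Bus payload so the ASA job can ingest it from Event Hubs.
            const batch = await producer.createBatch();
            batch.tryAdd({ body: message.body });
            await producer.sendBatch(batch);
          },
          async processError(err) {
            console.error("Forwarding failed:", err);
          },
        });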
  2. Support input from Synapse Analytics table

    We should be able to stream newly added rows from Synapse Analytics tables directly to other destinations. My current workaround is an export to data lake files, but that is not real-time...
    Basically, if you are able to write to Synapse Analytics, why not be able to read from it?

    16 votes  ·  0 comments  ·  Inputs and Outputs
  3. Allow ASA to acquire bearer token for authenticating itself to Azure Function protected by App Service Authentication

    A possible output sink for an ASA job is Azure Functions. However, if the Function is protected by App Service Authentication (or any type of OAuth authentication), the flow will not work since ASA doesn't have the functionality to fetch bearer tokens.

    3 votes  ·  0 comments  ·  Security
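
    For context, the missing piece is a standard Azure AD client-credentials token call. A hedged sketch of what an intermediary (for example, a lightweight proxy Function) can do today; the tenant, client ID, secret, scope, and Function URL are all placeholders:

        // Acquire a bearer token via the OAuth 2.0 client-credentials grant,
        // then call the protected Function with it.
        const tokenResponse = await fetch(
          "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
          {
            method: "POST",
            headers: { "Content-Type": "application/x-www-form-urlencoded" },
            body: new URLSearchParams({
              grant_type: "client_credentials",
              client_id: "<client-id>",
              client_secret: "<client-secret>",
              scope: "api://<function-app-id>/.default",
            }),
          }
        );
        const { access_token } = await tokenResponse.json();

        await fetch("https://<function-app>.azurewebsites.net/api/<function-name>", {
          method: "POST",
          headers: { Authorization: `Bearer ${access_token}`, "Content-Type": "application/json" },
          body: JSON.stringify({ event: "example" }), // illustrative payload
        });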
  4. Support Database Schema for SQL Database Sink

    First error: 'could not find table [schema.table]'. Updating the table name to schema].[table produced a new error.

    I would like one security user for Stream Analytics to be able to write to any schema in the target database, removing the need for post-load ETL jobs or triggers to move the data to the required schema.

    Use case: Adventure Works record data from tills, including sign-in and sales transactions. The Stream Analytics job can identify the difference between a login and a sale; the login data needs to update the hr schema with an employee login, while sales need to be…

    1 vote  ·  0 comments  ·  Inputs and Outputs
  5. Publish IPs or internal endpoints for Stream Analytics with IoT Hub

    We have a scenario using Stream Analytics with IoT Hub and access to a SQL Server database. The IoT Hub to SQL Server path can be locked down behind the SQL Server firewall using Azure internal endpoints. Stream Analytics, however, cannot be locked down in the same way, because it does not expose any IP or DNS that we could use to restrict this communication via the firewall.

    We would like to know from Microsoft whether there is any way to restrict these accesses within Azure itself, or whether there is an IP that Stream…

    1 vote  ·  0 comments  ·  Security
  6. There should be Boolean support between Azure Stream Analytics and Event Hubs

    Currently, the Stream Analytics job doesn't support Boolean variables as input (it converts booleans to 0/1).

    According to the documentation, Stream Analytics supports the boolean data type from compatibility level 1.2 onward. I'm not sure which level the Event Hub integration with Stream Analytics uses.

    Documentation: https://docs.microsoft.com/en-us/stream-analytics-query/data-types-azure-stream-analytics

    1 vote  ·  0 comments
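
    A quick way to reproduce this is to publish an event carrying a JSON boolean and inspect what the job sees. A minimal sketch with the @azure/event-hubs SDK; the connection string, hub name, and event fields are placeholders:

        import { EventHubProducerClient } from "@azure/event-hubs";

        const producer = new EventHubProducerClient("<event-hubs-connection-string>", "<event-hub-name>");

        // 'enabled' is a JSON boolean; at compatibility levels below 1.2 the
        // job reportedly surfaces it as 0/1 rather than true/false.
        const batch = await producer.createBatch();
        batch.tryAdd({ body: { deviceId: "sensor-1", enabled: true } });
        await producer.sendBatch(batch);
        await producer.close();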
  7. Service Bus topic output with a session-enabled subscription shows Degraded when Session Id is set as a system property

    For a Service Bus topic as output with a session-enabled subscription, setting Session Id as a system property causes the job state to show as Degraded. The messages are still delivered to those subscriptions, but this is really confusing. Please let us know the reason for this.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  8. Azure Key Vault as reference data input

    I need to apply some values to my Stream Analytics query that our customer considers "secret". So it would be great if I didn't have to store them in plaintext but could just input them from Azure Key Vault into my query.

    4 votes  ·  0 comments  ·  Inputs and Outputs
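
    Until Key Vault is available as a reference-data input, a common workaround is to resolve the secret at deployment time and inject it into the job definition. A hedged sketch using the @azure/keyvault-secrets and @azure/identity SDKs; the vault URL and secret name are placeholders:

        import { SecretClient } from "@azure/keyvault-secrets";
        import { DefaultAzureCredential } from "@azure/identity";

        // Fetch the secret once at deployment time so it never sits in source control.
        const client = new SecretClient(
          "https://<vault-name>.vault.azure.net",
          new DefaultAzureCredential()
        );
        const secret = await client.getSecret("<secret-name>");
        // secret.value can now be injected into the job (e.g., as reference data).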
  9. Please add a feature to set the path pattern time zone to JST

    Currently, when output goes to Azure Blob Storage/ADLS Gen2, the path pattern time zone is UTC.
    But I want to set this time zone to JST.
    Please add a feature that can set the path pattern time zone to JST.

    23 votes  ·  0 comments  ·  Inputs and Outputs
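
    Until the time zone is configurable, downstream readers have to map the UTC-keyed paths to JST themselves. A small sketch of that conversion; the yyyy/MM/dd/HH layout is illustrative:

        // Show which JST date/hour a UTC event timestamp falls in,
        // e.g. to re-key blob paths to JST downstream.
        function jstPathSegment(utcDate) {
          const parts = new Intl.DateTimeFormat("ja-JP", {
            timeZone: "Asia/Tokyo",
            year: "numeric", month: "2-digit", day: "2-digit",
            hour: "2-digit", hour12: false,
          }).formatToParts(utcDate);
          const get = (type) => parts.find((p) => p.type === type).value;
          return `${get("year")}/${get("month")}/${get("day")}/${get("hour")}`;
        }

        console.log(jstPathSegment(new Date("2021-03-31T16:30:00Z"))); // 2021/04/01/01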
  10. Ability to define parameters for multiple uses

    We have implemented some complex queries with several constants in their criteria/filters. These constants are also used in multiple stream analytics jobs.

    Having a way to define parameters or constants that can be defined once and used multiple times would simplify maintenance.

    This was covered by using a UDF, but functions in Stream Analytics require at least one argument or they throw a syntax error (the workaround was to declare a dummy argument and pass any value).

    It would be convenient to be able to define and re-use parameters multiple times; a sketch of the UDF workaround is shown below.

    1 vote  ·  0 comments
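
    The dummy-argument workaround looks roughly like this as a JavaScript UDF (the constant name and value are illustrative); the query then calls udf.getThreshold(1) wherever the constant is needed:

        // ASA JavaScript UDF: at least one argument is required, so 'ignored'
        // exists only to satisfy the parser -- callers can pass any value.
        function main(ignored) {
          return 42.5; // illustrative shared constant
        }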
  11. Requesting Managed Service Identity to have RBAC at the directory level

    In the case that two types of users need different permissions, they want RBAC control at the directory level, not the container level, to provide more granularity in permissions for different users.

    Each of their directories has ACL control that is unique per user and directory. However, ASA only allows the MSI to specify RBAC at the container level. Container-level RBAC requires higher permissions that they do not want each user to have for every directory.

    I think for ADLS Gen2 it is technically possible to use ACLs only for assigning permissions to…

    1 vote  ·  0 comments  ·  Security
  12. 1 vote  ·  0 comments
  13. Better diagnostics

    I can see the input, run queries, and test the output, but no data arrives at the output. Diagnostics won't show any content, so all that's left is to give up trying to get this to work.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  14. Activity log - not showing actual error

    We had a conversion error, and the Activity log only shows "OutputDataConversionError.TypeConversionError", which doesn't show the real reason.

    I had to enable Diagnostics logs to find the real issue, which showed "Cannot convert from property 'multiplier' of type 'System.String' to column 'multiplier' of type 'System.Double'."

    If the real reason were in the Activity log, it would have saved me a lot of time.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  15. Add comments, display file names, and change the title of the query

    1. In the job diagram, allow an option to add mark-up/comments.
    2. Display file names if reference data is chosen.
    3. Allow defining the query as "Process"; it is currently defined as output.

    1 vote  ·  0 comments
  16. More advanced input path patterns for date and time

    Currently, streaming inputs only support basic file path patterns for date and time (yyyy/dd/mm, hh, etc.). Unfortunately, many blob input prefixes do not fit that layout. For instance, native Azure services like to write prefixes as follows: y={Year}/m={month}/d={day}/h={hour}/m={minute}.

    It would be good to have finer-grained control of the input pattern for time, similar to the way input patterns can be handled in Data Factory.

    2 votes  ·  0 comments  ·  Inputs and Outputs
  17. Field names are different at runtime

    At runtime the field names are different from the field names at testing, because when you test the query all the field names are lowercased. This is very confusing and causes problems. The field names when testing the data should be the same as when running the ASA job.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  18. Stream Analytics for IoT Edge removes custom message properties / doesn't allow setting message properties

    Stream Analytics for IoT Edge seems to strip out custom IoT message properties set by other modules, and I can't work out any way in Stream Analytics Edge to set these properties again.

    I see that for cloud-hosted stream jobs you can configure these properties on the output, but I can't work out how to do this for Edge jobs: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs?#custom-metadata-properties-for-output

    I've opened an issue on the GitHub IoT Edge repo with more details: https://github.com/Azure/iotedge/issues/2846

    3 votes  ·  0 comments  ·  Inputs and Outputs
  19. Azure Stream Analytics to handle case sensitive columns (currently getting InputDeserializerError)

    Azure Stream Analytics currently does not support input containing two columns whose names differ only by case.

    // input data
             "shopperId": "1234567",
             "shopperid": "",

    // ASA returns an InputDeserializerError on the input

    {"Source":"eventhubinput","Type":"DataError","DataErrorType":"InputDeserializerError.InvalidData","BriefMessage":"Could not deserialize the input event(s) from resource 'Partition: [0], Offset: [12894685888], SequenceNumber: [104569]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format","Message":"Input Message Id: Partition: [0], Offset: [12894685888], SequenceNumber: [104569] Error: Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index","ExampleEvents

    Ask to the Azure Stream Analytics team:

    Please consider changing Azure Stream Analytics to…

    72 votes  ·  1 comment  ·  Inputs and Outputs
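
    One interim workaround is to normalize duplicate-case keys before events reach the job, for example in an upstream function or module. A hedged sketch; the merge policy (lowercase keys, keep the first non-empty value) is illustrative:

        // Collapse keys that differ only by case into a single lowercase key.
        function normalizeKeys(event) {
          const out = {};
          for (const [key, value] of Object.entries(event)) {
            const k = key.toLowerCase();
            if (!(k in out) || out[k] === "") out[k] = value;
          }
          return out;
        }

        console.log(normalizeKeys({ shopperId: "1234567", shopperid: "" }));
        // { shopperid: '1234567' }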
  20. Ability to write the output to Azure File Share

    Currently, ASA can only write to blob storage or databases, not to Azure File Share. Provide the ability to write message data into Azure File Share folders. This should include the following features:
    1. Ability to define the file name.
    2. Ability to define operations like create, delete, and update as part of file writing.

    3 votes  ·  0 comments
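
    As a stopgap, the job's output can be routed to a sink ASA does support (for example, an Azure Function) that then writes into the file share. A hedged sketch using the @azure/storage-file-share SDK; the connection string, share, folder, and payload are placeholders:

        import { ShareServiceClient } from "@azure/storage-file-share";

        const service = ShareServiceClient.fromConnectionString("<storage-connection-string>");
        const dirClient = service.getShareClient("<share-name>").getDirectoryClient("<folder>");
        const fileClient = dirClient.getFileClient("output.json");

        // Write one output payload as a file: allocate the size, then upload the bytes.
        const data = Buffer.from(JSON.stringify({ deviceId: "sensor-1", total: 10 })); // illustrative payload
        await fileClient.create(data.length);
        await fileClient.uploadRange(data, 0, data.length);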