Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historic records or reference data to derive business insights quickly.

  1. Better diagnostics

    I can see the input, run queries, and test the output, but no data arrives at the output. Diagnostics don't show any content, so all that's left is to give up trying to get this to work.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  2. Activity log - not showing actual error

    We had a conversion error, and the Activity log only shows "OutputDataConversionError.TypeConversionError", which doesn't reveal the real reason.

    I had to enable Diagnostics logs to find the real issue, which showed "Cannot convert from property 'multiplier' of type 'System.String' to column 'multiplier' of type 'System.Double'."

    If the real reason had been in the Activity log, it would have saved me a lot of time.
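    One way to avoid this class of conversion error is to cast the field explicitly in the query before it reaches the output. A minimal sketch, assuming a hypothetical input named 'telemetry' and output named 'sqloutput':

        -- TRY_CAST returns NULL instead of failing the conversion when a
        -- value such as the string 'multiplier' cannot be parsed as float.
        SELECT
            deviceId,
            TRY_CAST(multiplier AS float) AS multiplier
        INTO sqloutput
        FROM telemetry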

    1 vote  ·  0 comments  ·  Inputs and Outputs
  3. More advanced input path patterns for date and time

    Currently, streaming inputs only support basic file path patterns for date and time (yyyy/dd/mm, hh, etc.). Unfortunately, many blob input prefixes do not fit that layout. For instance, native Azure services like to write prefixes as follows: y={Year}/m={month}/d={day}/h={hour}/m={minute}.

    It would be good to have finer-grained control over the input pattern for time, similar to the way input patterns can be handled in Data Factory.

    2 votes  ·  0 comments  ·  Inputs and Outputs
  4. Field names are different at runtime

    At runtime the field names are different from the field names during testing, because when you test the query all field names are lowercased. This is very confusing and causes problems. The field names when testing should be the same as when running the ASA job.
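    Field-name casing in ASA also depends on the job's compatibility level (1.0 lowercased field names; 1.1 and later preserve them), so aligning that setting may help. Independent of that, aliasing every projected field makes the intended casing explicit in the query. A minimal sketch with hypothetical input, output, and field names:

        -- Explicit aliases state the intended output field names; with
        -- compatibility level 1.1 or later the engine preserves this casing.
        SELECT
            deviceid AS DeviceId,
            temperature AS Temperature
        INTO bloboutput
        FROM iothubinput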

    1 vote  ·  0 comments  ·  Inputs and Outputs
  5. Stream Analytics for IoT Edge removes custom message properties / doesn't allow setting message properties

    Stream Analytics for IoT Edge seems to strip out custom IoT message properties set by other modules, and I can't work out any way in Stream Analytics Edge to set these properties again.

    I see that for cloud-hosted stream jobs you can configure these properties on the output, but I can't work out how to do this for Edge jobs: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs?#custom-metadata-properties-for-output

    I've opened an issue on the GitHub IoT Edge repo with more details: https://github.com/Azure/iotedge/issues/2846

    3 votes  ·  0 comments  ·  Inputs and Outputs
  6. Azure Stream Analytics to handle case sensitive columns (currently getting InputDeserializerError)

    Azure Stream Analytics currently does not support two columns whose names differ only by case as input.

    // input data
             "shopperId": "1234567",
             "shopperid": "",

    // ASA returns an InputDeserializerError on the input

    {"Source":"eventhubinput","Type":"DataError","DataErrorType":"InputDeserializerError.InvalidData","BriefMessage":"Could not deserialize the input event(s) from resource 'Partition: [0], Offset: [12894685888], SequenceNumber: [104569]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format","Message":"Input Message Id: Partition: [0], Offset: [12894685888], SequenceNumber: [104569] Error: Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index","ExampleEvents

    Request to the Azure Stream Analytics team:

    Please consider changing Azure Stream Analytics to…

    68 votes  ·  1 comment  ·  Inputs and Outputs
  7. Allow Azure Data Explorer (ADX) as an output

    We use Stream Analytics for alerting in combination with Azure Functions, and Azure Data Explorer is our central data store. Life would be easier if we could output our alerts directly to Azure Data Explorer (ADX). This would also save money, as for now we are putting an additional Event Hub in place to work around the limitation.

    4 votes  ·  0 comments  ·  Inputs and Outputs
  8. Sample data time-range selection

    The date displayed in the "Select time range" dialog is in US date format. My browser and OS are set to NZ format, the internationally common dd-mm-yyyy. Please change the date display to match.

    Also, the time input is rather finicky; an improvement there would help too. It would also be good to be able to specify the time in UTC as well as local time.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  9. Managed identity for event hub output

    When the output is an event hub, we can currently only use SAS keys for authentication/authorization. Adding support for managed identity would significantly simplify and secure our setup.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  10. Allow Azure Data Explorer (ADX) as a reference input

    We leverage Azure Data Explorer (ADX) as our main reporting database, and as such we want to be able to use it as an input for reference data (since that is where we store it).

    4 votes  ·  0 comments  ·  Inputs and Outputs
  11. Invoke Functions in Batch & Parallel

    The current Functions output invokes the function one event at a time and ensures ordering. In our scenario, ordering is not important, and we would like the output to support:

    1. Batching, invoking the function once per batch (array as input)

    2. Invoking functions in parallel for non-ordering scenarios (perhaps as a configuration option on the output)

    15 votes  ·  2 comments  ·  Inputs and Outputs
  12. 10 votes  ·  0 comments  ·  Inputs and Outputs
  13. Output to Azure Postgres

    Similar to how Azure SQL is supported, it would be great to have an output connector for Azure Postgres.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  14. BUG: Reference Data from SQL database not working well

    I have a query where I use a JOIN clause to join the reference data. Running the test with test files works. The reference input SQL query works and retrieves results, so that query is not the problem.
    When testing the ASA job with real data, it does not retrieve results, even though I send the same message as in the test. Switching the reference input to a JSON file from blob storage makes it work.
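    For context, the kind of reference-data join being described looks roughly like the sketch below; the stream input 'telemetry', the SQL reference input 'devices', and the key column are hypothetical names:

        -- Join streaming events against reference data on a key column;
        -- reference joins in ASA use a plain JOIN with no temporal condition.
        SELECT t.deviceId, t.temperature, r.deviceName
        INTO sqloutput
        FROM telemetry t
        JOIN devices r
            ON t.deviceId = r.deviceId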

    1 vote  ·  0 comments  ·  Inputs and Outputs
  15. Bug: Reading the reference data test input from a CSV file does not work. Switching back to JSON works well.

    I tried to test my query with IoT Hub as a data stream source and SQL Database as a reference input. I added the JSON message as a data stream test input and a CSV file as the reference file input. Testing the query provides no results and no errors. Converting the CSV file to JSON and re-running the test works well.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  16. How to check for data loss in Stream Analytics

    ◾️Configuration
    In the pipeline Azure IoT Hub → Azure Stream Analytics → Azure Data Lake Storage, the device-to-cloud messages from Azure IoT Hub are accumulated in Table Storage and Data Lake Storage.

    ◾️Problem
    When runtime errors occur in Stream Analytics in this configuration, I can see the details of the error in the activity log, but I don't know how to determine whether the error is causing data loss.

    ◾️Request
    Rather than comparing the number of input and output events in Stream Analytics, I would like to compare the contents of the data output from the IoT Hub with the contents of the data…

    2 votes  ·  0 comments  ·  Inputs and Outputs
  17. Support SQL DB stored procedures as ASA Output

    Allow stored procedures to be called natively from Stream Analytics as an output. This way, we don't need to build custom solutions to trigger stored procedures when ASA generates output.

    20 votes  ·  0 comments  ·  Inputs and Outputs
  18. Ability to see the events

    In the output boxes, one should be able to extract the top N events to see what data they produced.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  19. 1 vote  ·  0 comments  ·  Inputs and Outputs
  20. Job Diagram

    The Job Diagram holds real promise as an intuitive visual view of job execution and metrics, especially during development. In its current form it is anything but that. When I select an output, it seems to default to showing out-of-order events only.

    3 votes  ·  0 comments  ·  Inputs and Outputs