Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream processing service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly.

  1. Enable a dynamic conversion from UTC to local time

    We are working with streaming data generated by sensors, using IoT Hub as the input for our Stream Analytics job and Power BI as the output.
    We would like local time in our streaming dashboard to satisfy our customers' needs. For the time being, we are using the DATEADD (Azure Stream Analytics) function to add the missing hours (a sketch of this workaround follows below).

    We would like the solution to be a dynamic conversion between UTC (Coordinated Universal Time) and the user's local time.

    23 votes  ·  4 comments
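    A minimal sketch of the static workaround described above, assuming a hypothetical UTC column named EventTimeUtc, hypothetical input/output aliases, and a fixed +2 hour offset; a dynamic conversion would instead need to account for the viewer's time zone and daylight saving time.

        -- Fixed-offset workaround (assumed column and alias names); this breaks when
        -- daylight saving time changes, which is why a dynamic UTC-to-local
        -- conversion is being requested.
        SELECT
            DeviceId,
            EventTimeUtc,
            DATEADD(hour, 2, EventTimeUtc) AS EventTimeLocal
        INTO [powerbi-output]
        FROM [iothub-input]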
  2. Invoke Functions in Batch & Parallel

    Currently, the Functions output invokes the function one event at a time and ensures ordering. In our scenario, ordering is not important, and we would like the output to support:

    1. Batching, invoking the function once per batch (array as input)

    2. Invoking functions in parallel for non-ordering scenarios (perhaps as a configuration option on the output)

    12 votes  ·  2 comments  ·  Inputs and Outputs
  3. Support SQL DB stored procedures as ASA Output

    Allow native support for calling SQL stored procedures from Stream Analytics. That way, we would not need to build custom solutions to trigger stored procedures when ASA generates output (a sketch of such a custom workaround follows below).

    16 votes  ·  0 comments  ·  Inputs and Outputs
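    A minimal sketch of the kind of custom workaround this request would make unnecessary, with assumed (hypothetical) table, column, and procedure names: ASA writes to a plain table, and an AFTER INSERT trigger forwards new rows to a stored procedure.

        -- Assumed target table that the ASA SQL Database output writes into.
        CREATE TABLE dbo.AsaOutput (DeviceId NVARCHAR(64), Reading FLOAT, EventTime DATETIME2);
        GO
        -- Assumed stored procedure holding the business logic to run per row.
        CREATE PROCEDURE dbo.ProcessReading
            @DeviceId NVARCHAR(64), @Reading FLOAT, @EventTime DATETIME2
        AS
        BEGIN
            SET NOCOUNT ON;
            -- business logic goes here
        END;
        GO
        -- Trigger that calls the procedure for each row ASA inserts; a cursor is
        -- simple but not suited to high-throughput streams.
        CREATE TRIGGER dbo.OnAsaOutputInsert ON dbo.AsaOutput
        AFTER INSERT
        AS
        BEGIN
            SET NOCOUNT ON;
            DECLARE @DeviceId NVARCHAR(64), @Reading FLOAT, @EventTime DATETIME2;
            DECLARE c CURSOR LOCAL FAST_FORWARD FOR
                SELECT DeviceId, Reading, EventTime FROM inserted;
            OPEN c;
            FETCH NEXT FROM c INTO @DeviceId, @Reading, @EventTime;
            WHILE @@FETCH_STATUS = 0
            BEGIN
                EXEC dbo.ProcessReading @DeviceId, @Reading, @EventTime;
                FETCH NEXT FROM c INTO @DeviceId, @Reading, @EventTime;
            END;
            CLOSE c;
            DEALLOCATE c;
        END;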
  4. Failed to Get Framework Assemblies Local Path During Pushing Edge Package??

    I am getting the above error while deploying ASA at the edge. Any idea?

    3 votes  ·  1 comment
  5. Support automatic restart with IoT Hub failover

    Azure Stream Analytics fails to receive input from IoT Hub after an IoT Hub/Event Hub failover; the user has to restart the job manually
    after the failover. It is difficult to restart the job manually when the failover happens at midnight for a 24x7 running system.
    Please consider an automatic restart, or a restart button, for Stream Analytics after an IoT Hub failover.

    36 votes  ·  0 comments  ·  Advanced Analytics
  6. Blob

    Infer timestamps from blob filenames instead of metadata. I just bulk uploaded my historical data in order to test (one blob per day of data, with YYYY-MM-DD names), but ASA appears to be using the blob's LastModified timestamp property to determine whether the data should be included. The problem is that the LastModified dates ended up out of order and, worse, are from today, not from the previous months when the data was originally collected (a partial workaround sketch follows below).

    1 vote  ·  0 comments
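    A minimal sketch of a partial workaround, assuming each record in the blob carries its own timestamp column (hypothetically named EventTime): TIMESTAMP BY makes the query use that payload value as application time instead of the blob's arrival/LastModified time. It does not change which blobs the job picks up, only how their events are timestamped.

        -- Use a timestamp carried in the payload as application time
        -- (assumed input alias and column names).
        SELECT
            DeviceId,
            Reading,
            EventTime
        INTO [output]
        FROM [blob-input] TIMESTAMP BY EventTime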
  7. Allow REST Endpoints as reference data input

    It should be possible to define REST endpoints as reference data inputs, including refresh rate settings similar to the SQL Database reference input.

    41 votes  ·  0 comments  ·  Inputs and Outputs
  8. Min/Max functions should take DateTime input

    Although this is posted as completed, I'm currently receiving error messages telling me that min() and max() cannot take DateTime fields as input, using the latest version of ASA. In the function reference (https://docs.microsoft.com/en-us/stream-analytics-query/min-azure-stream-analytics), only bigint and float are listed as valid inputs (a workaround sketch follows below).

    1 vote  ·  1 comment  ·  Query Language
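    A minimal sketch of a common workaround until DateTime is accepted directly, with assumed input, output, and column names: convert the datetime to seconds since an epoch, aggregate the resulting bigint, then convert back.

        -- MAX over a DateTime column via DATEDIFF/DATEADD round-tripping
        -- (assumed aliases and column names).
        SELECT
            DeviceId,
            DATEADD(second,
                    MAX(DATEDIFF(second, CAST('1970-01-01T00:00:00Z' AS datetime), EventTime)),
                    CAST('1970-01-01T00:00:00Z' AS datetime)) AS MaxEventTime
        INTO [output]
        FROM [input]
        GROUP BY DeviceId, TumblingWindow(minute, 5)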
  9. Output to Azure Postgres

    Similar to how Azure SQL is supported, it would be great to have an output connector to Azure Postgres.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  10. Rename Existing Stream Analytics Job

    We would like to be able to rename an existing Stream Analytics job. Without this, we have to delete the existing job and create a new one, copying everything over, and we then lose the ability to start "from last stopped".

    The obvious workaround is to make a note of when the job was stopped and start the new one from that time.

    But ideally, changing the name would be much simpler and less prone to error.

    1 vote  ·  0 comments
  11. Stream Analytics query test result inconsistent with actual job if German umlauts are used

    When German umlauts (ä, ü, ö) are used in the query, the test result/preview does not match the expected result, but the real job works correctly.

    1 vote  ·  0 comments  ·  Query Language
  12. Output to ADLS gen 2

    Support for output to ADLS Gen2 in addition to the existing inputs.

    22 votes  ·  started  ·  2 comments  ·  Inputs and Outputs
  13. (idea details unavailable)

    1 vote  ·  0 comments  ·  Inputs and Outputs
  14. Stream Analytics VNet support

    Stream Analytics should integrate with Virtual Network (VNet) so I can access my services seamlessly without requiring cross-internet access.

    93 votes  ·  planned  ·  2 comments  ·  Security
  15. Cosmos DB as Reference Data

    I think Cosmos DB is the best solution for low-latency reference data. For our IoT use case, we need to maintain state after a certain condition is met, and subsequent events for that DeviceId will use that reference data in the query.

    23 votes  ·  0 comments
  16. BUG: Reference Data from SQL database not working well

    I have a query that uses a JOIN clause to join the reference data (a sketch of the query shape follows below). Running the test with test files works, and the reference input SQL query works and retrieves results, so the query itself is not the problem.
    When testing the ASA job with real data, it does not return results, even when I send the same message as in the test. Switching the reference input to a JSON file from Blob storage makes it work.

    1 vote  ·  0 comments  ·  Inputs and Outputs
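    A minimal sketch of the general query shape described above, with assumed (hypothetical) input aliases and column names: the event stream is joined to SQL Database reference data on a key column.

        -- Stream-to-reference-data join (assumed aliases and column names).
        SELECT
            s.DeviceId,
            s.Reading,
            r.DeviceName
        INTO [output]
        FROM [iothub-input] s
        JOIN [sql-reference-input] r
            ON s.DeviceId = r.DeviceId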
  17. Bug: Reading the reference data test input from a CSV file does not work; switching back to JSON works well.

    I tried to test my query with IoT Hub as the data stream source and SQL Database as the reference input. I added a JSON message as the data stream test input and a CSV file as the reference file input. Testing the query produces neither results nor errors. Converting the CSV file to JSON and re-running the test works well.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  18. How to check for data loss in Stream Analytics

    ◾️ Configuration
    In the pipeline Azure IoT Hub → Azure Stream Analytics → Azure Data Lake Storage, the Device-to-Cloud messages from Azure IoT Hub are accumulated in Table Storage and Data Lake Storage.

    ◾️ Problem
    When runtime errors occur in Stream Analytics in this configuration, I can see the details of the error in the activity log, but I don't know how to determine whether the error caused data loss.

    ◾️ Request
    Instead of comparing the number of input and output events of Stream Analytics, I would like to compare the contents of the data output from the IoT Hub with the contents of the data…

    1 vote  ·  0 comments  ·  Inputs and Outputs
  19. Alert per partition

    I would like to create an alert per partition so I am informed when a specific partition has high utilization or does not produce any output events. In the production system last week, three partitions produced no output events for five and a half days. With a retention period of 7 days on the event hub, it was hard to reprocess the data within that timeframe.

    1 vote  ·  0 comments
  20. Ability to see the events

    In the output boxes, one should be able to extract the top N events to see what data they produced.

    1 vote  ·  0 comments  ·  Inputs and Outputs