Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historical records or reference data to derive business insights quickly.

  1. Enable dynamic conversion from UTC to local time

    We are working with streaming data generated by a sensor, with IoT Hub as the input for our Stream Analytics job and Power BI as the output.
    We want local time in our streaming dashboard to meet our customer’s needs. For the time being, we are using the DATEADD (Azure Stream Analytics) function to add the missing hours.

    We would like the solution to be a dynamic conversion between UTC (Coordinated Universal Time) and the user's local time.
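    For illustration, the current workaround hard-codes the offset in the query, roughly like the sketch below (the input, output, and field names are hypothetical, and the +2-hour offset is assumed):

        -- Sketch of the current workaround: a fixed UTC offset applied with DATEADD.
        -- "iotHubInput", "powerBiOutput", "deviceId", and "eventTime" are hypothetical names;
        -- +2 hours is an assumed offset and does not follow daylight saving time.
        SELECT
            deviceId,
            DATEADD(hour, 2, eventTime) AS localEventTime
        INTO powerBiOutput
        FROM iotHubInput TIMESTAMP BY eventTime

    A dynamic conversion would remove the hard-coded offset and account for daylight saving time.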

    23 votes  ·  2 comments
  2. Support automatic restart after IoT Hub failover

    Azure Stream Analytics fails to receive input from IoT Hub after an IoT Hub/Event Hub failover; the user has to restart the job manually after the failover. Restarting the job manually is difficult when the failover happens at midnight in a 24x7 running system.
    Please consider an automatic restart, or a restart button, for Stream Analytics after an IoT Hub failover.

    36 votes  ·  0 comments  ·  Advanced Analytics
  3. Allow REST Endpoints as reference data input

    Add the ability to define REST endpoints as reference data inputs, including refresh-rate settings similar to the SQL Database reference input.

    40 votes  ·  0 comments  ·  Inputs and Outputs
  4. Output to Azure Postgres

    Similar to how Azure SQL is supported, it would be great to have an output connector for Azure Postgres.

    6 votes  ·  0 comments  ·  Inputs and Outputs
  5. Output to ADLS gen 2

    Support output to ADLS Gen2, in addition to the existing input support.

    22 votes  ·  started  ·  2 comments  ·  Inputs and Outputs
  6. Support SQL DB stored procedures as ASA Output

    Allow stored procedures to be called natively from Stream Analytics. That way, we don't need to build custom solutions to trigger stored procedures when ASA generates output.
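    As an example of the kind of custom solution this would replace, one workaround today is an AFTER INSERT trigger on the SQL output table that calls the procedure; a rough sketch with hypothetical table and procedure names:

        -- Workaround sketch (T-SQL on the target database): call a stored procedure
        -- whenever the ASA SQL Database output inserts rows. All names are hypothetical.
        CREATE TRIGGER dbo.trgAsaOutputInserted
        ON dbo.AsaOutput                 -- table the ASA job writes into
        AFTER INSERT
        AS
        BEGIN
            SET NOCOUNT ON;
            EXEC dbo.ProcessAsaOutput;   -- downstream processing to run per inserted batch
        END;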

    3 votes  ·  0 comments  ·  Inputs and Outputs
  7. 1 vote  ·  0 comments  ·  Inputs and Outputs
  8. Min/Max functions should take DateTime input

    Although this was posted as completed, I'm currently receiving error messages telling me that MIN() and MAX() cannot take datetime fields as input, using the latest version of ASA. In the function reference (https://docs.microsoft.com/en-us/stream-analytics-query/min-azure-stream-analytics), only bigint and float are listed as valid inputs.
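    Until datetime is accepted directly, a workaround along these lines can stand in (hypothetical input and field names; the timestamp is converted to epoch seconds so MAX sees a bigint, then converted back):

        -- Workaround sketch: aggregate epoch seconds with MAX, then convert back to datetime.
        -- "streamInput", "deviceId", and "eventTime" are hypothetical names.
        SELECT
            deviceId,
            DATEADD(second,
                    MAX(DATEDIFF(second, CAST('1970-01-01T00:00:00Z' AS datetime), eventTime)),
                    CAST('1970-01-01T00:00:00Z' AS datetime)) AS maxEventTime
        FROM streamInput TIMESTAMP BY eventTime
        GROUP BY deviceId, TumblingWindow(minute, 5)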

    1 vote  ·  0 comments  ·  Query Language
  9. BUG: Reference Data from SQL database not working well

    I have a query where I use a JOIN clause to join the reference data. Running the test with test files works, and the reference input's SQL query works and retrieves results, so the query itself is not the problem.
    When testing the ASA job with real data, it does not retrieve any results, even when I send the same message as in the test. Switching the reference input to a JSON file in Blob storage makes it work.
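    For context, the join follows the usual reference-data pattern, roughly like this (the stream input, reference input, and field names are hypothetical):

        -- Sketch of the reference-data join being described; all names are hypothetical.
        SELECT
            s.deviceId,
            s.reading,
            r.deviceName
        FROM streamInput s TIMESTAMP BY s.eventTime
        JOIN sqlReferenceInput r
            ON s.deviceId = r.deviceId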

    1 vote  ·  0 comments  ·  Inputs and Outputs
  10. Bug: Reading the reference data test input from a CSV file does not work; switching to JSON works well.

    I tried to test my query with IoT Hub as the data stream source and SQL Database as the reference input. I added a JSON message as the data stream test input and a CSV file as the reference file input. Testing the query provides no results and no errors. Converting the CSV file to JSON and re-running the test works well.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  11. How to check for data loss in Stream Analytics

    ◾️Configuration
    In the pipeline Azure IoT Hub → Azure Stream Analytics → Azure Data Lake Storage, the device-to-cloud messages from Azure IoT Hub are accumulated in Table Storage and Data Lake Storage.

    ◾️Problem
    When runtime errors occur in Stream Analytics in this configuration, I can see the details of the error in the activity log, but I don't know how to determine whether the error caused data loss.

    ◾️Request
    Rather than comparing the number of input and output events in Stream Analytics, I would like to compare the contents of the data output from IoT Hub with the contents of data…

    1 vote  ·  0 comments  ·  Inputs and Outputs
  12. Alert per partition

    I would like to create an alert per partition to be informed when a specific partition has high utilization or doesn't produce any output events. In our production system last week, three partitions didn't produce any output events for 5 1/2 days. With a retention period of 7 days in the Event Hub, it was hard to reprocess the data within that timeframe.

    1 vote  ·  0 comments
  13. Stream Analytics VNet support

    Stream Analytics should integrate with Virtual Networks (VNets) so I can access my services seamlessly without requiring cross-internet access.

    85 votes  ·  planned  ·  2 comments  ·  Security
  14. Ability to see the events

    In the output boxes, one should be able to extract the top N events to see what data was produced.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  15. Cosmos DB as Reference Data

    I think Cosmos DB is the best solution for low-latency reference data. For our IoT use case, we need to maintain state after a certain condition is met, and subsequent events for that DeviceId will use that reference data in the query.

    22 votes  ·  0 comments
  16. 1 vote  ·  0 comments  ·  Inputs and Outputs
  17. 1 vote  ·  0 comments  ·  Advanced Analytics
  18. Job Diagram

    The Job Diagram holds real promise as an intuitive, visual view of job execution and metrics, especially during development. In its current form it is anything but that. When I select an output, it seems to default to showing out-of-order events only.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  19. Please add a function to change the time zone of the activityLog.

    The JSON in the activityLog shows timestamps in UTC, but I want them in JST. Please add a function that can change the time zone of the activityLog.

    4 votes  ·  0 comments  ·  Advanced Analytics
  20. SQL Managed Instance Output

    Please investigate supporting SQL Managed Instance (as opposed to a single database) as an output for Stream Analytics.

    12 votes  ·  0 comments