Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream processing service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to easily combine streams of data with historical records or reference data to derive business insights quickly.

  1. Rename Existing Stream Analytics Job

    We would like to be able to rename an existing Stream Analytics job. Without this, we have to delete the existing job and create a new one, copying everything over, and then we lose the ability to start "from last stopped".

    The obvious workaround is to make a note of when the job was stopped and start the new one from that time.

    Ideally, though, changing the name directly would be much simpler and less prone to error.

    1 vote · 0 comments
  2. "Failed to Get Framework Assemblies Local Path During Pushing Edge Package" error

    I am getting the above error while deploying ASA on IoT Edge. Any ideas?

    3 votes · 1 comment
  3. Invoke Functions in Batch & Parallel

    The current Functions output invokes the function one event at a time and guarantees ordering. In our scenario, ordering is not important, and we would like the output to support:

    1. Batching, invoking the function once per batch (array as input)

    2. Invoking functions in parallel for non-ordering scenarios (perhaps as a configuration option on the output)

    12 votes · 2 comments · Inputs and Outputs
  4. Stream Analytics query test result inconsistent with the actual job if German umlauts are used

    When using German umlauts (ä, ü, ö) in the query, the test result/preview does not match the expected result, but the real job works correctly. A minimal query pattern is sketched below.

    1 vote · 0 comments · Query Language
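    The following is a minimal sketch of the kind of query described above; the input, output, and field names (deviceId, status) are hypothetical, and only the umlaut in the string literal matters for reproducing the report:

        -- Hypothetical query containing a German umlaut in a string literal;
        -- the report is that the test preview and the running job handle such
        -- characters differently.
        SELECT deviceId, status
        INTO [output]
        FROM [input]
        WHERE status = 'geöffnet'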
  5. 1 vote · 0 comments · Inputs and Outputs
  6. Min/Max functions should take DateTime input

    Although this is posted as completed, I'm currently receiving error messages telling me that min() and max() cannot take DateTime fields as input, using the latest version of ASA. In the function reference (https://docs.microsoft.com/en-us/stream-analytics-query/min-azure-stream-analytics), only bigint and float are listed as valid inputs. A possible workaround is sketched below.

    1 vote · 1 comment · Query Language
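    If MIN/MAX reject a DateTime column in your version, one possible workaround, shown here only as a sketch with hypothetical input, output, and field names (deviceId, eventTime), is to aggregate over an epoch-seconds conversion and convert the result back to a DateTime:

        -- Convert the DateTime to seconds since the Unix epoch (a bigint, which
        -- MAX accepts), take the maximum, then convert back with DATEADD.
        SELECT
            deviceId,
            DATEADD(second,
                    MAX(DATEDIFF(second, CAST('1970-01-01T00:00:00Z' AS datetime), eventTime)),
                    CAST('1970-01-01T00:00:00Z' AS datetime)) AS maxEventTime
        INTO [output]
        FROM [input] TIMESTAMP BY eventTime
        GROUP BY deviceId, TumblingWindow(minute, 5)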
  7. Output to Azure Postgres

    Similarly to how Azure SQL is supported, it would be great to have an output connector for Azure Postgres.

    3 votes · 0 comments · Inputs and Outputs
  8. Enable a dynamic conversion from UTC to the local time

    We are working with streaming data generated by a sensor, using IoT Hub as the input for our Stream Analytics job and Power BI as the output.
    We would like to show local time in our streaming dashboard to satisfy our customer's needs. For the time being, we are using the DATEADD (Azure Stream Analytics) function to add the missing hours (see the sketch below).

    We would like the solution to be a dynamic conversion between UTC (Coordinated Universal Time) and the user's local time.

    23 votes · 4 comments
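    A minimal sketch of the current workaround, with hypothetical input, output, and field names; the +2 hour offset is hard-coded, which is exactly why a dynamic UTC-to-local conversion is being requested:

        -- Shift the UTC enqueue time by a fixed offset before sending it to Power BI.
        SELECT
            deviceId,
            temperature,
            DATEADD(hour, 2, EventEnqueuedUtcTime) AS localTime
        INTO [powerbi-output]
        FROM [iothub-input]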
  9. BUG: Reference data from SQL Database not working well

    I have a query that uses a JOIN clause to join reference data (the pattern is sketched below). Running the test with test files works. The reference input SQL query also works and retrieves results, so the query is not the problem.
    When testing the ASA job with real data, it does not retrieve any results, even when I send the same message as in the test. Switching the reference input to a JSON file in Blob storage makes it work.

    1 vote · 0 comments · Inputs and Outputs
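    The reference-data join pattern described above, as a sketch with hypothetical names: [telemetry] is the streaming input and [devices] is the SQL Database reference input. The same query reportedly returns rows in the query test but not against live data when the reference input comes from SQL Database.

        -- Join each streaming event to a row of reference data by device id.
        SELECT
            t.deviceId,
            t.temperature,
            d.location
        INTO [output]
        FROM [telemetry] t
        JOIN [devices] d
            ON t.deviceId = d.deviceId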
  10. Bug: Reading reference data test input from a CSV file does not work; switching back to JSON works well

    I tried to test my query with IoT Hub as the data stream source and SQL Database as the reference input. I added a JSON message as the data stream test input and a CSV file as the reference file input. Testing the query produces no results and no errors. Converting the CSV file to JSON and re-running the test works well.

    1 vote · 0 comments · Inputs and Outputs
  11. How to check for data loss in Stream Analytics

    ◾️Configuration
    In the pipeline Azure IoT Hub → Azure Stream Analytics → Azure Data Lake Storage, the device-to-cloud messages from Azure IoT Hub are accumulated in Table Storage and Data Lake Storage.

    ◾️Problem
    When runtime errors occur in Stream Analytics in this configuration, I can see the details of the error in the activity log, but I don't know how to determine whether the error caused data loss.

    ◾️Request
    Rather than just comparing the number of input and output records in Stream Analytics, I would like to compare the contents of the data output from the IoT Hub with the contents of the data…

    1 vote · 0 comments · Inputs and Outputs
  12. Support SQL DB stored procedures as ASA Output

    Allow stored procedures to be called natively from Stream Analytics. That way, we don't need to build custom solutions to trigger stored procedures when ASA produces output.

    16 votes · 0 comments · Inputs and Outputs
  13. Alert per partition

    I would like to create an alert per partition so that I am informed when a specific partition has high utilization or doesn't produce any output events. In our production system last week, three partitions produced no output events for five and a half days. With a retention period of 7 days on the event hub, it was hard to reprocess the data within that timeframe.

    1 vote · 0 comments
  14. Ability to see the events

    In the output boxes, one should be able to extract the top N events to see what data they produced.

    1 vote · 0 comments · Inputs and Outputs
  15. 1 vote · 0 comments · Inputs and Outputs
  16. 1 vote · 0 comments · Advanced Analytics
  17. Job Diagram

    The Job Diagram holds real promise as an intuitive, visual view of job execution and metrics, especially during development. In its current form it is anything but that. When I select an output, it seems to default to showing out-of-order events only.

    3 votes · 0 comments · Inputs and Outputs
  18. Integration with deployed ML.NET models

    It would be very neat and useful if I could deploy my custom ML.NET model to an Azure Function or elsewhere and call it directly from Azure Stream Analytics. As nice as the Azure Machine Learning service is, it forces me to learn the Python ecosystem.

    3 votes · 0 comments
  19. Improve sample data user experience

    Currently I get no feedback on whether sample data has been uploaded or not.
    Also, every time I update the query, I am asked to upload sample data again, which is a time-consuming activity.

    Even when I sample data from the live stream, it still asks me to upload data from a file.

    Occasionally, when sample data is pulled from the live stream, I get a notification to download that data. Once the notification disappears, there is no way to get that file again.

    1 vote · 0 comments
  20. Add support for compressed output

    I have historical data imported through Spark in Avro format into our Gen 1 Data Lake, and I am now using ASA to stream the live data.
    Spark defaults to Snappy compression for Avro files, whereas ASA doesn't compress at all. This has resulted in a tenfold increase in storage size and a slowdown in access speed. Can you add support for compressing ASA outputs?

    3 votes · 0 comments · Inputs and Outputs