Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will use first-party solutions for customer feedback. Learn more here.

Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Input from a Redis cache/queue

    It would be great if Azure Redis could be configured as an input source.

    10 votes · 1 comment
  2. I'm unable to get my query output saved to an Azure SQL table

    I am unable to get a SQL DB output to work with my analytics job. My job is storing Application Insights export data into a SQL table. The error I'm receiving is that the SQL initialization connection string is invalid. All of the inputs I'm prompted for are correct, and I'm unable to see what connection string the portal is generating for the analytics job. I need assistance determining why the job cannot store output into my table.

    Error details:

    Error:
    - Format of the initialization string does not conform to specification starting at index 87.

    Message:
    Azure SQL Server…

    10 votes · 0 comments
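The "index 87" in the error above is a character offset into the connection string where parsing failed, which usually points at a malformed or unescaped segment. A minimal Python sketch (not ASA's or ADO.NET's actual parser) that checks each semicolon-delimited segment is a well-formed key=value pair and reports the offset of the first bad one:

```python
def find_bad_segment(conn_str):
    """Return the character offset of the first segment that is not a
    well-formed 'key=value' pair, or -1 if every segment parses."""
    offset = 0
    for segment in conn_str.split(";"):
        # An empty trailing segment (from a final ';') is harmless.
        if segment and "=" not in segment:
            return offset
        offset += len(segment) + 1  # +1 for the ';' separator
    return -1

# A password containing an unescaped ';' splits into a segment with no '='.
good = "Server=tcp:myserver.database.windows.net;Database=mydb;User ID=admin"
bad = "Server=tcp:myserver.database.windows.net;Database=mydb;Password=ab;cd"
```

The server and database names above are invented for illustration; a real diagnosis would inspect the generated connection string at the reported offset.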
  3. Fix the CI/CD NuGet package so that it works in VSTS

    Please fix the CI/CD NuGet package. It currently breaks in VSTS, failing to load the Newtonsoft.Json package.

    10 votes · 3 comments
  4. SQL Managed Instance Output

    Please investigate supporting SQL Managed Instance (as opposed to single DB) as an output for Stream Analytics.

    10 votes · 1 comment
  5. Please restore the "Output Last Processed Time" metric in the job diagram

    I check the "Output Last Processed Time" metric in the job diagram every day,
    so I would like this metric to be restored.

    10 votes · 0 comments

    Hi, thanks for your comments.
    We are making some changes to the job diagram, and unfortunately had to remove this information for now. We’re investigating how to add it back.
    What is your usage of Output Last Processed Time? If it's to monitor whether your streaming pipeline is acting as expected, you may want to use the "output watermark delay". When this number goes up, it means your pipeline is getting delayed. More info here: https://azure.microsoft.com/en-us/blog/new-metric-in-azure-stream-analytics-tracks-latency-of-your-streaming-pipeline/
    Thanks,
    JS

  6. Input from Cosmos DB changefeed

    Support direct input from Cosmos DB changefeed.

    9 votes · 0 comments · Inputs and Outputs
  7. Convert function is missing for hex values

    The Convert function is missing for hex values. I want to do something like: Convert(INT, Convert(VARBINARY, SUBSTRING(payload, 3, 2), 2)) AS Humidity.
    Also, aggregate functions should work for cases like: AVG(Humidity) AS [AverageHumidity]

    9 votes · 0 comments
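For context, the transformation being requested — pulling a two-character hex substring out of a payload and turning it into an integer — is a one-liner in most languages. A hypothetical Python sketch of the same logic (the payload layout is invented for illustration, not taken from any ASA feature):

```python
def hex_field(payload, start, length):
    """Extract a hex-encoded field from a string payload and return it
    as an integer (the SUBSTRING + CONVERT chain from the request)."""
    return int(payload[start:start + length], 16)

# "0x1A2B..." style payload: humidity in the two characters after "0x".
payload = "0x1A2B"
humidity = hex_field(payload, 2, 2)  # "1A" -> 26
```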
  8. Ability to use more than one source for reference data

    It would be nice to be able to use "reference" data from more than one source. I'm aware that the possibility of using a SQL table as well as Table storage is under consideration, but this is different.

    Usage scenario for an IoT application: you have inbound sensor data that you need to process. You have reference data (such as thresholds that are different for each sensor) and you have customer configurations that you need to check against the incoming data.
    Some of that data might be stored in a SQL table, while other data lives in Table storage or is updated in Blob storage.
    In…

    9 votes · 0 comments
  9. Allow Creation of Azure Storage outputs with Shared Access Signatures

    Currently creating an output to an Azure Storage account requires the account name and key.

    It would be really useful to be able to create those outputs using a SAS URL/Token to avoid unneeded sharing of keys either when they are recycled or when creating a new Stream Analytics output.

    9 votes · 1 comment
  10. High Resource Error Message

    During job execution, the message about high resource utilization shouldn't be logged at the "Informational" level but as Critical, since it will lead to delayed or failed execution.

    9 votes · 0 comments
  11. More advanced input path patterns for date and time

    Currently, streaming inputs only support basic file path patterns for date and time (yyyy/dd/mm, hh, etc.). Unfortunately, many blob input prefixes do not fit that layout. For instance, native Azure services like to write prefixes as follows: y={Year}/m={month}/d={day}/h={hour}/m={minute}.

    It would be good to have finer-grained control over the input pattern for time, similar to the way input patterns can be handled in Data Factory.

    9 votes · 0 comments · Inputs and Outputs
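As a sketch of the requested flexibility, here is how a custom prefix template like the one above could be expanded for a given timestamp in Python; the `{Year}`-style token names come from the request itself, not from any existing ASA feature:

```python
from datetime import datetime

def expand_prefix(template, ts):
    """Fill a blob-prefix template with zero-padded date/time parts."""
    return template.format(
        Year=f"{ts.year:04d}", month=f"{ts.month:02d}",
        day=f"{ts.day:02d}", hour=f"{ts.hour:02d}", minute=f"{ts.minute:02d}")

template = "y={Year}/m={month}/d={day}/h={hour}"
prefix = expand_prefix(template, datetime(2021, 3, 7, 9))
# -> "y=2021/m=03/d=07/h=09"
```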
  12. Increase file size of ASA Parquet file output

    The current settings for Parquet output (min 10k rows, max 2h) generates a huge number of small parquet files (~10MB, written every 5 seconds) if the stream is 1k-50k rows / second. This huge number of small files (17k+ files per day) is very bad for further processing / analysis (e.g. with Spark or Synapse) and currently requires an additional step of file consolidation. Therefore it would be great to either increase the min. number of rows setting or introduce a new min. data size setting (file size might be too complex due to compression, but the data could be…

    9 votes · 0 comments · Inputs and Outputs
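The "17k+ files per day" figure is easy to sanity-check: a flush roughly every 5 seconds yields 86400 / 5 = 17,280 files per day. A quick back-of-the-envelope in Python, using the 10k-row minimum and stream rates quoted in the post:

```python
SECONDS_PER_DAY = 24 * 60 * 60

def flush_interval(rows_per_second, min_rows=10_000):
    """Seconds until the 10k-row minimum is reached at a given input rate."""
    return min_rows / rows_per_second

def files_per_day(flush_interval_s):
    """Number of parquet files produced per day at a fixed flush interval."""
    return SECONDS_PER_DAY // flush_interval_s

# At 2k rows/s the 10k-row threshold is hit every 5 seconds...
interval = flush_interval(2_000)      # 5.0 s
daily = files_per_day(int(interval))  # 17,280 files/day
```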
  13. Add KQL support

    KQL is such a powerful language for working with time-series data. In my opinion it is a very clean and transparent language for defining queries. It would be great to have KQL-like support for Stream Analytics on the edge and in the cloud.

    9 votes · 0 comments · Query Language
  14. Way better feedback is needed when there is a problem

    I have an issue right now: no matter what I do, the job starts and immediately goes IDLE. There is zero feedback anywhere. The query passes and all connection tests pass. The logs show zero issues and the job appears to be running; however, it doesn't do anything at all.

    9 votes · planned · 0 comments
  15. R instead of SQL for Azure Stream Analytics queries

    Microsoft R is a must for queries of stream analytics!

    8 votes · 0 comments
  16. Scaling up streaming units should be transparent and not require the job to be stopped

    Currently, adjusting the streaming units requires the job to be stopped within the portal. It would be very beneficial if the scale could be adjusted on the fly, without requiring a job stop/start.

    8 votes · 0 comments
  17. Input from Azure Data Lake Storage

    Data Lake is now available as an output, which is great, but it should be added as an input as well. E.g. we continuously save data to Data Lake Storage and want to stream this data to Power BI.

    8 votes · 2 comments
  18. Automated Query Testing

    Testing ASA queries requires manual interaction through the portal as described in https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-test-query

    It would be nice to be able to create automated tests.

    e.g. an API that accepts test input and returns the query output.

    8 votes · 0 comments
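In the absence of such an API, the shape of the requested harness can be approximated by factoring the query logic into a plain function and asserting on its output. A hypothetical Python sketch — the filter threshold and event shape are invented for illustration, and a real ASA query would be SQL, not Python:

```python
def run_query(events, threshold=30.0):
    """Stand-in for an ASA query: SELECT * WHERE temperature > threshold."""
    return [e for e in events if e["temperature"] > threshold]

def test_filters_cold_readings():
    sample = [{"id": 1, "temperature": 25.0},
              {"id": 2, "temperature": 31.5}]
    out = run_query(sample)
    assert [e["id"] for e in out] == [2]
```

The point of the request is that ASA itself would accept the sample events and return the real query's output, so tests like this could run against the actual query text instead of a reimplementation.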
  19. Add support for compressed output

    I have historic data imported through Spark in Avro format into our Gen 1 Data Lake, and I am now using ASA to stream the live data.
    Spark defaults to using snappy to compress Avro files, whereas ASA doesn't compress at all. This has resulted in a 10-fold increase in storage size and a slowdown in access speed. Could you add support for compressing the outputs in ASA?

    8 votes · 1 comment · Inputs and Outputs
  20. Allow synchronous stopping/starting of an ASA job via az or PowerShell

    Waiting for an ASA job to stop/start in order to deploy in a CI/CD scenario is far messier than it needs to be. Having to spin-wait and repeatedly query Azure adds unnecessary clutter to DevOps pipelines.

    Please add a -Wait parameter to the PowerShell cmdlets, AND a --wait flag to az stream-analytics.

    8 votes · 0 comments
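Until such a flag exists, the spin-wait has to live in the pipeline itself. A generic Python sketch of that polling loop — the `get_state` callable would wrap whatever az stream-analytics or PowerShell call returns the job state; all names here are illustrative:

```python
import time

def wait_for_state(get_state, desired, timeout_s=300, poll_s=10):
    """Poll get_state() until it returns `desired` or the timeout elapses.
    Returns True on success, False on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_state() == desired:
            return True
        time.sleep(poll_s)
    return False

# Example with a fake state source that becomes "Stopped" on the third poll.
states = iter(["Stopping", "Stopping", "Stopped"])
ok = wait_for_state(lambda: next(states), "Stopped", timeout_s=5, poll_s=0)
# -> True
```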
