Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage 1st party solutions for customer feedback. Learn more here.

Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Allow REST Endpoints as reference data input

    Provide the ability to define REST endpoints as reference data inputs, including refresh-rate settings similar to those available for SQL Database reference data.

    42 votes  ·  0 comments  ·  Inputs and Outputs
  2. Provide Event Grid output for Azure Stream Analytics

    Azure Event Grid really helps in building distributed, event-driven applications on Azure. Having out-of-the-box support to publish selected events from Azure Stream Analytics to Azure Event Grid would be awesome!

    To achieve this today, I have to call (and deploy and pay for) an Azure Function that takes the message and calls the actual Azure Event Grid API. It would be much easier to have this out of the box, and it would also benefit all future out-of-the-box Event Grid subscribers.

    40 votes  ·  0 comments  ·  Inputs and Outputs
  3. Allow specifying columns when using INTO to insert into an Azure SQL database.

    When using INTO, you cannot specify which columns to insert into; values are mapped to columns purely by position. This means that any desired change to the database table becomes hard to manage with Stream Analytics.
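    Until columns can be specified, the usual workaround is to keep the SELECT list in exactly the same order as the target table's columns (the table, column, and alias names below are hypothetical), which is what makes schema changes brittle:

```sql
-- Assuming, as described above, that values map to table columns by
-- position: the SELECT order must match the column order of the
-- (hypothetical) target table dbo.Readings (device_id, reading_time, temperature).
SELECT
    device_id,
    System.Timestamp AS reading_time,
    CAST(temp AS FLOAT) AS temperature
INTO
    [sql-output]      -- Azure SQL Database output alias
FROM
    [iothub-input]    -- streaming input alias
```

    Adding or reordering a column in the target table then requires updating every query in lockstep, which is the pain this request describes.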

    1 vote  ·  0 comments
  4. Loading Partition Information For the Job

    This step never finishes loading (my job consists of an Event Hub input, a SQL transformation, and an Azure SQL output).

    1 vote  ·  0 comments
  5. Azure Stream Analytics requires major improvement

    Azure Stream Analytics is good for small and simple use cases but is still far behind other CEP engines (Spark, Flink, and Google Dataflow). As of now we have used Stream Analytics in only 1 out of 20 projects because of limited features (e.g. stream joins limited to 2 devices, no custom state management, job crashes, etc.). I love Azure PaaS services; please add more features so we don't need to go to other cloud providers or IaaS :)

    3 votes  ·  0 comments
  6. Cosmos DB as Reference Data

    I think Cosmos DB is the best solution for low-latency reference data. For our IoT use case, we need to maintain state after a certain condition is met, and subsequent events for that DeviceId will use that reference data in the query.

    27 votes  ·  0 comments
  7. SQL Managed Instance Output

    Please investigate supporting SQL Managed Instance (as opposed to single DB) as an output for Stream Analytics.

    10 votes  ·  1 comment
  8. Monitor overall SU usage over time, not just per individual SA job.

    Spinning up a new Stream Analytics job can happen at any time, and sometimes you risk reaching your SU limit (200 by default). It happened to me. I am missing the ability to monitor how many SUs I am currently using: something similar to the SU% utilization metric that is available per SA job, but at the subscription level.

    1 vote  ·  0 comments  ·  Inputs and Outputs
  9. Please restore the "Output Last Processed Time" metric in the job diagram

    I check "Output Last Processed Time" every day in the job diagram, so I would like this metric to be restored there.

    10 votes  ·  0 comments

    Hi, thanks for your comments.
    We are making some changes to the job diagram, and unfortunately had to remove this information for now. We’re investigating how to add it back.
    What is your usage of Output Last Processed Time? If it's to monitor whether your streaming pipeline is acting as expected, you may want to use the "output watermark delay" metric. When this number goes up, it means your pipeline is getting delayed. More info here: https://azure.microsoft.com/en-us/blog/new-metric-in-azure-stream-analytics-tracks-latency-of-your-streaming-pipeline/
    Thanks,
    JS

  10. Please add a feature that supports "Identity" on the Stream Analytics side

    "Identity" is a feature of SQL Database, like IDENTITY(1,1). When I use two outputs, one SQL DB and one other than SQL DB, I want to assign the same ID number to both, so please implement this identity function on the SA side.

    1 vote  ·  0 comments
  11. Limit output event data batch size

    Allow the user to configure the number of output events that can go into a single event data message. As of now, with Event Hub as output, the format of the JSON output (line-separated or array) can be configured, but there is no way to configure how many events are batched. This would let the user control throughput based on need, and the consumer of the event hub could rely on consistent behavior for the maximum number of events (or a single event) per message.

    32 votes  ·  0 comments
  12. Please add a timestamp configuration that can select the time zone

    Please add a timestamp configuration that allows selecting a time zone, for example JST, and not only UTC.
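    As a workaround today, the UTC timestamp can be shifted inside the query itself; a minimal sketch, assuming hypothetical input/output aliases (JST is UTC+9 with no daylight saving time):

```sql
SELECT
    System.Timestamp AS event_time_utc,
    DATEADD(hour, 9, System.Timestamp) AS event_time_jst  -- JST = UTC+9
INTO
    [output]
FROM
    [input]
```

    A real time-zone setting would still be better, since a fixed offset cannot handle zones that observe daylight saving time.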

    2 votes  ·  0 comments
  13. Obfuscation or encryption of query code

    Would it be possible in a future release of Stream Analytics to either obfuscate or encrypt the query (or JavaScript or C#) code of SA jobs? We are looking into implementing our SA jobs/models, which we use for predictive maintenance for instance, not only in our own subscriptions but in customer subscriptions as well. But we would not want customers to be able to see the code of our queries, because these contain proprietary logic, rules, or definitions.

    6 votes  ·  0 comments
  14. How about GetArrayElement that does not do a positional retrieval?

    I faced the issue today of having to pick a JSON object from an array where all the elements are JSON objects. GetArrayElement picks an element by position, and I was not happy with the prospect of my query breaking if the order of the elements in the array changes for some reason. Under the circumstances I wrote the JavaScript function below:

    // array contains JSON elements. The function returns the first element that has the 'key' key.
    function main(array, key) {
        function searchFunction(value, index, array) {
            return undefined !== value[key];
        }
        var result = array.find(searchFunction);
        return result;
    }

    1 vote  ·  0 comments
  15. SA doesn't support specifying an event hub endpoint explicitly

    Currently, SA in one cloud is limited to using data source instances from that same cloud. It would be more flexible if SA allowed specifying the endpoint explicitly, e.g. pointing to a data source instance in another sovereign cloud.

    1 vote  ·  0 comments
  16. Enhance security for ASA: managed service identities, Key Vault bindings

    Add support for managed service identities and for storing input and output configuration details and secrets in Key Vault.

    42 votes  ·  3 comments  ·  Security
  17. Allow UDF returning an array as parameter for 'CROSS APPLY GetArrayElements()'

    The following query fails with the error "Function 'GetArrayElements' requires a field of type 'array' as a parameter," even though the function UDF.jsonify() is in fact declared to return an array.

    SELECT
        event.device_id AS device_id,
        sensor.ArrayValue.sensor_id AS sensor_id,
        CAST(sensor.ArrayValue.value AS FLOAT) AS value_number
    INTO
        toSQL
    FROM fromIoTHub AS event
    CROSS APPLY GetArrayElements(UDF.jsonify(event.data)) AS sensor
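    For context, the body of UDF.jsonify is not shown in the report; a plausible shape for it (purely an assumption) is a JavaScript UDF declared with return type "array" that parses the raw event payload into an array:

```javascript
// Hypothetical implementation of UDF.jsonify: parse the raw event payload
// and always return an array, wrapping a single object if necessary.
function main(payload) {
    var parsed = JSON.parse(payload);
    return Array.isArray(parsed) ? parsed : [parsed];
}
```

    Even with such a declaration, GetArrayElements currently rejects UDF results, which is exactly what this request asks to fix.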

    15 votes  ·  1 comment
  18. Restricted/Closed connections between Stream Analytics and Azure SQL Database

    Our customer wants more restricted/closed connections between Stream Analytics and Azure SQL Database (e.g. restricted to the same subscription).
    Currently, they need to set "Allow access to Azure services: ON" on their Azure SQL Database (logical server), but this option allows all connections from Azure IP addresses, even ones coming from other customers. Several customers need to be able to restrict access to within their own subscription to be more secure.

    12 votes  ·  0 comments
  19. More secure connection options between Stream Analytics and Azure SQL Database

    Currently, Stream Analytics only supports username and password authentication. Several customers need more secure connection options to realize a higher security model, such as support for an MFA (certificate) model and for Key Vault.

    15 votes  ·  0 comments
  20. True "pay by use" pricing for Power BI Embedded (with Stream Analytics)

    Azure has become so popular partly due to the fair "pay by use" policy that it claims as one of its principles. Where is this with Power BI Embedded (with Stream Analytics)? I can't even get started with a real product for under $600/month. What if I need to start with 50 customers? In that case, I have to use something other than Power BI, and it will be a pain, at least, to move to Power BI once I am up to 1000 customers or so. Please stick to the Azure principles and make the "pay to play" game reasonable for…

    6 votes  ·  0 comments
