Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year. We will leverage first-party solutions for customer feedback.

Stream Analytics

Azure Stream Analytics is a fully managed real-time stream computation service hosted in Microsoft Azure, which provides highly resilient, low-latency, and scalable complex event processing of streaming data. Azure Stream Analytics enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. More secure connection options between Stream Analytics and Azure SQL Database

    Currently, Stream Analytics supports only username/password authentication. Several customers need more secure connection options, such as a certificate-based (MFA) model and Key Vault integration, to satisfy stricter security requirements.

    15 votes · 0 comments
  2. True "pay by use" pricing for Power-BI Embedded (with stream analytics)

    Azure has become popular partly due to the fair "pay by use" policy that it claims as one of its principles. Where is this with Power BI Embedded (with Stream Analytics)? I can't even get started with a real product for under $600/month. What if I need to start with 50 customers? In that case, I have to use something other than Power BI, and it will be a pain, at the least, to move to Power BI once I am up to 1000 customers or so... Please stick to the Azure principles and make the "pay to play" game reasonable for…

    6 votes · 0 comments
  3. Allow CI/CD deployment without start

    Please make it possible to run a CI/CD deployment with the option of leaving the job stopped (not started) after deployment.

    1 vote · 0 comments
  4. Multiple SA Jobs per Visual Studio project

    It would be convenient if it were possible to add multiple SA jobs (asasql) to one project, making it possible to reuse inputs, outputs, and JavaScript functions. This would be even more convenient once C# development is possible with SA jobs.

    3 votes · 0 comments
  5. persistent memory variable

    Our use case requires declaring in-memory variables, such as an associative array, and maintaining their values across time windows. Such a provision may require support for procedural code rather than SQL. We are considering Spark Streaming for this ease of use.
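
    For simple cases, a partial workaround in today's query language is to reach back to earlier events with LAG; this recovers prior event values, not arbitrary mutable state, which is the gap described above. A minimal sketch, with a hypothetical input and hypothetical field names:

    SELECT
        deviceId,
        reading,
        LAG(reading) OVER (PARTITION BY deviceId LIMIT DURATION(hour, 1)) AS previousReading
    FROM [sensor-input] TIMESTAMP BY eventTime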

    1 vote · 0 comments
  6. timezone conversion function

    Provide a datetime conversion function that converts a timestamp from one time zone to another. For example, IoT data generated in one location needs to be preserved in both the local and the corporate time zone. Similar to T-SQL AT TIME ZONE.
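
    For reference, this is the T-SQL construct the request points to, alongside the fixed-offset workaround available in ASA today (which ignores daylight saving time); the column and input names are hypothetical:

    -- T-SQL (not available in Stream Analytics):
    SELECT eventTime AT TIME ZONE 'UTC' AT TIME ZONE 'Tokyo Standard Time' AS localTime

    -- ASA workaround today: a fixed offset via DATEADD, which ignores DST
    SELECT DATEADD(hour, 9, eventTime) AS localTime FROM [iot-input]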

    2 votes · 0 comments
  7. Use TIMESTAMP BY clause to define the {date}/{time} tokens in the output sink to datalake

    I'm not sure if it does this already and I'm doing something wrong. I need to ingest historical data, but I want it to go into the {date}/{time} folder structure in my data lake. I need to use the datetime column in the message rather than the enqueued time or processed time, because it is historical data from a few years back. I would have thought it would use the TIMESTAMP BY column to do it, but evidently not.
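
    A sketch of what the request envisions, with hypothetical input, output, and column names: the application-time column named in TIMESTAMP BY would drive the {date}/{time} tokens of the output path, rather than the enqueue or processing time.

    -- output [datalake-output] configured with path pattern logs/{date}/{time}
    SELECT * INTO [datalake-output]
    FROM [historical-input] TIMESTAMP BY eventDateTime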

    4 votes · 0 comments
  8. Stream Analytics on Azure Stack

    Stream Analytics should be available on Azure Stack so that we can implement the IoT Reference Architecture https://aka.ms/iotrefarchitecture on-premises. IoT Hub coming to Azure Stack has already been announced; now Stream Analytics is the missing piece of the puzzle.

    21 votes · 1 comment
  9. Enable output message size of 1MB from Azure Stream Analytics to Premium Service Bus

    Premium Service Bus allows messages of up to 1 MB, but Stream Analytics can only send messages of up to 256 KB to a Premium Service Bus output.

    1 vote · 0 comments
  10. Please make it faster to restart, pathetically slow

    Restarting a job is painfully slow; please make it faster.

    4 votes · 0 comments
  11. Back-pressure on EventHub input

    We've noticed how SA behaves during spikes of Event Hub messages, and it appears no back-pressure mechanism is implemented on the AMQP receiver. Please add the link-credit mechanism to ensure SA does not run out of memory (if this is the case) when a spike of messages occurs.

    1 vote · 1 comment
  12. Compression types as multiple option

    Right now, we can set the compression type on an ASA job only as a single option: we can tell the job to treat all events as gzip, deflate, or none. In real scenarios, the input events may arrive in various formats.
    It would be great to have this setting accept multiple options.
    Currently, to achieve this you have to create another ASA job with different settings, and then both jobs produce a lot of warnings about failed deserialization.

    3 votes · 0 comments
  13. Ensure reasonable error codes while deploying SA and violating a policy

    • While trying to deploy a Stream Analytics job, the deployment is prohibited by an existing policy.
    • This is intentional, because I am responsible for the relevant policies in this subscription.
    • Thus, the problem is NOT that I can't deploy the SA job.
    • The problem is that the error code returned during deployment is useless, because it does not state the name or the ID of the policy that was violated. (Please see the attached image.)
    • In this particular case, I know which policy was responsible. But in other cases my colleagues or I won't know…

    15 votes · 0 comments
  14. Allow designation of "non-critical" output, for which data can be discarded on overflow

    We use our Stream Analytics jobs in our alerting, to process tens of thousands of events into possible alerts. The Stream Analytics jobs write to multiple outputs; some of these are critical (containing the alerts), others are non-critical (containing, e.g., data to produce graphs for debugging/enriching alerts).

    When a non-critical output fails, our job stops working, thereby interrupting our alerting flow. However, this debug data could simply be discarded if that allowed the other outputs to keep functioning.

    3 votes · 1 comment
  15. Issue in renewing authorisation for a Stream Analytics output to Data Lake Store

    I am trying to set up (deploy) a Stream Analytics job with Data Lake Store as output using Terraform. Terraform internally calls the Azure Stream Analytics API (through the Azure Go SDK) to set the Data Lake Store output, which expects a refresh token.

    As per the API documentation, it is recommended to put a dummy string value here when creating the data source and then go to the Azure Portal to authenticate the data source, which will update this property with a valid refresh token. Hence, I passed a "dummy" string as the refresh token, and I could see the Data Lake…

    3 votes · 0 comments
  16. Automated Query Testing

    Testing ASA queries requires manual interaction through the portal, as described in https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-test-query

    It would be nice to be able to create automated tests, i.e. an API that accepts test input and returns the resulting output.

    8 votes · 0 comments
  17. Why can't Stream Analytics support Apache Kafka?

    It would be better if Stream Analytics supported Apache Kafka. We are worried that if we change from Event Hubs to Kafka, we will end up rewriting the consumers.

    14 votes · 3 comments
  18. Azure Data Lake Store dynamic output path

    We want to store events in different folders based on a field in the query.
    I can imagine the output as a function that takes some parameters,

    e.g.
    SELECT * INTO [out($type)] FROM Input
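
    A limited workaround today is to fan out to a fixed set of outputs, one SELECT per known value of the field. A sketch with hypothetical output names; this only works when the set of values is known in advance, which is exactly why a dynamic path is being requested:

    SELECT * INTO [out-telemetry] FROM Input WHERE type = 'telemetry'
    SELECT * INTO [out-alerts] FROM Input WHERE type = 'alerts'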

    24 votes · 0 comments
  19. Allow TIMESTAMP BY with CROSS APPLY

    I have some incoming events, where each event contains an array with the actual data events.

    So I do CROSS APPLY. But I want to define the TIMESTAMP BY for the resulting rows, as the timestamp property is nested inside this array.

    This would result in the following query:

    SELECT *
    FROM
    [input] AS i
    CROSS APPLY
    GetArrayElements(i.records) AS c TIMESTAMP BY c.ArrayValue.time

    But I cannot do this (syntax error).

    Alternative idea was to wrap the above query in a view, then define TIMESTAMP BY on the usage of the view:

    WITH view1
    AS (
    SELECT
    c.ArrayValue.*
    FROM
    [myi-vm-logs] As i
    CROSS…

    18 votes · 1 comment
  20. Add or delete input while job is running

    Allow adding and deleting inputs and outputs of a running job without restarting it.

    0 votes · 0 comments