Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service in Microsoft Azure that provides resilient, low-latency, scalable complex event processing of streaming data. It lets developers combine streams of data with historical records or reference data to derive business insights quickly.

  1. Need to allow job output to be used as input to downstream jobs

    Today, if you want the output of one job to be the input to another, you must drop the output to Blob storage or an event hub and have the second job pick the data up from there. This adds 30–60 seconds of latency per handoff; a sketch of the pattern follows below.

    13 votes  ·  3 comments
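
    A minimal sketch of today's two-job handoff, assuming hypothetical input/output names (iotInput, ehHandoff configured as an Event Hub output in job A and as an input in job B, sqlOutput):

    -- Job A: writes its results to an intermediate Event Hub output
    SELECT deviceId, AVG(temperature) AS avgTemp
    INTO ehHandoff
    FROM iotInput
    GROUP BY deviceId, TumblingWindow(second, 10)

    -- Job B: reads the same Event Hub as its streaming input
    SELECT deviceId, avgTemp
    INTO sqlOutput
    FROM ehHandoff
    WHERE avgTemp > 80

    Each hop through the intermediate Event Hub is where the extra 30–60 seconds of latency accrues.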
  2. Management API may access ASA by authentication other than AAD

    AAD is designed for interactive human login. When we use the management API we are accessing ASA through automation, so an alternative authentication method, such as an API key, would be better suited than AAD.

    13 votes  ·  0 comments
  3. Restricted/Closed connections between Stream Analytics and Azure SQL Database

    Our customer wants more restricted connections between Stream Analytics and Azure SQL Database (for example, restricted to the same subscription).
    Currently they need to set “Allow access to Azure services: ON” on their Azure SQL Database logical server, but this option allows connections from all Azure IP addresses, including those of other customers. Several customers need to allow access only from within their own subscription to achieve better security.

    12 votes  ·  0 comments
  4. Support for gzip compression on Stream Analytics Output

    Add support for compressing CSV output to Azure Data Lake. This would greatly help with storage and retrieval for data pipelines that ASA can process easily. Compressing CSV makes the most sense because the format is supported throughout the ecosystem (U-SQL, Python and pandas, etc.).

    12 votes  ·  0 comments
  5. SQL Managed Instance Output

    Please investigate supporting SQL Managed Instance (as opposed to single DB) as an output for Stream Analytics.

    12 votes  ·  0 comments
  6. Why can't Stream Analytics support Apache Kafka?

    It would be better if Stream Analytics supported Apache Kafka. We are worried that if we switch from Event Hubs to Kafka we will end up rewriting the consumers.

    11 votes  ·  2 comments
  7. Allow Avro Schema Definition in Reference Data Input

    Right now Avro schemas must be embedded in the incoming message payload. To reduce payload size, it would be great if Avro schemas could be defined in reference data and versioned. Then my Avro payload could carry schemaVersion=1.0, my query could look up that schema, and ASA would use it for that message (see the sketch below).

    11 votes  ·  0 comments
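
    A rough sketch of the requested lookup, assuming a hypothetical reference input schemaRef keyed by a version column (this illustrates the proposal; it is not a supported feature today):

    -- Hypothetical: resolve each event's schema from versioned reference data
    SELECT e.deviceId, e.payload, s.schemaDefinition
    FROM eventsInput e
    JOIN schemaRef s
      ON e.schemaVersion = s.version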
  8. Allow UDF returning an array as parameter for 'CROSS APPLY GetArrayElements()'

    The following query raises the error "Function 'GetArrayElements' requires a field of type 'array' as a parameter", even though the function 'UDF.jsonify()' is in fact declared to return an array (a workaround sketch follows below).

    SELECT
        event.device_id AS device_id,
        sensor.ArrayValue.sensor_id AS sensor_id,
        CAST(sensor.ArrayValue.value AS FLOAT) AS value_number
    INTO
        toSQL
    FROM fromIoTHub AS event
    CROSS APPLY GetArrayElements(UDF.jsonify(event.data)) AS sensor

    10 votes  ·  0 comments
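
    One workaround pattern (a sketch; step and field names are illustrative) is to materialize the UDF result as a named field in a WITH step, so that GetArrayElements receives a field reference rather than a function call:

    -- Step 1: call the UDF once and give its array result a field name
    WITH parsed AS (
        SELECT
            event.device_id,
            UDF.jsonify(event.data) AS sensors
        FROM fromIoTHub AS event
    )
    -- Step 2: expand the named array field
    SELECT
        parsed.device_id AS device_id,
        sensor.ArrayValue.sensor_id AS sensor_id,
        CAST(sensor.ArrayValue.value AS FLOAT) AS value_number
    INTO
        toSQL
    FROM parsed
    CROSS APPLY GetArrayElements(parsed.sensors) AS sensor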
  9. Input from a redis cache/queue

    It would be great if Azure Redis could be configured as an input source.

    10 votes  ·  1 comment
  10. Input Deserialize Error Handling

    ASA needs a way to handle "Could not deserialize the input event as…" errors. Currently, when this happens, the only way to resolve the error and get the stream working again is to build a custom receiver in .NET to find the event time that needs to be skipped, then restart the job with a start datetime after the bad event. Because the error stops processing entirely, this can mean several days of data are missed, depending on retention. Very frustrating when building a streaming solution.

    10 votes  ·  0 comments
  11. Please restore "Output Last Processed Time" metric at job diagram

    I check the "Output Last Processed Time" metric on the job diagram every day, so I would like this metric to be restored there.

    10 votes  ·  0 comments

    Hi, thanks for your comments.
    We are making some changes to the job diagram, and unfortunately had to remove this information for now. We’re investigating how to add it back.
    What is your usage of Output Last Processed Time? If it's to monitor whether your streaming pipeline is behaving as expected, you may want to use the “output watermark delay”. When this number goes up, it means your pipeline is getting delayed. More info here: https://azure.microsoft.com/en-us/blog/new-metric-in-azure-stream-analytics-tracks-latency-of-your-streaming-pipeline/
    Thanks,
    JS

  12. Fix the CI/CD NuGet package so that it works in VSTS

    Please fix the CI/CD NuGet package. It currently breaks in VSTS, failing to load the Newtonsoft.Json package.

    10 votes  ·  3 comments
  13. Allow Creation of Azure Storage outputs with Shared Access Signatures

    Currently, creating an output to an Azure Storage account requires the account name and key.

    It would be really useful to be able to create those outputs using a SAS URL/token, to avoid unneeded sharing of keys when they are recycled or when creating a new Stream Analytics output.

    9 votes  ·  1 comment
  14. High Resource Error Message

    During job execution, the message about high resource utilization should not be logged at the "Informational" level but at "Critical", since high utilization will lead to delayed or failed execution.

    9 votes  ·  0 comments
  15. CONVERT function is missing for hex values

    The CONVERT function is missing for hex values. I want to write something like CONVERT(INT, CONVERT(VARBINARY, SUBSTRING(payload, 3, 2), 2)) AS Humidity.
    Aggregate functions should then work on the result, for example AVG(Humidity) AS [AverageHumidity] (a workaround sketch follows below).

    9 votes  ·  0 comments
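
    Until a built-in hex CONVERT exists, one workaround pattern is a JavaScript UDF that parses the hex bytes. A sketch of the query side, assuming a hypothetical UDF named hexToInt is registered on the job (payload and hexInput are illustrative names):

    -- Workaround sketch: a JavaScript UDF (hypothetical) parses hex to an integer
    SELECT
        AVG(UDF.hexToInt(SUBSTRING(payload, 3, 2))) AS [AverageHumidity]
    FROM hexInput
    GROUP BY TumblingWindow(minute, 1)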
  16. Allow TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values

    Allow the TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values, that is, to take the window size from a value in the query, for example (contrast with the fixed-size form sketched below):

    SlidingWindow(MINUTE, casedata.slidingwindow)

    9 votes  ·  0 comments
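
    For contrast, window sizes today must be constant literals; a minimal sketch of the currently supported fixed-size form (input and field names are illustrative):

    -- Today: the window duration must be a literal constant
    SELECT caseId, COUNT(*) AS eventCount
    FROM caseInput
    GROUP BY caseId, SlidingWindow(minute, 5)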
  17. Ability to use more than one source for reference data

    It would be nice to be able to use reference data from more than one source. I'm aware that the possibility of using a SQL table or a Storage table is under consideration, but this is different.

    Usage scenario for an IoT application: you have inbound sensor data that you need to process. You have reference data (such as thresholds that differ per sensor) and customer configurations that you need to check against the incoming data (see the sketch below).
    Some of that data might be stored in a SQL table, other data in Table storage, or updated in Blob storage.
    In…

    9 votes  ·  0 comments
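
    A sketch of the kind of query this would enable, assuming two hypothetical reference inputs (thresholdRef backed by a SQL table, configRef backed by Blob storage):

    -- Hypothetical: join one stream against reference data from two sources
    SELECT s.sensorId, s.reading, t.maxThreshold, c.customerPlan
    FROM sensorInput s
    JOIN thresholdRef t ON s.sensorId = t.sensorId
    JOIN configRef c ON s.customerId = c.customerId
    WHERE s.reading > t.maxThreshold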
  18. How does one go about getting the client IP address for event data read from an event hub?

    We're sending event data to an event hub, but we want to know the IP address of the client that sent each event. How do we get access to this? In particular, we want to see it in the output of a Stream Analytics query.

    9 votes  ·  0 comments
  19. Scaling up streaming units should be transparent and not require the job to be stopped

    Currently, adjusting the streaming units requires stopping the job in the portal. It would be very beneficial if the scale could be adjusted on the fly without a job stop/start.

    8 votes  ·  0 comments
  20. Let us control whether the Power BI output enables a retention policy inside Power BI

    I believe the current private preview Power BI output inside ASA uses the basicFIFO retention policy inside Power BI described here:
    http://blogs.msdn.com/b/powerbidev/archive/2015/03/23/automatic-retention-policy-for-real-time-data.aspx

    Please enable a configuration option on the Power BI output in ASA so we can disable that retention policy.

    8 votes  ·  under review  ·  2 comments
