Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

  1. Invoke Functions in Batch & Parallel

    The current Functions output invokes the function one event at a time and guarantees ordering. In our scenario, ordering is not important, and we would like the output to support:


    1. Batching, invoking the function once per batch (an array as input)

    2. Invoking functions in parallel for scenarios where ordering does not matter (perhaps as a configuration option on the output)

    15 votes
    2 comments  ·  Inputs and Outputs
  2. Provide a mechanism to cache output state or values in fast storage like Redis

    This would help in cases where preceding values need to be compared without self joins.
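
    A partial alternative within the current query language is the LAG analytic function, which can look back at an earlier event over a bounded time window without a self join (it does not replace a true external cache such as Redis). A minimal sketch, with hypothetical input, column, and window values:

        SELECT
            deviceId,
            reading,
            -- previous reading from the same device within the last 5 minutes
            LAG(reading) OVER (PARTITION BY deviceId LIMIT DURATION(minute, 5)) AS previousReading
        FROM input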

    15 votes
    under review  ·  0 comments
  3. More secure connection options between Stream Analytics and Azure SQL Database

    Currently, Stream Analytics supports only username and password authentication. Several customers need more secure connection options, such as a certificate-based (MFA) model and Key Vault integration, to meet a higher security bar.

    15 votes
    0 comments
  4. Add Output to Azure Storage Queue

    Allow output to be sent to an Azure Storage queue. The use case would be to trigger an Azure Function/SignalR from the storage queue when a certain condition is found in the stream. Currently we are writing data to Azure Table storage, and if a specific condition is found, we would like to trigger another action or further processing of the row. Right now we have to write to a Service Bus topic/queue, develop a listener/trigger function application, and then process the record. Enabling a storage queue output would allow us to use only one storage account to hold the raw data,…

    15 votes
    1 comment
  5. Why can't Stream Analytics support Apache Kafka?

    It would be better if Stream Analytics supported Apache Kafka. We are worried that if we change from Event Hubs to Kafka, we will end up rewriting the consumers.

    14 votes
    3 comments
  6. How does one go about getting the client IP address for event data read from an event hub?

    We're sending event data to an event hub, but we want to know the IP address of the client that sent the event. How do we get access to this? In particular, we want to surface it in the output of a Stream Analytics query.
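
    If the client address is not available as event metadata, a common workaround is to have the sender (or an ingress gateway) stamp its IP into the event body and simply select it in the query. A minimal sketch, assuming a hypothetical ClientIp field added by the sender:

        SELECT
            eventId,
            ClientIp,                     -- hypothetical field written by the sender or gateway
            System.Timestamp AS eventTime
        FROM eventHubInput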

    14 votes
    2 comments
  7. Add Query Parameters to the SA job configuration settings.

    Create query parameters (stored in configuration settings) for the SA job so that common values can be controlled without modifying the query itself.
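
    Until query parameters exist, one workaround is to keep the adjustable values in a reference data input and join against it, so they can be changed by updating the reference blob rather than editing the query. A rough sketch with hypothetical input and column names:

        SELECT i.deviceId, i.temperature
        INTO alertOutput
        FROM input i
        JOIN thresholds t                 -- reference data input holding the configurable values
          ON i.deviceId = t.deviceId
        WHERE i.temperature > t.maxTemperature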

    13 votes
    2 comments
  8. Output Error Rows to a Queue or Blob

    Create a stream of unsuccessful rows that can be sent to an output, preferably a queue or blob. Currently I have no way of knowing which rows errored.
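
    A partial workaround today is to split the query into two statements over the same input and route rows that fail a validation predicate to their own output. This only catches rows that deserialize successfully; the names below are hypothetical:

        SELECT *
        INTO validOutput
        FROM input
        WHERE temperature IS NOT NULL

        SELECT *
        INTO errorOutput                  -- e.g. a blob or queue output for rejected rows
        FROM input
        WHERE temperature IS NULL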

    13 votes
    under review  ·  2 comments
  9. Input Deserialize Error Handling

    ASA needs a way to handle "Could not deserialize the input event as...". Currently, when this happens, the only way to resolve the error and get the stream working again is to create a custom receiver in .NET, find the time of the event that needs to be skipped, and restart the job with a start time after the bad event. Depending on retention, this can mean several days of data are missed, because the error stops processing entirely. Very frustrating when building a streaming solution.

    13 votes
    0 comments
  10. MongoDB output

    Add an output to a MongoDB collection, so that Stream Analytics can store information in a NoSQL document database such as MongoDB.

    13 votes
    2 comments
  11. Need to allow job output to be used as input to downstream jobs

    Today, if you want the output of one job to be the input to another, you must drop the output to blob storage or an event hub and have the second job pick the data up from there. This adds 30-60 seconds of latency per handoff.
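
    When both stages fit in the same job, the WITH clause already lets one query step feed another without the intermediate blob or event hub hop; the request is for the same chaining across separate jobs. A minimal sketch with hypothetical names:

        WITH Stage1 AS (
            SELECT deviceId, AVG(temperature) AS avgTemp
            FROM input
            GROUP BY deviceId, TumblingWindow(minute, 1)
        )
        SELECT deviceId, avgTemp
        INTO output
        FROM Stage1
        WHERE avgTemp > 75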

    13 votes
    3 comments
  12. Allow TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values

    Allow the TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values, that is, to take their length from a value in the query, for example:

    SlidingWindow(MINUTE, casedata.slidingwindow)
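
    For comparison, the window length must currently be a constant literal, as in this minimal sketch (input and column names are hypothetical):

        SELECT caseId, COUNT(*) AS eventCount
        FROM casedata
        GROUP BY caseId, SlidingWindow(minute, 5)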

    13 votes
    0 comments
  13. Allow the Management API to access ASA with authentication other than AAD

    Basically, AAD is designed for interactive (human) login. When we use the management API we are accessing ASA through automation, so any other authentication method, for example an API key, would be better than AAD.

    13 votes
    0 comments
  14. Restricted/Closed connections between Stream Analytics and Azure SQL Database

    Our customer wants more restricted/closed connections between Stream Analytics and Azure SQL Database (for example, restricted to the same subscription).
    Currently, they need to set “Allow access to Azure services: ON” on their Azure SQL Database (logical server), but this option allows connections from all Azure IP addresses, including those of other customers. Several customers need to allow access from within their own subscription only, to achieve a more secure setup.

    12 votes
    0 comments
  15. Rename Existing Stream Analytics Job

    Would like to be able to rename an existing Stream Analytics job. Without this, we have to delete the existing job and create a new one, copying everything over, and then we lose the ability to start "from last stopped".

    The obvious workaround is to make a note of when the job was stopped and start the new one from that time.

    But ideally, changing the name would be much simpler and less prone to error.

    12 votes
    0 comments
  16. Let us control whether the Power BI output enables a retention policy inside Power BI

    I believe the current private preview Power BI output inside ASA uses the basicFIFO retention policy inside Power BI described here:
    http://blogs.msdn.com/b/powerbidev/archive/2015/03/23/automatic-retention-policy-for-real-time-data.aspx

    Please enable a configuration option on the Power BI output in ASA so we can disable that retention policy.

    11 votes
    under review  ·  2 comments
  17. Please add a feature to specify the file name when outputting to Azure Blob Storage

    Currently, Stream Analytics does not support specifying a custom file name when writing output to Azure Blob Storage.
    I would like to specify an output file name with a timestamp, such as "YYYYMMDDMMSS.JSON", when writing to Azure Blob Storage.
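
    As a partial workaround, the blob output's path pattern supports {date} and {time} tokens, which at least puts a timestamp in the blob path even though the file name itself cannot be chosen. A sketch of the output configuration (the container and prefix are hypothetical):

        Path pattern:  telemetry/{date}/{time}
        Date format:   YYYY/MM/DD
        Time format:   HH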

    11 votes
    0 comments  ·  Advanced Analytics
  18. Allow Avro Schema Definition in Reference Data Input

    Right now you require Avro schemas to be embedded in the incoming message payload. To reduce payload size, it would be great if Avro schemas could be defined in Reference Data and versioned. Then in my Avro I could say schemaVersion=1.0, look up that schema in my query, and tell ASA to use it for that message.

    11 votes
    0 comments
  19. I'm unable to get my query output saved to an Azure SQL table

    I am unable to get a SQL DB output to work with my analytics job. My job is storing Application Insights export data into a SQL table. The error I'm receiving is that the SQL initialization connection string is invalid. All of the inputs I'm prompted for are correct, and I'm unable to see what connection string the portal is generating for the analytics job. I need assistance determining why the job cannot store output into my table.

    Error details:

    Error:
    - Format of the initialization string does not conform to specification starting at index 87.

    Message:
    Azure SQL Server…

    10 votes
    0 comments
  20. Input from a redis cache/queue

    It would be great if Azure Redis could be configured as an input source.

    10 votes
    1 comment