Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historic records or reference data to derive business insights quickly and easily.

  1. My streaming analytics job says FAILED

    ... and I am able to start it again. But I want to know why it failed, so it won't happen again in the future. It was down for half a day or so because I didn't notice it. I can't find any information as to why it failed, on either the new or the old portal.

    4 votes  ·  3 comments
  2. Allow Azure Data Explorer (ADX) as a reference input

    We leverage Azure Data Explorer (ADX) as our main reporting database, and as such we want to be able to use it as an input for reference data (since that is where we store it).
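    If ADX were supported, the query side would look like any other reference-data join; only the input definition would change. A minimal sketch, assuming hypothetical aliases TelemetryStream (streaming input) and AdxReference (the ADX-backed reference input):

        -- Reference-data joins in Stream Analytics use a plain JOIN
        -- with no temporal condition; AdxReference is hypothetical here.
        SELECT
            s.DeviceId,
            s.Temperature,
            r.PlantLocation
        INTO ReportingOutput
        FROM TelemetryStream s
        JOIN AdxReference r
            ON s.DeviceId = r.DeviceId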

    4 votes  ·  0 comments  ·  Inputs and Outputs
  3. Job Diagram

    The Job Diagram holds real promise as an intuitive, visual view of job execution and metrics, especially during development. In its current form, it's anything but that. When I select an output, it seems to default to showing out-of-order events only.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  4. Allow ASA to acquire bearer token for authenticating itself to Azure Function protected by App Service Authentication

    A possible output sink for an ASA job is Azure Functions. However, if the Function is protected by App Service Authentication (or any type of OAuth authentication), the flow will not work since ASA doesn't have the functionality to fetch bearer tokens.

    3 votes  ·  0 comments  ·  Security
  5. Service Bus topic output with a session-enabled subscription shows a Degraded state when Session Id is set as a system property

    When a Service Bus topic is used as output with a session-enabled subscription, and Session Id is set as a system property, the job state is reported as Degraded. The messages still pass to those subscriptions, but this is really confusing. Please let us know the reason for this.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  6. Failed to Get Framework Assemblies Local Path During Pushing Edge Package??

    I am getting the above error while deploying ASA on IoT Edge. Any idea?

    3 votes  ·  1 comment
  7. Compression type as a multiple-choice option

    Right now, the compression type of an ASA job input can only be set to a single option; that is, we can tell the ASA job to treat all events as gzip, deflate, or none. In real scenarios, the input events may arrive in various formats.
    It would be great to have this setting accept multiple options.
    Right now, to achieve this you have to create another ASA job with different settings, and then both of them produce a lot of warnings about deserialization errors.

    3 votes  ·  0 comments
  8. Allow definition of fields outside of stream.

    By definition this data is a stream; adding headers to each row severely limits the applicability, especially in IoT scenarios where device bandwidth may be constrained. We're having to use an intermediary to reformat data before putting it on another event hub just so Stream Analytics can use it, which introduces latency.

    3 votes  ·  0 comments
  9. Edit while job running to prevent downtime

    It should be possible to create inputs and outputs (but not edit existing ones, of course) and prepare a query without having to stop the entire job.

    With the current limitations we get periods of downtime due to having to stop the service while adding new functionality.

    3 votes  ·  0 comments
  10. Query version history

    Version history on queries would be nice to prevent accidental deletion and editing.

    3 votes  ·  1 comment
  11. Issue with renewing authorization for a Stream Analytics Data Lake Store output

    I am trying to set up (deploy) a Stream Analytics job with Data Lake Store as output using Terraform. Terraform internally calls the Azure API (through the Azure Go SDK) for Stream Analytics to set the Data Lake Store output, which expects a refresh token.

    As per the API documentation, it is recommended to put a dummy string value here when creating the data source and then go to the Azure portal to authenticate the data source, which will update this property with a valid refresh token. Hence, I passed a "dummy" string as the refresh token, and I could see the Data Lake…

    3 votes  ·  0 comments
  12. Allow Functions like CAST after TIMESTAMP BY

    My incoming timestamps have to be converted to ISO 8601, but only columns (and no functions) are allowed after the TIMESTAMP BY statement.
    Please allow all kinds of functions in order to provide a conversion. I think some folks here also need a Unix datetime converter :)
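    A minimal sketch of the kind of syntax being requested (not accepted at the time of this idea, per the description above); InputStream, EventTimeString, and UnixSeconds are hypothetical names:

        -- Cast an ISO 8601 string to datetime directly in TIMESTAMP BY
        SELECT DeviceId, Reading
        FROM InputStream
        TIMESTAMP BY CAST(EventTimeString AS datetime)

        -- Unix epoch variant using DATEADD
        SELECT DeviceId, Reading
        FROM InputStream
        TIMESTAMP BY DATEADD(second, UnixSeconds, '1970-01-01T00:00:00Z')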

    3 votes  ·  0 comments
  13. Adding R packages to operate on data

    It would help if the query window supported using R packages for massaging the data, the way Azure ML does, to support complex analytics.

    3 votes  ·  under review  ·  0 comments
  14. Output to Azure Postgres

    Just as Azure SQL is supported, it would be great to have an output connector for Azure Postgres.

    3 votes  ·  0 comments  ·  Inputs and Outputs
  15. Stream Analytics for IoT Edge removes custom message properties / doesn't allow setting message properties

    Stream Analytics for IoT Edge seems to strip out custom IoT message properties set by other modules, and I can't work out any way in Stream Analytics Edge to set these properties again.

    I see that for cloud-hosted stream jobs you can configure these properties on the output... but I can't work out how to do this for Edge jobs: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs?#custom-metadata-properties-for-output

    I've opened an issue on the GitHub IoT Edge repo with more details: https://github.com/Azure/iotedge/issues/2846

    3 votes  ·  0 comments  ·  Inputs and Outputs
  16. Ability to write the output to Azure File Share

    Currently, ASA can only write to blob storage or a database, not to Azure File Share. Provide the ability to write message data into Azure File Share folders. This should include the following features:
    1. The ability to define the file name.
    2. The ability to define operations like create, delete, and update as part of file writing.

    3 votes  ·  0 comments
  17. Multiple SA Jobs per Visual Studio project

    It would be convenient if it were possible to add multiple SA jobs (.asaql) to one project, making it possible to reuse inputs, outputs, and JavaScript functions. This would be even more convenient once C# development is possible with SA jobs.

    3 votes  ·  0 comments
  18. Testing query should not require reference data as input

    At present, testing a query requires all of the inputs, including the reference data, to be provided, but there is a limit on the size of the input data. Instead, testing should require only the input data stream; the reference data should be taken from the reference blob.

    3 votes  ·  under review  ·  0 comments
  19. Allow designation of "non-critical" output, for which data can be discarded on overflow

    We use our Stream Analytics jobs in our alerting, to process tens of thousands of events into possible alerts. The jobs write to multiple outputs; some of these are critical (containing the alerts), others are non-critical (containing, e.g., data to produce graphs for debugging/enriching alerts).

    When a non-critical output fails, our job stops working, thereby interrupting our alerting flow. This debug data could simply be discarded if that allowed the other outputs to keep functioning.
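    For context, a job like this writes to both outputs from a single query; the proposal would let the second output be marked discardable. A minimal sketch with hypothetical names (Telemetry, AlertsOutput, DebugGraphOutput):

        -- Critical output: the alerts themselves
        SELECT DeviceId, AlertLevel, System.Timestamp() AS AlertTime
        INTO AlertsOutput
        FROM Telemetry
        WHERE AlertLevel >= 3

        -- Non-critical output: raw data for debugging graphs, which
        -- the proposal would allow to be dropped on overflow
        SELECT DeviceId, Temperature, System.Timestamp() AS SampleTime
        INTO DebugGraphOutput
        FROM Telemetry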

    3 votes  ·  1 comment
  20. Device Command Queue as output

    I would like the possibility for an ASA job to dynamically send output messages. For example, an ASA job can determine that some command should be sent to some device. If I have thousands of devices, I have no chance to do this today.

    A few suggestions:

    1. Provide the IoT Hub command queue as an output.

    2. Also provide the possibility to forward messages to subscriptions like commands/device1, where device1 is the ID of the device.

    In general, ASA has a big limitation when dealing with queues (Service Bus): ASA queries deal with the message payload. By using a payload field like DeviceId, we could in case 2 (see…
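    A hedged sketch of the query side such an output could pair with; the output alias DeviceCommandOutput and the routing to commands/{DeviceId} are hypothetical, since no such output type exists today:

        -- Emit one command per offending device; the requested output type
        -- would route each row to that device's own command queue/subscription
        SELECT DeviceId, 'reduce-temperature' AS Command
        INTO DeviceCommandOutput
        FROM InputStream
        WHERE Temperature > 75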

    3 votes  ·  1 comment
