Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly and easily.

How can we improve Microsoft Azure Stream Analytics?


1. Allow Creation of Azure Storage outputs with Shared Access Signatures

Currently, creating an output to an Azure Storage account requires the account name and key.

It would be really useful to be able to create those outputs using a SAS URL/token, to avoid sharing account keys unnecessarily, both when keys are rotated and when creating a new Stream Analytics output (see the sketch below).

12 votes · 0 comments
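A minimal sketch of the kind of credential this request has in mind, using the azure-storage-blob v12 Python SDK (the account, container, and key below are hypothetical): a write-only, time-limited container SAS token that a Stream Analytics output could be given instead of the full account key.

    # Hypothetical account/container; generates a write-only, time-limited SAS token.
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import generate_container_sas, ContainerSasPermissions

    sas_token = generate_container_sas(
        account_name="mystorageaccount",          # hypothetical
        container_name="asa-output",              # hypothetical
        account_key="<storage-account-key>",
        permission=ContainerSasPermissions(write=True, add=True, create=True),
        expiry=datetime.now(timezone.utc) + timedelta(days=30),
    )
    print(sas_token)  # the token ASA could accept in place of the account key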
2. Allow Avro Schema Definition in Reference Data Input

Right now Avro schemas are required to be embedded in the incoming message payload. To reduce payload size, it would be great if Avro schemas could be defined in Reference Data and versioned. Then my Avro message could carry schemaVersion=1.0, I could look up that schema in my query, and I could tell ASA to use that schema for the message (a rough sketch follows below).

4 votes · 0 comments
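A rough illustration of the requested pattern in Python with fastavro (the schema, field names, and the idea of carrying schemaVersion alongside the message are assumptions for the sketch): the schema lives in a versioned lookup table rather than in each payload, so messages stay small.

    import io
    import fastavro

    # Stand-in for versioned reference data: schemaVersion -> Avro schema
    SCHEMAS = {
        "1.0": {
            "type": "record",
            "name": "Reading",
            "fields": [
                {"name": "deviceId", "type": "string"},
                {"name": "temperature", "type": "double"},
            ],
        },
    }

    def encode(record, version):
        buf = io.BytesIO()
        fastavro.schemaless_writer(buf, SCHEMAS[version], record)  # schema is NOT embedded
        return buf.getvalue()

    def decode(payload, version):
        return fastavro.schemaless_reader(io.BytesIO(payload), SCHEMAS[version])

    msg = encode({"deviceId": "dev-1", "temperature": 21.5}, "1.0")
    print(len(msg), decode(msg, "1.0"))  # small payload; schema resolved by version lookup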
3. Query editor performance problems

When I open the Query Editor in a Stream Analytics job, my CPU usage spikes and editor performance can often be measured in seconds per keystroke or mouse click. This reproduces across Edge, IE, and Waterfox.

Attached is an Edge performance trace of one such incident. At roughly the 2.5 s mark the browser's FPS drops to 0 and stays there until almost the 25 s mark. A brief analysis indicates the issue is inside the disposal portion of merge.

1 vote · 0 comments
4. Allow TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values

Allow the TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values; that is, allow these constructs to take a value from the query, for example:

    SlidingWindow(MINUTE, casedata.slidingwindow)

(A plain-Python illustration of the requested semantics follows below.)

1 vote · 0 comments
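A plain-Python sketch of the semantics being requested (the event tuples are hypothetical): each event carries its own window length, and the aggregate covers the events that fall inside that event's window.

    from bisect import bisect_left

    def sliding_count(events):
        """events: list of (timestamp_seconds, window_seconds), sorted by timestamp."""
        times = [t for t, _ in events]
        return [
            len(times[bisect_left(times, t - w):i + 1])  # events within this event's own window
            for i, (t, w) in enumerate(events)
        ]

    print(sliding_count([(0, 60), (30, 60), (90, 30), (100, 300)]))  # -> [1, 2, 1, 4]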
5. Error message says user-defined functions must start with "udf", even for Machine Learning functions

While it could be argued whether an Azure ML function is or is not a UDF, if a user mistypes the name of the Azure ML function, the error says that user-defined functions require the "udf" prefix.

1 vote · 0 comments
6. Make Stream Analytics fully configurable from an ARM template

Today the portal makes it easy to configure input and output settings, and the UI is nice, but we cannot find a way to configure an IoT Hub input using ARM alone.

In the quickstart templates, neither inputs nor outputs are demonstrated at the moment: https://github.com/Azure/azure-quickstart-templates/tree/master/101-streamanalytics-create

We have already requested more templates with examples on GitHub (https://github.com/Azure/azure-quickstart-templates/issues/2635), but maybe somebody will read this request here too.

An example of my issue: I can't find a way to set the SharedAccessPolicyKey for an IoT Hub input in ASA:

    "inputs": [
      {
        "Name": "IoTHub",
        "Properties": {
          "DataSource": {
            "Properties": {
              "ConsumerGroupName": "$Default",
              "IotHubNamespace": "igorsychhub",
              "SharedAccessPolicyKey": null,
              "SharedAccessPolicyName": "iothubowner" …

18 votes · 1 comment
7. Give a decent error message instead of "Stream Analytics job has validation errors: The given key was not present in the dictionary."

Any time there is some kind of dictionary, you need to tell us WHICH dictionary and WHAT the key was; otherwise the error is completely useless. You might as well say, "Something went wrong."

I've gone over all of my field names from input, query and output, and they all match. So far ASA has been pretty good with errors, but this one is terrible.

3 votes · 0 comments
8. Provide pass-through support for ISO8601 input strings (retain offset)

At present, any input data that includes an ISO8601 timestamp with a UTC offset is converted to UTC on output:

Inbound: 2017-03-06T04:05:50.550+11:00 (local time + UTC offset)

Outbound: 2017-03-05T17:05:50.5500000 (UTC)

As a result I no longer have the ability to determine the origination time zone, which is an issue (illustrated in the sketch below).

15 votes · 0 comments
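A small Python illustration of what gets lost (the timestamp is the one from the example above): once the value is normalised to UTC, the original offset, and with it the origination time zone, is unrecoverable.

    from datetime import datetime, timezone

    inbound = "2017-03-06T04:05:50.550+11:00"
    parsed = datetime.fromisoformat(inbound)
    print(parsed.utcoffset())                 # 11:00:00 -> origination offset still known
    as_utc = parsed.astimezone(timezone.utc)
    print(as_utc.isoformat())                 # 2017-03-05T17:05:50.550000+00:00
    print(as_utc.utcoffset())                 # 0:00:00  -> the original offset is gone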
9. Support for custom aggregation functions

Reason: an arithmetic average is not always the correct way to calculate an average. For example, calculating the average wind direction requires a vector calculation. Below is the calculation (in Python, made self-contained) outlined as an example:

    import numpy as np

    # Speed-weighted vector mean of wind direction (directions in degrees)
    def avg_wind_direction(speeds, directions_deg):
        rad = np.radians(directions_deg)
        sa = np.sum(speeds * np.sin(rad)) / len(directions_deg)
        ca = np.sum(speeds * np.cos(rad)) / len(directions_deg)
        return np.degrees(np.arctan2(sa, ca))

Today it is not possible to do this sort of calculation in Stream Analytics. User-defined functions or Azure Functions can't be used either, as both are stateless and can't store the intermediate values required to…

1 vote · 1 comment
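A quick check of the avg_wind_direction sketch above, with hypothetical readings: directions of 350° and 10° average to roughly 0°, whereas a plain arithmetic mean would report a misleading 180°.

    import numpy as np

    print(avg_wind_direction(np.array([1.0, 1.0]), np.array([350.0, 10.0])))  # ~0.0, not 180.0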
10. Support Power BI Embedded as an output

Since Power BI Embedded allows people to embed dashboards in their applications, it would be nice to enable near-real-time dashboarding by having ASA push output to Power BI Embedded.

72 votes · 1 comment
11. When creating an ASA job input, the 'value must not be empty' warning for the Event Hub Policy Name field is inconsistent

* If no policies are created, it defaults to RootManageSharedAccessKey and shows a 'must not be empty' warning on the side, which disables creation.

* After updating the policies in another window, this window won't refresh automatically; my workaround was to select a different Service Bus namespace and then revert to the one I wanted. After doing this, the Event Hub Policy Name updates to my newly created policy, but the warning is still there (and Create is still disabled).

* I can now manually select RootManageSharedAccessKey, which removes the warning and allows creation.

* I can…

1 vote · 0 comments
12. When creating an ASA job input, prefer a service bus in the same resource group

When I 'Create Input' in the new portal for my ASA job, it defaults to the first service bus it finds. If there is a service bus in the same resource group, it should default to that instead.

1 vote · 0 comments
13. Add Publish target to asaproj / StreamAnalytics.targets

The Visual Studio add-in allows GUI-driven deployment to Azure via the 'Submit to Azure' button in the ASQL editor.

We need a way to drive this from MSBuild, such as in a continuous integration build script.

A 'Publish' or 'Deploy' target, such as what is available in various ASP.NET project types, would let us do that.

15 votes · 0 comments
14. Function to check whether a point is inside or outside a circle

For geofencing it would be nice to have a function to check whether a point is inside or outside a circle (or square, or polygon).

An extension of ST_WITHIN that accepts a point and a circle would do (the check is sketched below).

1 vote · 1 comment
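A plain-Python sketch of the check being asked for (coordinates and radius are hypothetical): compare the great-circle distance between the point and the circle's centre against the radius. Presumably a similar comparison could be built from ST_DISTANCE today, but a first-class point-in-circle test is what the idea requests.

    from math import radians, sin, cos, asin, sqrt

    def in_circle(lat, lon, center_lat, center_lon, radius_m):
        """True if (lat, lon) lies within radius_m metres of the circle's centre."""
        dlat, dlon = radians(center_lat - lat), radians(center_lon - lon)
        a = sin(dlat / 2) ** 2 + cos(radians(lat)) * cos(radians(center_lat)) * sin(dlon / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a)) <= radius_m   # haversine distance vs. radius

    print(in_circle(52.3702, 4.8952, 52.3676, 4.9041, 1000))  # points ~670 m apart -> True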
15. Add Output to Azure Storage Queue

Allow output to be sent to an Azure Storage queue. The use case is to trigger an Azure Function/SignalR from the storage queue when a certain condition is found in the stream (the queue side is sketched below). Currently we write data to Azure Table storage, and when a specific condition is found we would like to trigger another action or further processing of the row. Right now we have to write to a Service Bus topic/queue, develop a listener/trigger function application, and then process the record. Enabling a storage queue output would allow us to use only one storage account to hold the raw data,…

6 votes · 0 comments
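A sketch of the target pattern with the azure-storage-queue v12 Python SDK (the connection string, queue name, and message contents are hypothetical): the alert lands on a storage queue where a queue-triggered Azure Function can pick it up; the request is for ASA to write such messages directly.

    import json
    from azure.storage.queue import QueueClient

    queue = QueueClient.from_connection_string(
        conn_str="<storage-connection-string>",  # hypothetical
        queue_name="asa-alerts",                 # hypothetical
    )
    queue.send_message(json.dumps({"deviceId": "dev-1", "alert": "temperature-high"}))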
16. Time zone support

It would be great to have time zone support. I need to report my event trends in Pacific Time, and there is currently no way to automate this, even with JavaScript UDFs (the conversion being asked for is sketched below).

7 votes · 0 comments
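A Python sketch of the desired conversion (using the standard-library zoneinfo, Python 3.9+): the same UTC instant maps to PST or PDT depending on the date, which is exactly why a fixed-offset workaround in a UDF isn't enough.

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    pacific = ZoneInfo("America/Los_Angeles")
    for utc in (datetime(2017, 1, 15, 12, tzinfo=timezone.utc),
                datetime(2017, 7, 15, 12, tzinfo=timezone.utc)):
        print(utc.isoformat(), "->", utc.astimezone(pacific).isoformat())
    # 2017-01-15T12:00:00+00:00 -> 2017-01-15T04:00:00-08:00  (PST)
    # 2017-07-15T12:00:00+00:00 -> 2017-07-15T05:00:00-07:00  (PDT)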
17. Parquet Support

It would be great to support the Parquet file format for output from Stream Analytics. This format seems to be trending up in popularity within the Spark and Hadoop ecosystem (see the sketch below).

5 votes · 0 comments
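For illustration, the kind of file being asked for, written with pyarrow in Python (column names and values are hypothetical):

    import pyarrow as pa
    import pyarrow.parquet as pq

    table = pa.Table.from_pydict({"deviceId": ["dev-1", "dev-2"], "temperature": [21.5, 19.0]})
    pq.write_table(table, "readings.parquet")  # columnar output readable by Spark/Hadoop tools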
18. Scaling up streaming units should be transparent and not require the job to be stopped

Currently, adjusting the streaming units requires the job to be stopped in the portal. It would be very beneficial if the scale could be adjusted on the fly, without a job stop/start.

4 votes · 0 comments
19. Support decompressing content such as gzipped JSON

Allow input content, such as gzipped JSON, to be decompressed. Please see the attached picture (and the sketch below).

40 votes · 0 comments
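A minimal Python sketch of the requested behaviour (the payload contents are hypothetical): the producer gzips its JSON, and the consumer, here standing in for the ASA input, transparently decompresses it before parsing.

    import gzip
    import json

    payload = gzip.compress(json.dumps({"deviceId": "dev-1", "temperature": 21.5}).encode("utf-8"))
    event = json.loads(gzip.decompress(payload).decode("utf-8"))
    print(len(payload), event)  # compressed size on the wire, then the decoded event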