Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, scalable complex event processing of streaming data. It lets developers combine streams of data with historical records or reference data to derive business insights quickly.

How can we improve Microsoft Azure Stream Analytics?

  • “Output contains multiple rows …” warning

    I'm not totally sure I'm right here; please let me know if this should go anywhere else. I posted a question on Stackoverflow yesterday (https://stackoverflow.com/questions/45979008/azure-stream-analytics-output-contains-multiple-rows-warning) and was redirected here.

    We're using a Stream Analytics component in Azure to send data (log messages from different web apps) to a table storage account. The messages are retrieved from an Event Hub, but I think this doesn't matter here.

    Within the Stream Analytics component we defined an output for the table storage account including partition and row key settings. As of now the partition key will be the name of the…

    1 vote · 1 comment
    • Add ability to use something other than time for windowing

      It would be great to provide functionality where you can window (tumbling, sliding or hopping) on something in the input stream other than time.

      Having the ability to compute aggregates based on a different item in the input stream opens up other opportunities.

      Currently I have to do this in another processing engine, which removes Stream Analytics from the solution.
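As an illustration of the request, here is a minimal Python sketch of a count-based tumbling window, grouping on arrival count instead of time (all names and data are hypothetical):

```python
from collections import defaultdict

def count_tumbling_avg(events, key, size):
    """Average `key` over fixed-size count windows (every `size` events)."""
    windows = defaultdict(list)
    for i, event in enumerate(events):
        windows[i // size].append(event[key])
    return [sum(vs) / len(vs) for _, vs in sorted(windows.items())]
```

For example, `count_tumbling_avg([{"v": x} for x in (1, 2, 3, 4, 5, 6)], "v", 3)` yields `[2.0, 5.0]`.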

      1 vote · 0 comments
      • TIMESTAMP BY and duplicates

        How does a custom TIMESTAMP BY clause deal with duplicate events?

        1 vote · 0 comments
        • Have a smaller unit size (i.e. for lab/demo environments)

          Stream Analytics is great, but it's a bit much to pay for a full unit-hour when you have very low processing requirements. I've got a couple of demo and lab environments I'm working with now and have to keep stopping the jobs, as each costs about $80/month to run.

          The simplest fix for this would be to have a smaller unit size ("shared" unit?) to make it easier to get started with projects.

          8 votes · 0 comments
          • Save as Parquet for serverless data-pipeline scenarios

            If a Stream Analytics job could read CSV data from Blob Storage and save it to Azure Data Lake Store in Parquet format, that would be awesome. I wouldn't have to spin up HDInsight; I could run a continuous job serverlessly and create a data pipeline for the conversion.

            4 votes · 0 comments
            • High Resource Error Message

              During job execution, the message on high resource utilization shouldn't be at the "Informational" level but "Critical", since it will lead to delayed or failed execution.

              10 votes · 0 comments
              • Visual Studio 2017 plugin

                Please add a Visual Studio 2017 plugin.

                1 vote · 1 comment
                • This application is broken

                  It cannot use Azure Blob Storage to hold the jobs.

                  1 vote · 0 comments
                  • Input from Azure Data Lake Storage

                    Data Lake is already available as an output, which is great, but it should be available as an input as well. E.g., we continuously save data to Data Lake Storage and want to stream it to Power BI.

                    6 votes · 1 comment
                    • There should be Boolean support between Azure Stream Analytics and the Machine Learning Services

                      The Stream Analytics job doesn't support Boolean variables as input (they are treated as strings), so I had to transform them into viable inputs in ML Studio. It's not a blocking problem, but it took me a while to understand that my Machine Learning service was expecting a Boolean that the stream job wasn't able to provide.
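A sketch of the kind of workaround described, assuming the Boolean arrives as the string "True"/"False" (the helper and field names are hypothetical):

```python
def coerce_bools(record, bool_fields):
    """Convert "True"/"False" strings back into real booleans before the
    record is handed to the Machine Learning service."""
    out = dict(record)
    for field in bool_fields:
        value = out.get(field)
        if isinstance(value, str):
            out[field] = value.strip().lower() == "true"
    return out
```

For example, `coerce_bools({"isActive": "True"}, ["isActive"])` returns `{"isActive": True}`.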

                      4 votes · 0 comments
                      • For a given windowing mechanism, allow operating across the rows extracted from the inputs

                        Today a UDF operates row by row; it would be good to have a UDF that can operate across rows. This enables writing more powerful functions.
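A sketch of what such a cross-row UDF might look like if it received all rows of a window at once (Python; the field name is hypothetical):

```python
def median_temperature(rows):
    """Consume every row in a window at once, rather than being invoked
    row by row as UDFs are today."""
    values = sorted(row["temperature"] for row in rows)
    n = len(values)
    mid = n // 2
    return values[mid] if n % 2 else (values[mid - 1] + values[mid]) / 2
```

A median is a natural example because it cannot be computed incrementally from one row at a time.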

                        1 vote · 0 comments
                        • Allow Creation of Azure Storage outputs with Shared Access Signatures

                          Currently creating an output to an Azure Storage account requires the account name and key.

                          It would be really useful to be able to create those outputs using a SAS URL/token, to avoid unnecessary sharing of keys when they are recycled or when creating a new Stream Analytics output.

                          9 votes · 0 comments
                          • Allow Avro Schema Definition in Reference Data Input

                            Right now you require Avro schemas to be embedded in the incoming message payload. To reduce payload size, it would be great if Avro schemas could be defined in Reference Data and versioned. Then in my Avro message I could say schemaVersion=1.0, look up that schema in my query, and tell ASA to use it for that message.
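The lookup being proposed could be as simple as the following sketch (the schema contents and message fields are hypothetical):

```python
# Versioned Avro schemas that would live in Reference Data instead of
# being embedded in every message payload
SCHEMAS = {
    "1.0": {"type": "record", "name": "Telemetry",
            "fields": [{"name": "deviceId", "type": "string"},
                       {"name": "temperature", "type": "double"}]},
}

def resolve_schema(message):
    """Pick the Avro schema named by the message's schemaVersion field."""
    return SCHEMAS[message["schemaVersion"]]
```

For example, `resolve_schema({"schemaVersion": "1.0"})` returns the `Telemetry` record schema.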

                            7 votes · 0 comments
                            • Query editor performance problems

                              When I open the Query Editor in a Stream Analytics job, my CPU usage spikes and the editor performance can often be measured on the order of _seconds_ per keystroke/mouse click. This reproduces across Edge, IE, and Waterfox.

                              Attached is an edge perf trace of one such incident. At ~2.5s, the browser's FPS goes to 0. It remains that way until after almost the 25s mark. A *brief* analysis indicates that it's an issue inside the disposal portion of merge.

                              1 vote · 0 comments
                              • Allow TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values

                                Allow the TumblingWindow, HoppingWindow and SlidingWindow constructs to use dynamic values; that is, allow them to use a value from the query:

                                SlidingWindow(MINUTE, casedata.slidingwindow)
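For contrast, here is a minimal Python sketch of a fixed-size tumbling-window average; making `window_minutes` come from the event itself is exactly what the current constructs don't allow (names and data are illustrative):

```python
from collections import defaultdict

def tumbling_avg(events, window_minutes):
    """Average (epoch_seconds, value) pairs over fixed tumbling windows."""
    size = window_minutes * 60
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[(ts // size) * size].append(value)
    return {start: sum(vs) / len(vs) for start, vs in sorted(buckets.items())}
```

For example, `tumbling_avg([(0, 1.0), (59, 3.0), (60, 5.0)], 1)` yields `{0: 2.0, 60: 5.0}`.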

                                4 votes · 0 comments
                                • Error message says user-defined functions must start with "udf", but Machine Learning functions don't

                                  While it could be argued whether an Azure ML function is or is not a UDF, if a user mistypes the name of the AML function, the error says user-defined functions require the udf prefix.

                                  1 vote · 0 comments
                                  • Make Stream Analytics fully configurable from ARM templates

                                    On the portal we can easily configure input/output settings and the UI is nice, but we can't find a way to configure an input from IoT Hub using ARM alone.

                                    The quickstart templates currently demonstrate neither inputs nor outputs: https://github.com/Azure/azure-quickstart-templates/tree/master/101-streamanalytics-create

                                    We already requested more templates with examples on GitHub https://github.com/Azure/azure-quickstart-templates/issues/2635 , but maybe somebody will read this request here too.

                                    An example of my issue: I can't find a way to set the SharedAccessPolicyKey from IoT Hub in ASA:

                                    "inputs": [
                                    {
                                    "Name": "IoTHub",
                                    "Properties": {
                                    "DataSource": {
                                    "Properties": {
                                    "ConsumerGroupName": "$Default",
                                    "IotHubNamespace": "igorsychhub",
                                    "SharedAccessPolicyKey": null,
                                    "SharedAccessPolicyName": "iothubowner" …

                                    45 votes · 1 comment
                                    • Give a decent error message instead of "Stream Analytics job has validation errors: The given key was not present in the dictionary."

                                      Any time there is some kind of dictionary, you need to tell us WHICH dictionary and WHAT the key was. Otherwise, the error is completely useless. You might as well say, "Something went wrong".

                                      I've gone over all of my field names from input, query and output. They all match. So far, ASA has been pretty good with errors but this is terrible.

                                      3 votes · 0 comments
                                      • Provide pass-through support for ISO8601 input strings (retain offset)

                                        At present, any input data that includes an ISO 8601 date in a local-time-plus-offset format is converted to UTC on output:

                                        Inbound: 2017-03-06T04:05:50.550+11:00 (Local + UTC Offset)

                                        Outbound: 2017-03-05T17:05:50.5500000 (UTC)

                                        As a result I no longer have the ability to determine the origination time zone, which is an issue.
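The behaviour described can be reproduced with Python's standard library (3.7+): parsing keeps the +11:00 offset, but converting to UTC, as ASA does on output, discards it:

```python
from datetime import datetime, timezone

inbound = "2017-03-06T04:05:50.550+11:00"
local = datetime.fromisoformat(inbound)   # offset-aware local time
utc = local.astimezone(timezone.utc)      # 2017-03-05T17:05:50.550000+00:00
offset = local.utcoffset()                # the origination offset lost on output
```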

                                        15 votes · 0 comments
                                        • Support for custom aggregation functions

                                          Reason:

                                          Arithmetic average is not always the correct way to calculate an average.
                                          For example to calculate the average wind direction a vector calculation is required.
                                          Below is the calculation (using Python) outlined as an example.

                                          import numpy as np

                                          # Calculate the speed-weighted vector average wind direction (degrees)
                                          def average_wind_direction(speeds, directions_deg):
                                              rad = np.radians(directions_deg)
                                              sa = np.sum(np.asarray(speeds) * np.sin(rad)) / rad.size
                                              ca = np.sum(np.asarray(speeds) * np.cos(rad)) / rad.size
                                              return np.degrees(np.arctan2(sa, ca))

                                          Today it is not possible to do this sort of calculation in Stream Analytics. User-defined functions or Azure Functions can't be used either, as both are stateless and can't store the intermediate values required to…

                                          1 vote · 1 comment