Stream Analytics

Azure Stream Analytics is a fully managed, real-time stream computation service hosted in Microsoft Azure that provides highly resilient, low-latency, and scalable complex event processing of streaming data. It enables developers to combine streams of data with historical records or reference data to derive business insights quickly.
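
For a flavor of what a job's query looks like, here is a minimal sketch in the Stream Analytics query language (the input, reference-data, and output names are hypothetical) that joins a telemetry stream with blob-based reference data and counts events per device type over one-minute tumbling windows:

    -- Minimal sketch; stream, reference, and output names are placeholders.
    SELECT
        ref.DeviceType,
        COUNT(*) AS EventCount,
        System.Timestamp AS WindowEnd
    INTO
        aggregatedOutput
    FROM
        telemetryStream TIMESTAMP BY eventTime
    JOIN
        deviceReference ref
        ON telemetryStream.deviceId = ref.deviceId
    GROUP BY
        ref.DeviceType,
        TumblingWindow(minute, 1)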

  1. How does one go about getting the client IP address for event data read from an event hub?

    We're sending event data to an event hub, but we want to know the IP address of the client that sent the event. How do we get access to this? In particular, we want to be able to see it in the output of a Stream Analytics query.

    9 votes · 0 comments
  2. Scaling up Streaming Units should be transparent and not require the job to be stopped

    Currently, adjusting the Streaming Units requires the job to be stopped in the portal. It would be very beneficial if the scale could be adjusted on the fly without requiring a job stop/start.

    8 votes · 0 comments
  3. Let us control whether the Power BI output enables a retention policy inside Power BI

    I believe the current private preview Power BI output inside ASA uses the basicFIFO retention policy inside Power BI described here:
    http://blogs.msdn.com/b/powerbidev/archive/2015/03/23/automatic-retention-policy-for-real-time-data.aspx

    Please enable a configuration option on the Power BI output in ASA so we can disable that retention policy.

    8 votes · under review · 2 comments
  4. Input from Azure Data Lake Storage

    Data Lake Store is now available as an output, which is great, but it should be added as an input as well. For example, we continuously save data to Data Lake Storage and want to stream it to Power BI.

    8 votes · 2 comments
  5. R instead of SQL for Azure Stream Analytics queries

    Microsoft R support is a must for Stream Analytics queries!

    8 votes · 0 comments
  6. Define a Serializer/Deserializer for input reference data from blob storage

    The possibility to define (in some language) a Serializer/Deserializer for proprietary input file formats in reference blob data.

    As it stands today, I had to write a cloud service that parses the data and sends it to an Event Hub. If Stream Analytics could accept any file format this way, provided that I somehow define a parser for my special data format, then I could skip that step.

    7 votes · under review · 0 comments
  7. I'm unable to get my query output saved to an Azure SQL table

    I am unable to get a SQL DB output to work with my analytics job. My job stores Application Insights export data in a SQL table. The error I'm receiving is that the SQL initialization connection string is invalid. All of the inputs I'm prompted for are correct, and I'm unable to see what connection string the portal is generating for the analytics job. I need assistance determining why the job cannot store output in my table (see the note at the end of this item).

    Error details:

    Error:
    - Format of the initialization string does not conform to specification starting at index 87.

    Message:
    Azure SQL Server…

    7 votes · 0 comments
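
    A note on the error above: "Format of the initialization string does not conform to specification starting at index N" is raised by the connection-string parser and typically means a key=value pair is malformed at that character position; a common culprit is a password containing characters such as ';' or quotes that are not wrapped in single quotes. A correctly formed connection string looks roughly like the following (the server, database, and credential values are placeholders):

    Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=youruser;Password='p@ss;word';Encrypt=True;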
  8. Azure Stream Analytics Data Lake Store output file name

    It would be nice to add the ability to provide a file name prefix to the output file for ADLS. If I have multiple ASA jobs pushing data to the same folder, I cannot determine the contents of the file unless I open it. The file name that is generated is of no use.

    7 votes · 0 comments
  9. Dates in local time rather than UTC

    Enable dates to be provided in local date/time rather than UTC. This is useful for outputs to Power BI, to create reports that reflect the local time the data was captured rather than UTC (a possible workaround is sketched at the end of this item).

    7 votes · 1 comment
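
    A possible workaround today, assuming a fixed UTC offset (the -5 below and all field names are placeholders), is to shift the timestamp in the query with DATEADD; note that this does not account for daylight saving time:

    -- Sketch: convert UTC to a fixed local offset (UTC-5 here) before output.
    SELECT
        deviceId,
        measurement,
        DATEADD(hour, -5, System.Timestamp) AS localEventTime
    INTO
        powerBiOutput
    FROM
        inputStream TIMESTAMP BY eventTime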
  10. Running aggregates

    I'd like to be able to sum counts from all records since a fixed time in the past, rather than use fixed windows.

    This would be like a sliding window, but with a fixed beginning, and an end that follows the last sample.

    Input stream:

        sesID     TIME   count
        session1  time   1
        session2  time   2
        session2  time   3

    The first step would be to create a subquery that uses LAG to find where the sesID changes, and then store that time (a rough sketch follows at the end of this item).

    That time could then be used as the start of the window; let's call it fixedStartWindow(StartTime, last(time)) and allow group…

    7 votes · 1 comment
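
    The LAG step mentioned above could be sketched roughly as follows (the stream, output, and column names, and the 24-hour look-back, are placeholders); LAG returns the value carried by the previous event within the stated duration, so an event whose previous session id is missing or different marks the start of a session:

    -- Step 1: carry the previous event's session id alongside each event.
    WITH WithPrevious AS (
        SELECT
            sesID,
            System.Timestamp AS eventTime,
            LAG(sesID) OVER (LIMIT DURATION(hour, 24)) AS previousSesID
        FROM inputStream TIMESTAMP BY [TIME]
    )
    -- Step 2: events where the session id changed mark a session start.
    SELECT sesID, eventTime
    INTO sessionStarts
    FROM WithPrevious
    WHERE previousSesID IS NULL OR previousSesID <> sesID

    The requested fixedStartWindow would then aggregate from that stored start time up to the latest event, which today's fixed windows do not express.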
  11. Allow IoT Hub Feedback Endpoint to be defined as Input

    When configuring an IoT Hub as an ASA input, the IoT Hub endpoint that is used for the data stream is the device-to-cloud endpoint. Given that the delivery feedback endpoint /messages/devicebound/feedback is an AMQP endpoint, it would be nice to be able to process these messages using ASA. The suggestion is that the portal UI allow a user to specify either the primary device-to-cloud *or* the feedback endpoint when creating an IoT Hub input.

    7 votes · 0 comments
  12. Search array elements

    Ability to search through array elements.

    My JSON data going into the event hub is as follows:

    {
      "customerid": "ifslabs",
      "deviceid": "car1",
      "starttime": "2015-08-21 08:19:40.362983",
      "eventtime": "2015-08-21 08:19:55.045514",
      "SensorData": [
        {"SensorId": "gps", "SensorType": "latitude", "SensorValue": "1.1"},
        {"SensorId": "gps", "SensorType": "longitude", "SensorValue": "9.9"},
        {"SensorId": "Fuel System Stat", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Calc Load Value", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Coolant Temp", "SensorType": "F", "SensorValue": "NORESPONSE"},
        {"SensorId": "Engine RPM", "SensorType": "", "SensorValue": "NORESPONSE"},
        {"SensorId": "Vehicle Speed", "SensorType": "MPH", "SensorValue": "NORESPONSE"},
        {"SensorId": "Intake Air Temp", "SensorType": "F", "SensorValue": "NORESPONSE"}
      ]
    }

    Using the Stream Analytics function GetArrayElement I can get the SensorData array elements flattened, but I want to search through the array and filter only specific SensorType values, for example (a workaround using existing functions is sketched at the end of this item):

    select inputeventhub.starttime as starttime,
           inputeventhub.eventtime as eventtime,
           FilterArrayElement(inputeventhub.SensorData, SensorType, 'latitude') as latitude,
           FilterArrayElement(inputeventhub.SensorData, SensorType, 'longitude') as longitude
    into outputcarlocation
    from inputeventhub

    Using a new function like FilterArrayElement(array,…

    7 votes · under review · 0 comments
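
    As a possible workaround with the existing surface area, CROSS APPLY GetArrayElements flattens the array into rows that can then be filtered on SensorType (input and output names follow the example above; the rest is a sketch):

    -- Sketch: flatten SensorData and keep only the GPS readings.
    SELECT
        i.starttime,
        i.eventtime,
        sensor.ArrayValue.SensorId    AS sensorId,
        sensor.ArrayValue.SensorType  AS sensorType,
        sensor.ArrayValue.SensorValue AS sensorValue
    INTO
        outputcarlocation
    FROM
        inputeventhub AS i
    CROSS APPLY GetArrayElements(i.SensorData) AS sensor
    WHERE
        sensor.ArrayValue.SensorType = 'latitude'
        OR sensor.ArrayValue.SensorType = 'longitude'

    This yields one row per matching array element; pivoting latitude and longitude back onto a single row would still require a self-join or the requested FilterArrayElement-style function.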
  13. Define a specific partition of an Event Hub as input to a Stream Analytics query

    7 votes · under review · 0 comments
  14. (Preview portal) JSON format errors are not displayed

    When the input data format is set to JSON and the event data is not in the correct format, no error message is displayed in the preview portal. The classic portal, however, displays a warning such as the following:

    "Could not deserialize the input event as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format"

    6 votes · 1 comment
  15. Add Query Parameters to the SA job configuration settings.

    Create query parameters (stored in configuration settings) for the SA job so that common values can be controlled without modifying the query itself.

    6 votes · 2 comments
  16. Obfuscation or encryption of query code

    Would it be possible in a future release of Stream Analytics to either obfuscate or encrypt the query (or JavaScript or C#) code of SA jobs? We are looking into deploying our SA jobs/models, which we use for predictive maintenance for instance, not only in our own subscriptions but in customer subscriptions as well. However, we would not want customers to be able to see the code of our queries, because these contain proprietary logic, rules, or definitions.

    6 votes · 0 comments
  17. Ability to identify reference data from data in the input

    The ability to identify reference data from data in the input, something like a SELECT DISTINCT in SQL (a rough approximation is sketched at the end of this item).

    6 votes · under review · 1 comment
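
    A rough approximation that is possible today is to group on the key over a window, which emits each distinct value once per window (the stream, output, and field names and the window length are placeholders):

    -- Sketch: emit each distinct deviceId once per hour.
    SELECT
        deviceId,
        System.Timestamp AS windowEnd
    INTO
        distinctDevices
    FROM
        inputStream TIMESTAMP BY eventTime
    GROUP BY
        deviceId,
        TumblingWindow(hour, 1)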
  18. Provide API App DataSource

    Allow the output of an API App to be used as an input data stream for an ASA job.

    6 votes · under review · 0 comments
