Admin · Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure)

My feedback

  1. 1 vote
    1 comment  ·  Stream Analytics

    Hi, Azure Stream Analytics is optimized for streams of data arriving in real time. However, there are workarounds for processing "historical data" arriving in blobs.
    By default, Azure Stream Analytics uses the arrival time as the timestamp. For blobs, you are right that this is the LastModified date. However, using the keyword TIMESTAMP BY, you can choose another timestamp from your payload.
    Note that you will need to extend the late arrival policy in the options so that the difference between the actual timestamp and the arrival time is less than the maximum "late arrival time".
    Late arrival checking can be disabled entirely, to support reprocessing of historical data, by setting the option to "-1". However, this is not possible in the portal; you will need to do it either through the API or through Visual Studio (e.g. export the job, edit the value, and republish the job).
    Let us know if you have further questions or need additional help.
    JS (Azure Stream Analytics)

    PS: See more info on time policies here: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-time-handling
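
    For illustration, a minimal query sketch using TIMESTAMP BY, assuming a hypothetical "eventTime" field in the payload and placeholder input/output names:

        -- Use the event's own timestamp instead of the blob's LastModified time.
        SELECT
            deviceId,
            temperature,
            eventTime
        INTO output
        FROM input TIMESTAMP BY eventTime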

  2. 30 votes
    4 comments  ·  Stream Analytics
    Can you achieve this by using a JavaScript UDF? We will soon be releasing C# UDFs as well. Would these options meet your requirements?
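
    For illustration, a minimal sketch of calling a JavaScript UDF from a query; the alias "parseCustomField" and the input/output names are hypothetical, and the UDF body itself is defined separately in JavaScript under the job's Functions:

        -- JavaScript UDFs are invoked with the "udf." prefix on each event.
        SELECT
            deviceId,
            udf.parseCustomField(payload) AS parsedValue
        INTO output
        FROM input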

  3. 11 votes
    0 comments  ·  Stream Analytics » Inputs and Outputs
  4. 3 votes
    1 comment  ·  Stream Analytics » Inputs and Outputs

    Hi, this is a supported scenario. Do you have any issues when running your job? Can you share more info here?

  5. 9 votes
    0 comments  ·  Stream Analytics » Inputs and Outputs
  6. 27 votes
    1 comment  ·  Stream Analytics » Inputs and Outputs
  7. 23 votes
    started  ·  2 comments  ·  Stream Analytics » Inputs and Outputs
  8. 31 votes
    under review  ·  2 comments  ·  Stream Analytics » Query Language

    The ROUND function has been added to the product. The documentation is here: https://docs.microsoft.com/en-us/stream-analytics-query/round-azure-stream-analytics

    I'll rename this suggestion to track the "split" function that is not yet available.
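
    For reference, a minimal usage sketch of ROUND; the column and input/output names are hypothetical:

        -- Round the "reading" column to 2 decimal places.
        SELECT ROUND(reading, 2) AS roundedReading
        INTO output
        FROM input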

  9. 122 votes
    7 comments  ·  Stream Analytics

    Thanks for the feedback. To reduce cost, it is possible to use 1 SU for small jobs (about $80 per month).
    1 SU jobs can have different sources, sinks, and query steps, so it is actually possible to pack several queries into a single job, as sketched below.
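
    As an illustration, a minimal sketch of packing two query steps into a single job; the input, output, and column names are hypothetical:

        -- One job reading a single input and writing to two different outputs.
        WITH Readings AS (
            SELECT deviceId, temperature FROM input
        )
        -- Step 1: hourly average per device.
        SELECT deviceId, AVG(temperature) AS avgTemp
        INTO hourlyOutput
        FROM Readings
        GROUP BY deviceId, TumblingWindow(hour, 1)

        -- Step 2: raw pass-through of the same events.
        SELECT deviceId, temperature
        INTO rawOutput
        FROM Readings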

  10. 36 votes
    under review  ·  2 comments  ·  Stream Analytics

    By using timestamped reference data, the reference data will be refreshed while the job is running.

  11. 38 votes
    3 comments  ·  Stream Analytics » Security
  12. 3 votes
    1 comment  ·  Stream Analytics

    A workaround for this would be to create two jobs, as sketched below:
    - A first job for the critical output (keeping the error policy set to "retry")
    - A second job for the non-critical output (changing the policy to "drop")
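
    A minimal sketch of the split, assuming both jobs read the same input and using hypothetical output names; each query runs as its own job so the two output error policies can differ:

        -- Job 1: critical output, error policy left at "retry".
        SELECT * INTO criticalOutput FROM input

        -- Job 2: non-critical output, error policy set to "drop".
        SELECT * INTO nonCriticalOutput FROM input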

  13. 63 votes
    3 comments  ·  Stream Analytics

    Reopening this item since we have received additional votes and heard more comments related to it.
    A few comments/workarounds in the meantime:
    - For PII/HBI data: ASA on IoT Edge and Azure Stack – this allows pre-processing at the edge without sending data to the cloud.
    - Streaming + DB can work side by side on ASDE (Azure SQL Database Edge).

    Thank you for your interest in using Stream Analytics. We are looking for more details on this scenario. For those of you who are interested in an input from on-premises SQL, are you looking to use the SQL data as reference data to be joined with another incoming data stream, or to use the SQL data as a data stream itself? In the reference data case, you can use Azure Data Factory to take periodic snapshots of the relevant reference data and copy them to blobs that can then be used by Azure Stream Analytics.

    For those of you looking for a direct output to on-premises SQL: while we don't yet support this scenario directly, as a workaround you can use Azure SQL DB as an output from Stream Analytics and set up a scheduled data move with Azure Data Factory to copy the data from Azure SQL DB to your on-premises SQL database, accomplishing a similar pattern.
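
    For the reference data case above, a minimal sketch of joining a stream with blob reference data; the input aliases and column names are hypothetical:

        -- "deviceStream" is a streaming input; "deviceRef" is a blob reference
        -- data input, refreshed by the periodic Azure Data Factory copies.
        SELECT
            s.deviceId,
            s.temperature,
            r.deviceName
        INTO output
        FROM deviceStream s
        JOIN deviceRef r
            ON s.deviceId = r.deviceId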

  14. 1 vote
    1 comment  ·  Stream Analytics

    Thank you for reaching out, Hamid.

    Can you clarify? Are you suggesting that Azure Stream Analytics directly pull from sources such as the Twitter firehose? What would pulling from a device mean?

    Thanks!

  15. 56 votes
    4 comments  ·  Stream Analytics

    Does DateTimeFromParts work for you?

    https://msdn.microsoft.com/en-us/library/azure/dn835025.aspx

    If not, then please let us know why not.
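
    For reference, a minimal usage sketch, assuming T-SQL-style arguments (year, month, day, hour, minute, seconds, fraction) and hypothetical column names:

        -- Build a single datetime value from separate fields in the payload.
        SELECT DATETIMEFROMPARTS(yr, mon, dy, hr, mins, secs, 0) AS eventTime
        INTO output
        FROM input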

  16. 1 vote
    2 comments  ·  Stream Analytics

    Thank you, Gökhan, for the reply and the valuable feedback. We will look into this suggestion.

  17. 13 votes
    2 comments  ·  Stream Analytics

    While we are considering further output targets, please note that we support Azure Storage Tables and Azure DocumentDB today, both of which qualify as NoSQL stores.

  18. 4 votes
    1 comment  ·  Stream Analytics

    ASA does not support an outer join with reference data today.

  19. 28 votes
    5 comments  ·  Stream Analytics

    Currently, you can drop old events by using the Late Arrival Policy on the job's Configure tab. Does that address your use case? If not, please elaborate.

  20. 25 votes
    under review  ·  5 comments  ·  Stream Analytics

    Thank you for the suggestion, Torsten!

    Generating unique values is in conflict with ASA's deterministic results principle. Can you explain your scenario in some detail so that we can see how it could be tackled?
