Admin: Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure)
-
1 vote
Hi, Azure Stream Analytics is optimized for streams of data arriving in real time. However, there are workarounds for processing "historical data" landing in blobs.
By default, Azure Stream Analytics uses the arrival time as the timestamp; for blob input, you are right that this is the LastModified date. However, using the TIMESTAMP BY keyword you can choose another timestamp from your payload.
Note that you will need to extend the late arrival policy in the job options so that the difference between the actual timestamp and the arrival time stays below the maximum "late arrival time".
Late arrival can be disabled entirely to support reprocessing of historical data by setting the option to "-1". However, this is not possible in the portal; you will need to do it using either the API or Visual Studio (e.g. export the job, edit the number, and republish the job).
Let us know if you have further questions or need additional help.
JS (Azure Stream Analytics)
PS: See more info on time policies here: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-time-handling
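For illustration, a minimal sketch of TIMESTAMP BY over a blob input; the input alias [historical-blob-input] and the eventTime/deviceId/reading fields are hypothetical:

    -- Use the event's own eventTime field as the timestamp
    -- instead of the blob's LastModified arrival time.
    SELECT
        deviceId,
        reading
    INTO
        [output]
    FROM
        [historical-blob-input] TIMESTAMP BY eventTime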
-
33 votes
Can you achieve this by using a JavaScript UDF? We will soon be releasing C# UDFs as well. Would these options meet your requirements?
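As a sketch of how a JavaScript UDF is invoked from a query, assuming a hypothetical function registered on the job under the alias parsePayload (field names are also hypothetical):

    -- JavaScript UDFs registered on the job are called with the UDF. prefix.
    SELECT
        deviceId,
        UDF.parsePayload(rawPayload) AS parsed
    INTO
        [output]
    FROM
        [input]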
-
3 votes
Hi, this is a supported scenario. Do you have any issues when running your job? Can you share more info here?
-
31 votes · under review (Stream Analytics » Query Language)
The ROUND method has been added to the product. The documentation is here: https://docs.microsoft.com/en-us/stream-analytics-query/round-azure-stream-analytics
I'll rename this suggestion to track the "split" function, which is not yet available.
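A quick sketch of ROUND now that it is available (input and column names hypothetical); it takes the expression and the number of decimal places:

    -- Round temperature readings to two decimal places.
    SELECT
        deviceId,
        ROUND(temperature, 2) AS tempRounded
    FROM
        [input]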
-
127 votes · under review
Admin: Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
We will consider this request.
Thanks for the feedback. To reduce cost, it is possible to use 1 SU for small jobs (this is about $80 per month).
1 SU jobs can have multiple sources, sinks, and query steps, so it is actually possible to pack several queries inside one job, as sketched below.
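A minimal sketch of two independent queries packed into one job (all input/output names hypothetical); each SELECT ... INTO statement writes to its own sink:

    -- Query 1: 5-minute average temperature per device.
    SELECT deviceId, AVG(temperature) AS avgTemp
    INTO [temperature-output]
    FROM [sensor-input] TIMESTAMP BY eventTime
    GROUP BY deviceId, TumblingWindow(minute, 5)

    -- Query 2: archive the raw events to a second sink.
    SELECT deviceId, temperature, eventTime
    INTO [raw-archive-output]
    FROM [sensor-input] TIMESTAMP BY eventTime
-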
36 votes
By using timestamped reference data, the reference data will be refreshed while the job is running.
-
41 votes
We announced MSI authentication for ADLS: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-managed-identities-adls
More to come...
-
3 votes
A workaround for this would be to create 2 jobs, as sketched below:
- A first job for the critical output (keeping the error policy set to "retry")
- A second job for the non-critical output (changing the policy to "drop")
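Each job owns one output, so the output error policy can be configured independently per job (all names hypothetical):

    -- Job 1 (error policy: retry) carries only the critical output.
    SELECT deviceId, reading
    INTO [critical-output]
    FROM [sensor-input]

    -- Job 2 (error policy: drop) carries only the non-critical output.
    SELECT deviceId, reading
    INTO [noncritical-output]
    FROM [sensor-input]
-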
63 votes · under review
Admin: Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
Reopening this item since we have additional votes and have heard more comments related to it.
A few comments/workarounds in the meantime:
- For PII/HBI data: ASA on IoT Edge and Azure Stack allow pre-processing at the edge without sending data to the cloud.
- Streaming + DB can work side by side on ASDE (Azure SQL Database Edge)

Thank you for your interest in using Stream Analytics. We are looking for more details on this scenario. For those of you interested in an input from on-prem SQL: are you looking to use the SQL data as reference data to be joined with another incoming data stream, or to use the SQL data as a data stream itself? In the reference data case, you can use Azure Data Factory to take periodic snapshots of the relevant reference data and copy them to blobs that can then be used by Azure Stream Analytics.
For those of you looking for a direct output to on-prem SQL: while we don't yet support this scenario directly, as a workaround you can use Azure SQL DB as an output from Stream Analytics and set up a scheduled data move using Azure Data Factory to copy the data from Azure SQL DB to your on-prem SQL database, accomplishing a similar pattern.
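To illustrate the reference data case, a sketch of joining the stream with reference data snapshotted to blob storage (all names hypothetical); reference data joins take a plain ON condition, with no DATEDIFF bound:

    -- Enrich the stream with reference data copied to blob by Azure Data Factory.
    SELECT
        s.deviceId,
        s.reading,
        r.siteName
    INTO [output]
    FROM [stream-input] s TIMESTAMP BY eventTime
    JOIN [reference-blob-input] r
        ON s.deviceId = r.deviceId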
-
1 vote
Thank you for reaching out, Hamid.
Can you clarify? Are you suggesting that Azure Stream Analytics directly pull from sources such as the Twitter firehose? What would pulling from a device mean?
Thanks!
-
56 votes
Does DateTimeFromParts work for you?
https://msdn.microsoft.com/en-us/library/azure/dn835025.aspx
If not, please let us know why.
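A minimal sketch of DATETIMEFROMPARTS assembling a single timestamp from separate payload fields (column names hypothetical); the last argument is the fractional-seconds part:

    -- Build one datetime from year/month/day/hour/minute/second fields.
    SELECT
        DATETIMEFROMPARTS(yr, mo, dy, hh, mi, ss, 0) AS eventTime
    FROM
        [input]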
-
1 vote · under review
Admin: Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
Relay has its own characteristics and advantages, but you may be able to use Service Bus Topics or Queues instead, which are supported by ASA today.
Thank you, Gökhan, for the reply and valuable feedback. Let us look into this suggestion.
-
13 votes
While we are considering further output targets, please note that we support Azure Storage Tables and Azure DocumentDB today, both of which qualify as NoSQL stores.
-
4 votes
ASA does not support an outer join with reference data today.
-
28 votes · under review
Admin: Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
You can use the late arrival policy (with the Drop option) to filter out old events.
Currently, you can drop old events by using the Late Arrival Policy on the job's Configure tab. Does that address your use case? If not, please elaborate.

Hi, we don't have a fixed set of IPs, but one new option is to use private endpoints on a Stream Analytics cluster. See info here: https://docs.microsoft.com/en-us/azure/stream-analytics/private-endpoints