At present, Stream Analytics throws a memory exception when there are not enough streaming units. It should automatically scale based on the streaming-unit requirement.

267 votes · need-feedback · Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
You can autoscale ASA jobs on a pre-defined schedule or dynamically based on input load. This can be set up using Azure Automation. You can learn more about how to do this here: https://aka.ms/asaautoscale
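The load-based approach the response describes boils down to a runbook that reads the job's SU% utilization metric and picks a new streaming-unit tier. Below is a minimal sketch of that decision logic only; the thresholds (80% / 20%) are illustrative assumptions, and the actual metric read and job update would go through Azure Monitor and the Stream Analytics management API inside the Automation runbook.

```python
# Sketch of the autoscale decision an Azure Automation runbook might
# make for a Stream Analytics job. Thresholds are assumptions; the
# metric lookup and the scaling call itself are out of scope here.

# Valid streaming-unit allocations for an ASA job: 1, 3, 6, then
# multiples of 6.
VALID_SU = [1, 3, 6, 12, 18, 24, 36, 48]

def choose_streaming_units(su_utilization_pct: float, current_su: int) -> int:
    """Pick the next SU tier from the job's current SU% utilization.

    Scale up one tier when utilization is high, down one tier when it
    is low, otherwise keep the current allocation.
    """
    idx = VALID_SU.index(current_su)
    if su_utilization_pct > 80 and idx < len(VALID_SU) - 1:
        return VALID_SU[idx + 1]  # scale up one tier
    if su_utilization_pct < 20 and idx > 0:
        return VALID_SU[idx - 1]  # scale down one tier
    return current_su
```

The schedule-based variant is simpler still: the runbook runs on a timer and sets a fixed SU count per time window instead of consulting the utilization metric.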
Ability to define functions in Python as part of a stream query, similar to what has been done with Azure Machine Learning. The ability to then return the results back into the query would be great; treat it almost like a SQL stored procedure.

37 votes · need-feedback · Azure Stream Analytics Team on UserVoice (Product Manager, Microsoft Azure) responded:
You can define Machine Learning UDFs in your ASA jobs. With this, any of your Python models built using popular frameworks like TensorFlow, scikit-learn, or PyTorch can be deployed on Azure Kubernetes Service and made available as a function in your job.
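The service behind such a UDF follows the Azure ML scoring-script convention: an `init()` called once at startup to load the model, and a `run()` called per request with a JSON batch of rows. Here is a minimal sketch of that shape; the trivial stand-in model (sum of features) is an assumption used so the example is self-contained, where a real `init()` would deserialize a trained TensorFlow, scikit-learn, or PyTorch model.

```python
# Minimal sketch of an Azure ML scoring script ("score.py") of the
# kind that backs a Machine Learning UDF deployed to AKS. The model
# here is a stand-in; a real script would load a serialized model.
import json

model = None

def init():
    """Called once when the service starts; loads the model."""
    global model
    # Stand-in for e.g. joblib.load("model.pkl"): score = sum of features.
    model = lambda features: sum(features)

def run(raw_data: str) -> str:
    """Called per request with a JSON batch of rows; returns a JSON
    array of scores, one per input row."""
    rows = json.loads(raw_data)["data"]
    scores = [model(row) for row in rows]
    return json.dumps({"result": scores})
```

Once deployed, the endpoint is registered as a function in the ASA job and invoked from the query like any other function over the streaming input.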