
  88 votes

    12 comments  ·  Azure Functions  ·  Admin response:

    Here’s the latest. There seem to be two types of ask here, so two separate updates. Please comment on whether this issue should be closed so it can focus on one or the other:

    1. I want to control how many calls my function can make to another API (third-party API rate limiting).
    – In all plans we now have a way to specify the maximum number of instances, which limits how far a function app can scale out (see the sketch after this update): https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#limit-scale-out

    2. I want to stop my function from triggering more than x times an hour.
    Nothing is planned for this in the short term. For HTTP functions, our recommendation is to put API Management with throttling policies in front of them; there is nothing out of the box for non-HTTP triggers yet.
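
    A minimal sketch of the scale-out cap in point 1, assuming the Azure CLI and the functionAppScaleLimit site property described in the linked doc; the resource group and app names below are placeholders:

        # Placeholder names; caps the function app at 30 scaled-out instances.
        az resource update --resource-type Microsoft.Web/sites \
          -g my-resource-group -n my-function-app/config/web \
          --set properties.functionAppScaleLimit=30

    The value is a per-app cap on how many instances the platform will add when scaling out.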

    Anonymous commented  · 

    Today, a function app can scale out to as many as 200 servers and there's no way to limit that. That's a problem when your functions connect to connection-limited resources such as SQL databases. When a queue-triggered function connects to a SQL DB to perform a task, having 200 servers run the same function (even with batchSize=1) creates a lot of connections to the DB, which means you have to move the DB to a more expensive tier to handle them. In my case even 4000 DTUs weren't enough, since my app has five connection-heavy functions and the queue had more than 40K messages. It would be useful if I could limit the app to 30 servers instead of paying for a far more expensive DB.
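
    A rough sketch of that kind of cap, assuming a Consumption-plan app and the Azure CLI; the app and resource-group names are hypothetical, and WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT is an app setting that has been used to limit dynamic scale-out (the functionAppScaleLimit property shown above is the documented route):

        # Hypothetical names; asks the platform to scale the app out to at most 30 instances,
        # so at most ~30 copies of the queue-triggered function hold SQL connections at once.
        az functionapp config appsettings set \
          --name my-queue-func-app \
          --resource-group my-rg \
          --settings WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT=30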
