Functions often need passwords, API keys, and connection strings to connect to other services and retrieve data. It would be great if those secrets could easily be obtained from Key Vault.
382 votes
We’re pleased to announce a public preview of our Key Vault references feature, which you can learn more about here: https://azure.microsoft.com/en-us/blog/simplifying-security-for-serverless-and-web-apps-with-azure-functions-and-app-service/
There are some limitations to the initial preview, but we’re hoping to address those very soon. We’re looking forward to your feedback!
Matthew, Azure Functions team
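For readers trying the preview: secrets are resolved through Key Vault reference syntax in an ordinary app setting. A minimal sketch, where the vault name and secret URI are placeholders (check the blog post above for the current syntax and preview limitations):

```json
{
  "MyConnectionString": "@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/my-connection-string/)"
}
```

The function then reads `MyConnectionString` like any other app setting, with no Key Vault SDK code required.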
C# and other environments are already supported; it would be nice if Go / Golang were one of them. I'd love to handle web requests using Golang + Azure Functions.
It could provide the HTTP request via stdin, while the HTTP response is written to stdout (or whatever those two are called on Windows ;-) ).
170 votes
Beyond out-of-the-box providers like Facebook and Google, provide samples and guidance on how to authenticate with others like LinkedIn.
146 votes
This item remains unplanned.
As it stands, any light DDoS attack that Azure does not recognize will affect me and my account. I may know that my service shouldn't receive more than 10,000 calls per day, but I can't set up limits on incoming requests.
"Daily Usage Quota (GB-Sec)" is not a bad idea, but it's something internal and synthetic from my perspective. Calls per day is a much more natural metric for users.
73 votes
Moving this work item to unplanned, as it is clear that this request is now for a global throughput limit.
We do now offer the ability to limit your maximum instances in the Premium plan, which will allow you to avoid swamping downstream resources. https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#plan-and-sku-settings
We prioritized that above a max call per X limit, because our only control to limit throughput is to outright deny some number of requests above a threshold.
Keep the feedback coming!
Azure Functions Team
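As a partial stopgap, host.json (v2 schema) exposes per-instance HTTP throttles. Note these cap concurrency per instance, not calls per day, and the numbers below are illustrative only:

```json
{
  "version": "2.0",
  "extensions": {
    "http": {
      "maxConcurrentRequests": 100,
      "maxOutstandingRequests": 200
    }
  }
}
```

Requests beyond `maxOutstandingRequests` are rejected rather than queued, which gives each instance a rough back-pressure valve even without a global cap.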
Hello, we have a scenario where we need to trigger a function when a message is added to RabbitMQ. This would be really helpful in using Azure Functions to integrate with other message queuing platforms that support the AMQP protocol.
70 votes
Great request; we have heard a bit about this over the last few weeks. Keep the votes coming!
Enable developers to create serverless applications using the ASP.NET Core Web API framework.
63 votes
Great ask. Keep the votes coming. Nothing planned short term.
Azure Functions are great, but a lot of application logic is driven by data in SQL Azure DBs. It would be great to have a trigger based on SQL Azure row data.
56 votes
Moving this to unplanned to better reflect status. The previous update hasn’t changed, but since the work is on the Event Grid/SQL teams and we’re not planning on building a custom binding ourselves, this is unplanned.
I want to be able to handle multiple messages per function call - we are using a third-party web service which works much more efficiently if the messages are passed to it in batches. I know we could use a timer to trigger the function and grab messages from the queue ourselves, but if we do that the function won't scale out automatically.
51 votes
Nothing planned short term, but a great ask. This is an item that could potentially be contributed if there is interest in doing so earlier. Hoping we get this planned soon.
Allow create, update, and delete operations on a table storage row to trigger an Azure Function.
47 votes
Likely best solved in the future with Event Grid support, so it would also be good to let the storage team know you want Event Grid for table storage.
Function App v1 could create an OpenAPI (swagger) file. After the change to v2, creating an OpenAPI file in the portal is "not supported".
Please reinstate this feature.
38 votes
We have just released integration with API Management, including for the Consumption plan. Please review https://docs.microsoft.com/en-us/azure/azure-functions/functions-openapi-definition to see if this meets your needs.
Currently a WebSockets (SignalR)-style continuous stream feature is not possible with Azure serverless. Please add this feature so we can make more use of Azure Functions.
37 votes
Looks like the community (more specifically Anthony Chu) has already taken a stab at this. Please see the following article (https://anthonychu.ca/post/cosmosdb-real-time-azure-functions-signalr-service/) and GitHub repo (https://github.com/Azure/azure-functions-signalrservice-extension) for details of his implementation.
Your feedback on whether this satisfies your scenarios would be appreciated.
Right now, if you want to use zip deploy you need to use user credentials, and those can be used to deploy any function anywhere in your tenant. Not really the best way to handle DTAP. It would be nice if you could also use the app-level credentials, which do work for the other deployment options.
34 votes
This is a valid request but not something that is planned at this time. Keep the votes coming.
If I build a web app using Azure B2C for authentication, I'd like to use it as the authentication provider for Azure Functions too. Although I can set up the same providers, using B2C would be cleaner and enable me to support users with local logins (traditional username and password) too.
32 votes
This item is planned for the next year.
Are there any plans to support R scripts in the future?
31 votes
Thanks for your feedback. Currently, we don’t have any plans for R support in Functions. We will monitor the votes on this request to accordingly inform the priority on our backlog.
Make it possible to set/retrieve function app keys via ARM templates.
This will make it easier to store them in the settings of another service while rolling out an environment.
29 votes
We agree this needs to happen and we’re figuring out the best path forward.
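Until first-class support lands, a commonly used workaround for the retrieval half is the ARM `listkeys` function against the app's `host/default` resource. A hedged sketch of a template output (the API version and output shape are assumptions to verify against current documentation):

```json
{
  "outputs": {
    "functionDefaultKey": {
      "type": "string",
      "value": "[listkeys(concat(resourceId('Microsoft.Web/sites', parameters('functionAppName')), '/host/default'), '2018-11-01').functionKeys.default]"
    }
  }
}
```

The output can then be wired into another resource's app settings within the same deployment, which covers the "store it in the settings of another service" part of the request.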
This would make hybrid integration scenarios trivial for customers with no IT; anybody can drop a file in a share.
27 votes
The below is still accurate, but updating to better reflect product backlog. Keep the votes coming.
Azure Functions should be able to be triggered from Apache Kafka. The triggered function should be able to be configured for a specific consumer group, with options to explicitly commit the consumer's offset.
Ideally Kafka Streams could be supported, with special support for local storage of KTables.
25 votes
No change, but moving this to unplanned to better reflect project status.
If you have too many connections, you can get SocketExceptions. The dynamic tier was meant to stop us from having to think about server instances, but with a connection limit, the dynamic tier is useless and we are back to the standard service plans.
24 votes
This is an awesome idea, and we’re exploring a few options to make it a reality.
However, the 600-connection limit per instance should be enough for most applications if you're reusing or closing connections. If you truly need 600 open connections, you are likely to run into the 10-minute timeout per execution.
Even after we add this you will still need to be mindful of your connection management.
Keep the votes coming!
Currently you can only configure Azure Alerts on the App Service plan that is hosting your Azure Function. However, it would be great to do this at the function level to detect failures and other metrics.
24 votes
We have been working on this, and are a few months out from enabling it.
Azure Functions team
This is partially supported today. You can click through to the App Service Settings blade and click on the graph that shows up there.
This lets you set up basic alerts on failure rates etc. or GB-sec throughput.
We’re adding Application Insights integration soon and that will provide even more options for setting up more granular alerts: github.com/Azure/azure-webjobs-sdk-script/i..
-Chris, Functions PM
We are using a Function App to consume events from an Event Hub. To process each event we make 4-5 REST calls, and we find that our S3 instances run out of outbound connections, resulting in ETIMEDOUT errors. We opened a support case and are troubleshooting. However, to make it easier to debug and to allow us to put alerts in place, it would be nice to have the number of open connections available through the portal and to be able to attach alerts.
We tried to run netstat from Kudu but this is not permitted (Access Denied).
23 votes
Update: Still planned!
This is something that we have enabled internally, and are in the planning process of highlighting TCP connections for customers in the “Diagnose and Solve problems” tab. However, we do not have an ETA yet.
Thanks for the feedback!
Azure Functions Team