C# and other environments are already supported; it would be nice if Go / Golang were one of them. I'd love to handle web requests using Golang + Azure Functions.
It could provide the HTTP request via stdin, while the HTTP response is written to stdout (or whatever those two streams are called on Windows ;-) ). (179 votes)
Beyond out-of-the-box providers like Facebook and Google, provide samples and guidance on how to authenticate with others like LinkedIn. (149 votes)
I’ve gotten a few questions about this item recently, so I just wanted to give a more detailed status. We still have this on the backlog; it hasn’t been forgotten. But we don’t have a clear timeline for when we would get to it right now. The “unplanned” status just means that it can’t be tied to a timeline, but we do think this is a valid request that we would like to have in the product.
Hello, we have a scenario where we need to trigger a function when a message is added to RabbitMQ. This would be really helpful for using Azure Functions to integrate with other message-queuing platforms that support the AMQP protocol. (94 votes)
We just shipped a preview of this functionality today that you can try out. In the coming weeks we will enable support for triggering within the Azure Functions Premium plan, to receive serverless scale, and for connecting a function app to RabbitMQ endpoints within a private network. Please provide feedback and let us know if this meets your needs as expected. Details can be found here:
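For reference, the preview extension is configured through a trigger binding. The fragment below is an illustrative sketch of what a `function.json` might look like; the exact binding type and property names should be checked against the preview documentation linked by the team.

```json
{
  "bindings": [
    {
      "name": "message",
      "type": "rabbitMQTrigger",
      "direction": "in",
      "queueName": "orders",
      "connectionStringSetting": "RabbitMqConnection"
    }
  ]
}
```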
Right now, any light DDoS attack that Azure does not recognize will affect me and my account. I may know that my service shouldn't receive more than 10,000 calls per day, but I can't set up limits on incoming requests.
The "Daily Usage Quota (GB-Sec)" is not a bad idea, but it feels internal and synthetic to me. Calls per day is a much more natural metric for users. (79 votes)
Moving this work item to unplanned, as it is clear that this request is for a global throughput limit.
We do now offer the ability to limit your maximum instances in the Premium plan, which will allow you to avoid swamping downstream resources. https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#plan-and-sku-settings
We prioritized that above a max call per X limit, because our only control to limit throughput is to outright deny some number of requests above a threshold.
Keep the feedback coming!
Azure Functions Team
Enable developers to create serverless applications using the ASP.NET Core Web API framework.76 votes
Great ask. Keep the votes coming. Nothing planned short term.
Azure Functions are great, but a lot of application logic is driven by data in SQL Azure DBs. It would be great to have a trigger based on SQL Azure row data. (65 votes)
This continues to be unplanned. Please keep the votes coming!
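A common interim pattern for the missing trigger is a timer-driven poll over a version column (e.g. a SQL `rowversion`), carrying a watermark forward between runs. The Go sketch below abstracts the query behind a stand-in `fetch` function; the SQL in the comment and all names are illustrative, not part of any official binding.

```go
package main

import "fmt"

// row is a minimal stand-in for a changed database row.
type row struct {
	ID      int
	Version int64 // e.g. a SQL rowversion / timestamp column
}

// pollChanges returns the rows changed since the watermark and the
// new watermark. fetch is a stand-in for a query such as:
//   SELECT Id, Version FROM T WHERE Version > @since ORDER BY Version
func pollChanges(since int64, fetch func(since int64) []row) ([]row, int64) {
	changed := fetch(since)
	water := since
	for _, r := range changed {
		if r.Version > water {
			water = r.Version
		}
	}
	return changed, water
}

func main() {
	data := []row{{1, 10}, {2, 20}, {3, 30}}
	fetch := func(since int64) []row {
		var out []row
		for _, r := range data {
			if r.Version > since {
				out = append(out, r)
			}
		}
		return out
	}
	changed, water := pollChanges(15, fetch)
	fmt.Println(len(changed), water) // 2 30
}
```

The watermark must be persisted between timer invocations (e.g. in table storage) so restarts do not re-deliver old rows.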
Could we please have native support for running Azure Functions written in Rust? To host Rust functions currently we have to pay for a Linux App Service plan and embed our functions in a Docker container. This is expensive just to run a few microservices written in Rust.
Peter Huene has an excellent project for anyone interested in writing Azure Functions in Rust... (62 votes)
Agreed, super cool project from Peter, and great to see the interest. Recommend checking that out in the interim. Keep the votes coming, but there are no immediate plans to pull this in as a first-class language yet.
I want to be able to handle multiple messages per function call. We are using a third-party web service which works much more efficiently if messages are passed to it in batches. I know we could use a timer to trigger the function and grab messages from the queue ourselves, but if we do that the function won't scale out automatically. (55 votes)
Nothing planned short term, but a great ask. This is an item that could potentially be contributed by the community if there is interest in getting it sooner. Hoping we get this planned soon.
Now, obviously Google has a bit of an advantage here, what with Chrome being their own tech, but there are so many blogs out there with people describing complicated Docker solutions just to run headless Chrome in the cloud.
It would be great if we could get a solution for running Puppeteer in much the same way, or allow web apps to run Chrome without having to be inside Docker, so we could manage a pool of Chrome instances without having to manage Docker as well. (54 votes)
Thank you for the feedback. We have moved this item to the Azure Functions section for the PG to take a look.
Allow create, update and delete on a table storage row to trigger an Azure Function. (50 votes)
This is likely best solved in the future with Event Grid support, so it would also be good to let the storage team know you want Event Grid for Table Storage.
Function App v1 could create an OpenAPI (Swagger) file. After the change to v2, creating an OpenAPI file in the portal is "not supported".
Please reinstate this feature. (46 votes)
We have just released integration with API Management, which now offers a consumption plan. Please review https://docs.microsoft.com/en-us/azure/azure-functions/functions-openapi-definition to see if this meets your needs.
Are there any plans to support R scripts in the future? (41 votes)
Thanks for your feedback. Currently, we don’t have any plans for R support in Functions. We will monitor the votes on this request to accordingly inform the priority on our backlog.
Currently a WebSockets (SignalR)-style continuous streaming feature is not possible with Azure serverless; please add this feature so we can make more use of Azure Functions. (39 votes)
Looks like the community (more specifically Anthony Chu) has already taken a stab at this. Please see the following article (https://anthonychu.ca/post/cosmosdb-real-time-azure-functions-signalr-service/) and GitHub repo (https://github.com/Azure/azure-functions-signalrservice-extension) for details of his implementation.
Your feedback on whether this satisfies your scenarios would be appreciated.
This would make hybrid integration scenarios trivial for customers with no IT department; anybody can drop a file in a share. (39 votes)
The below is still accurate, but we are updating the status to better reflect the product backlog. Keep the votes coming.
Right now, if you want to use zip deploy you need to use user credentials, which can be used to deploy any function anywhere in your tenant. Not really the best way to handle DTAP. It would be nice if you could also use the app-level credentials, which do work for the other deployment options. (37 votes)
No change here due to low activity, leaving as unplanned. Keep the votes coming.
If I build a web app using Azure B2C for authentication, I'd like to use it as the authentication provider for Azure Functions too. Although I can set up the same providers, using B2C would be cleaner and enable me to support users with local logins (traditional username and password) too. (33 votes)
Just since it’s been a while, I wanted to reconfirm that this is planned.
Make it possible to set/retrieve function app keys via ARM templates.
This would make it easier to store a key in the settings of another service while rolling out an environment. (31 votes)
We now have APIs which make this easier. It involves the listKeys() function, which may be familiar from other services.
We are working on formal documentation (which is why this isn’t flagged as “completed” yet), but some of the details are captured in this GitHub issue:
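As a sketch of the listKeys() approach mentioned above, an ARM template output can surface a function key roughly like this. The resource path, API version, and property names below are illustrative from memory and should be verified against the documentation once it is published.

```json
{
  "outputs": {
    "functionDefaultKey": {
      "type": "string",
      "value": "[listKeys(resourceId('Microsoft.Web/sites/host', parameters('functionAppName'), 'default'), '2018-11-01').functionKeys.default]"
    }
  }
}
```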
Make it possible to have a global exception handler in a C# Azure Functions App project, to be able to handle exceptions in one central location. (29 votes)
We are starting to look into the feasibility of this and the experience that people are looking for. If you are interested to provide feedback on how you’d like to see this implemented please comment here https://github.com/Azure/azure-functions-host/issues/4459
If you have too many connections, you can get SocketExceptions. The dynamic tier was meant to stop us from having to think about server instances, but with a connection limit the dynamic tier is useless and we are back to the standard service plans. (27 votes)
This is an awesome idea, and we’re exploring a few options to make it a reality.
However, the 600-connection limit per instance should be enough for most applications if you're reusing or closing connections. If you truly need 600 open connections, you are likely to run into the 10-minute timeout per execution.
Even after we add this you will still need to be mindful of your connection management.
Keep the votes coming!
Azure Functions should be able to be triggered from Apache Kafka. The triggered function should be able to be configured for a specific consumer group, with options to explicitly commit the consumer's offset.
Ideally Kafka Streams could be supported, with special support for local storage of KTables. (27 votes)
No change, but moving this to unplanned to better reflect project status.