Functions often need passwords, API keys, and connection strings to connect to other services and retrieve data. It would be great if those secrets could easily be obtained from Key Vault. (398 votes)
Thank you all for the comments and feedback. We’re very pleased to announce that support for Key Vault references is now generally available! You can find the update here: https://azure.microsoft.com/updates/general-availability-of-key-vault-references-in-app-service-and-azure-functions/
The work certainly doesn’t stop here. We are looking to add support for additional networking configurations, and work is underway for rotation handling (making the version string optional). Please consider putting votes towards these features as well! They are captured below.
Again, thanks for all of the input on this item. It really does make a difference.
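For reference, a Key Vault reference is just an app setting whose value points at a secret (vault name, secret name, and version below are placeholders; per the note above, the version segment is still required until rotation handling ships):

```
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/<secret-version>)
```

The function then reads the secret like any other app setting, with no Key Vault SDK code required.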
It would be useful to have a form of DI in our C# Functions, so we can consistently inject things like our logger class, or other Autofac DI services, into every C# Function we create.
I'd like to create an SDK package which developers can import into their Function to get access to common functionality via DI, removing boilerplate redundancy.
Any solution for this kind of thing? (323 votes)
We’re happy to announce support for Dependency Injection for C#!
See the following content for reference:
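A minimal sketch of the registration pattern, using the FunctionsStartup API from the Microsoft.Azure.Functions.Extensions package (the IMyService/MyService pair is a hypothetical placeholder):

```csharp
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Register services once; they can then be constructor-injected
            // into any function class in the app.
            builder.Services.AddSingleton<IMyService, MyService>();
        }
    }
}
```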
Currently the queue/subscription trigger only works with queues and subscriptions that do not use sessions (RequiresSession = false). It would be very useful if the runtime would detect that sessions are enabled and adapt accordingly. As far as I can see, the actual function code would not even have to be aware. (160 votes)
We’ve recently shipped support for session-enabled Service Bus queues and subscriptions.
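As a sketch, enabling sessions on a Service Bus trigger is a one-property change in function.json (queue name and connection setting below are placeholders):

```json
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "name": "message",
      "direction": "in",
      "queueName": "session-queue",
      "connection": "ServiceBusConnection",
      "isSessionsEnabled": true
    }
  ]
}
```

As the request hoped, the function body itself does not have to change; the runtime handles session locking per session ID.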
Hello, we have a scenario where we need to trigger a function when a message is added to RabbitMQ. This would be really helpful for using Azure Functions to integrate with other message queuing platforms that support the AMQP protocol. (107 votes)
Thank you for the feedback! This feature is now GA.
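As a sketch, assuming the RabbitMQ extension is installed, a trigger binding looks roughly like this (queue name and connection setting are placeholders):

```json
{
  "bindings": [
    {
      "type": "rabbitMQTrigger",
      "name": "message",
      "direction": "in",
      "queueName": "orders",
      "connectionStringSetting": "RabbitMQConnection"
    }
  ]
}
```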
Integrated support for Azure Functions projects in Visual Studio, including being able to execute and test locally. (104 votes)
Please support writing TypeScript in Azure Functions when creating Node functions. (89 votes)
Please see https://azure.microsoft.com/en-us/blog/improving-the-typescript-support-in-azure-functions/; improved TypeScript support has been released.
Now obviously Google has a bit of an advantage here, what with Chrome being their own tech, but there are so many blogs out there describing complicated Docker solutions just to run headless Chrome in the cloud.
It would be great to get a solution where we could run Puppeteer in much the same way, or allow web apps to run Chrome without having to be inside Docker, so we could manage a pool of Chrome instances without having to manage Docker as well. (60 votes)
We now have the necessary dependencies for running headless Chromium in the Linux Consumption plan! ⚡️ See this article to get started: https://dev.to/azure/running-headless-chromium-in-azure-functions-with-puppeteer-and-playwright-2fgk
Please try it out and file an issue here if you need help: https://github.com/Azure/azure-functions/issues
It would be great to have out-of-the-box integration with Application Insights. The invocation log is OK, but Application Insights is much better.
How I see it:
- HTTP requests as Request telemetry
- TraceWriter.Info/Warning/Verbose as Custom Event
- TraceWriter.Error as Exception
- Input/Outputs as Dependency calls
The instrumentation key and other related configs could be placed in host.json.
Application Insights integration was made generally available last week!
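For reference, in the integration as shipped, the link to Application Insights is made by setting the APPINSIGHTS_INSTRUMENTATIONKEY app setting to your resource's key, rather than through host.json; host.json controls related behavior such as sampling. A sketch of the sampling section in the v2 host.json schema (values are illustrative):

```json
{
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "maxTelemetryItemsPerSecond": 20
      }
    }
  }
}
```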
Azure Functions is awesome! I'd like to propose an improvement for convenience.
Currently, Storage blob triggers do NOT fire in real time. As the docs say:
"The Functions runtime scans log files to watch for new or changed blobs. This process is not real-time."
This is very inconvenient for me, and I find it troublesome to use a queue as a workaround.
AWS Lambda has storage events; they notify in real time and are very easy to set up. (49 votes)
Azure Event Hubs can now be used to accomplish real-time reactions to blob changes. Follow the instructions here to subscribe your event hub to blob updates: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-quickstart
Then, use the event hub trigger to execute your functions.
Thanks for the feedback!
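Once blob updates are flowing into an event hub, the function side is a standard Event Hubs trigger. A sketch in function.json (hub name and connection setting are placeholders; property names follow the v2 schema):

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "events",
      "direction": "in",
      "eventHubName": "blob-events",
      "connection": "EventHubConnection"
    }
  ]
}
```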
It is possible to create a Function from a proper compiled class (.cs file), see https://github.com/Azure/azure-webjobs-sdk-script/wiki/Precompiled-functions.
However, the Visual Studio tooling now available for creating Functions only supports C# Script (.csx files); it is not possible to add a class file to the project.
Please provide VS tooling for precompiled Functions. (47 votes)
Function App v1 could create an OpenAPI (Swagger) file. After the change to v2, creating an OpenAPI file in the portal is "not supported".
Please reinstate this feature. (46 votes)
Additional work will be done in APIM, given our native integration between Functions and Azure API Management.
Python is now available in public preview – https://azure.microsoft.com/en-us/blog/taking-a-closer-look-at-python-support-for-azure-functions/
Currently, a WebSocket-style continuous stream feature (like SignalR) is not possible with Azure serverless. Please add this feature so we can make more use of Azure Functions. (39 votes)
Azure SignalR Service and the SignalR Service bindings for Azure Functions can be used together to send real-time messages over websockets from Azure Functions. See the docs for more details: https://docs.microsoft.com/en-us/azure/azure-signalr/signalr-concept-azure-functions
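As a sketch, the function.json side of the SignalR Service output binding looks roughly like this (hub name is a placeholder; the service connection string comes from the AzureSignalRConnectionString app setting):

```json
{
  "bindings": [
    {
      "type": "signalR",
      "name": "signalRMessages",
      "direction": "out",
      "hubName": "chat",
      "connectionStringSetting": "AzureSignalRConnectionString"
    }
  ]
}
```

Messages written to this binding are broadcast to connected websocket clients without the function app managing any connections itself.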
Sounds like a perfect candidate for Azure Functions:
"A common design pattern in these applications is to track changes made to DocumentDB data, and update materialized views, perform real-time analytics, archive data to cold storage, and trigger notifications on certain events based on these changes. DocumentDB's Change Feed support allows you to build efficient and scalable solutions for each of these patterns." (34 votes)
See Matias’ comment below!
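For reference, the change feed is surfaced today through the Cosmos DB trigger (Cosmos DB being DocumentDB's successor branding). A sketch of the binding, with placeholder database/collection names and connection setting:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "mydb",
      "collectionName": "items",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```

The lease collection tracks the function's position in the change feed, so each change is delivered to the function once as it happens.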
Azure Functions now has a .NET Core runtime, which was announced as a preview at Ignite 2017. You can read more about the support here: https://aka.ms/funccrossplattools
I'd like to be able to call other functions via an internal "ID" and, at worst, not have to go back out to the internet and come back into Azure to call another function. (29 votes)
Azure Durable Functions (now in preview) allows you to invoke another function by its function name. We have evaluated allowing cross-app invocation, but it may have its own constraints; feel free to comment or create an additional item if cross-app or cross-region invocation is needed as well, so we can gauge urgency/demand.
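A minimal sketch of calling a function by name from an orchestrator, using types from the Microsoft.Azure.WebJobs.Extensions.DurableTask package ("SayHello" is a placeholder activity function in the same app):

```csharp
[FunctionName("Orchestrator")]
public static async Task<string> RunOrchestrator(
    [OrchestrationTrigger] DurableOrchestrationContext context)
{
    // Invoke another function in the same app by its FunctionName,
    // without leaving the Functions host or going over the public internet.
    return await context.CallActivityAsync<string>("SayHello", "world");
}
```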
Azure Functions should be able to be triggered from Apache Kafka. The triggered function should be able to be configured for a specific consumer group, with options to explicitly commit the consumer's offset.
Ideally Kafka Streams could be supported, with special support for local storage of KTables. (27 votes)
We now have a Kafka trigger here: https://github.com/azure/azure-functions-kafka-extension.
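As a sketch of a Kafka trigger binding with an explicit consumer group (broker and topic values are placeholders; property names follow the extension's samples):

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "name": "kafkaEvent",
      "direction": "in",
      "brokerList": "BrokerList",
      "topic": "orders",
      "consumerGroup": "functions-consumer"
    }
  ]
}
```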
Currently you can only configure Azure Alerts on the App Service plan that is hosting your Azure Function. However, it would be great to do this at the function level to detect failures and other metrics. (26 votes)
This is now supported: you can create an alert on an Azure Function App metric.
We need Azure Functions running in the Consumption plan (with easy scaling) to be able to use ReportViewer to generate PDF, Excel, and Word exports. It currently appears to be possible only with Azure Functions on an App Service plan: https://azure.microsoft.com/en-us/blog/pdf-generation-and-loading-file-based-certificates-in-azure-websites/ (24 votes)
Hello, Azure Functions on Consumption plan runs in a sandbox documented here: https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox. Some PDF operations are allowed as described there. For those that are not allowed you’ll need an App Service plan.
I went to set up Azure Functions for a new client and had to delete and re-create my application so I could downgrade to runtime 1.0 in order to use a PowerShell function. Even AWS has recognized the proliferation of PowerShell and now allows it in Lambda functions. I have probably thousands of administrative scripts in my archives that could easily be ported to Azure Functions. I'd move to AWS before I'd invest the time to rewrite them all. (18 votes)
PowerShell support is now available: https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-powershell
Thanks for providing feedback.