It would be useful to have a form of DI in our C# Functions, so we can consistently inject things like our logger class, or other Autofac DI services, into every C# Function we create.
I'd like to create an SDK package which developers can import into their Function to get access to common functionality via DI, removing boilerplate redundancy.
Any solution for this kind of thing? (323 votes)
We’re happy to announce support for Dependency Injection for C#!
See the following content for reference:
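As a minimal sketch of what the shipped DI support looks like (assuming the `Microsoft.Azure.Functions.Extensions` package; `IGreeter`/`Greeter` are hypothetical names used for illustration):

```csharp
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    // Runs once at host startup; services registered here can be
    // constructor-injected into every function class in the app.
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // IGreeter/Greeter are placeholder names for illustration.
            builder.Services.AddSingleton<IGreeter, Greeter>();
        }
    }
}
```

A function class can then declare `IGreeter` as a constructor parameter and the host resolves it per the registered lifetime.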
Currently the queue/subscription trigger only works with queues and subscriptions that do not use sessions (RequiresSession = false). It would be very useful if the runtime detected that sessions are enabled and adapted accordingly. As far as I can see, the actual function code would not even have to be aware. (160 votes)
We’ve recently shipped support for session-enabled Service Bus.
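A hedged sketch of what the session-aware trigger looks like (queue name and connection setting name are placeholders; `IsSessionsEnabled` assumes a recent version of the Microsoft.Azure.WebJobs.Extensions.ServiceBus package):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SessionQueueFunction
{
    [FunctionName("ProcessSessionMessage")]
    public static void Run(
        // IsSessionsEnabled tells the runtime the queue requires sessions
        // (RequiresSession = true), so messages within a session are
        // delivered in order.
        [ServiceBusTrigger("my-session-queue",
            Connection = "ServiceBusConnection",
            IsSessionsEnabled = true)] string message,
        ILogger log)
    {
        log.LogInformation($"Received session message: {message}");
    }
}
```

As the request hoped, the function body itself stays unaware of sessions; only the trigger attribute changes.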
Integrated support for Azure Functions projects in Visual Studio, including the ability to execute and test locally. (104 votes)
Please support writing TypeScript in Azure Functions when creating Node functions. (89 votes)
Please see https://azure.microsoft.com/en-us/blog/improving-the-typescript-support-in-azure-functions/ – improved TypeScript support has been released.
It would be great to have out-of-the-box integration with Application Insights. The invocation log is OK, but Application Insights is much better.
How I see it:
- HTTP requests as Request telemetry
- TraceWriter.Info/Warning/Verbose as Custom Event
- TraceWriter.Error as Exception
- Input/Outputs as Dependency calls
The instrumentation key and other related configs could be placed in the host.json:
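For illustration only – in the integration that shipped, the instrumentation key actually lives in the `APPINSIGHTS_INSTRUMENTATIONKEY` app setting rather than host.json, while host.json carries related options such as sampling. A sketch of that host.json section:

```json
{
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "maxTelemetryItemsPerSecond": 5
      }
    }
  }
}
```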
Application Insights integration was made generally available last week!
Azure Functions is awesome! I'd like to propose something for more convenience.
Currently, Storage blob triggers do NOT work in real time. As the documentation says:
"The Functions runtime scans log files to watch for new or changed blobs. This process is not real-time;"
This is very inconvenient for me, and using a queue as a workaround is troublesome.
Amazon Lambda has storage events; they are notified in real time and are very easy to set up. (49 votes)
Azure Event Hubs can now be used to react to blob changes in real time. Follow the instructions here to subscribe your event hub to blob updates: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-quickstart
Then, use the event hub trigger to execute your functions.
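A minimal sketch of that second step, assuming an event hub named `blob-events` is already wired to blob updates and an `EventHubConnection` app setting exists (both names are placeholders):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobChangeFunction
{
    [FunctionName("OnBlobChanged")]
    public static void Run(
        // Fires as events arrive on the hub, instead of relying on the
        // blob-log polling used by the classic blob trigger.
        [EventHubTrigger("blob-events", Connection = "EventHubConnection")]
        string blobEvent,
        ILogger log)
    {
        log.LogInformation($"Blob event payload: {blobEvent}");
    }
}
```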
Thanks for the feedback!
It is possible to create a Function from a proper compiled class (.cs file); see https://github.com/Azure/azure-webjobs-sdk-script/wiki/Precompiled-functions.
However, the Visual Studio tooling currently available for creating Functions only supports C# Script (.csx files); it is not possible to add a class file to the project.
Please provide VS tooling for precompiled Functions. (47 votes)
Python is now available in public preview – https://azure.microsoft.com/en-us/blog/taking-a-closer-look-at-python-support-for-azure-functions/
Sounds like a perfect candidate for Azure Functions:
"A common design pattern in these applications is to track changes made to DocumentDB data, and update materialized views, perform real-time analytics, archive data to cold storage, and trigger notifications on certain events based on these changes. DocumentDB's Change Feed support allows you to build efficient and scalable solutions for each of these patterns." (34 votes)
See Matias’ comment below!
Azure Functions now has a .NET Core runtime, which was announced as a preview at Ignite 2017. You can read more about the support here: https://aka.ms/funccrossplattools
I'd like to be able to call other functions via an internal "ID", and at worst not have to go back out to the internet and then come back into Azure to call another function. (29 votes)
Azure Durable Functions (now in preview) allows you to invoke another function by its function name. We have evaluated allowing cross-app invocation, but it may have its own constraints – feel free to comment or create an additional item if cross-app or cross-region invocation is needed, so we can gauge urgency/demand through votes.
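A hedged sketch of invocation-by-name in a Durable Functions orchestrator (v1-era API; `SayHello` is a placeholder activity name):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class Orchestration
{
    [FunctionName("HelloOrchestrator")]
    public static async Task<string> Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        // The target function is addressed by its [FunctionName] string,
        // with no HTTP round-trip out of the app required.
        return await context.CallActivityAsync<string>("SayHello", "world");
    }

    [FunctionName("SayHello")]
    public static string SayHello([ActivityTrigger] string name)
        => $"Hello, {name}!";
}
```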
Azure functions running on the Consumption plan (with easy scaling) need to be able to use ReportViewer to generate PDF, Excel, and Word exports. This only appears to be possible with Azure Functions on an App Service plan: https://azure.microsoft.com/en-us/blog/pdf-generation-and-loading-file-based-certificates-in-azure-websites/ (24 votes)
Hello, Azure Functions on Consumption plan runs in a sandbox documented here: https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox. Some PDF operations are allowed as described there. For those that are not allowed you’ll need an App Service plan.
I went to set up Azure functions for a new client and had to delete and re-create my application so I could downgrade to runtime 1.0 in order to use a PowerShell function. Even AWS has realized the proliferation of PowerShell and now allows it to be used in Lambda functions. I have probably thousands of administrative scripts in my archives that could easily be ported to Azure functions. I'd move to AWS before I'd invest the time to re-write them all. (18 votes)
PowerShell support is now available. https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-powershell
Thanks for providing feedback.
It would be great to have high-level monitoring available that visualizes all the functions in an instance, similar to how Azure Data Factory does it in its Monitor feature.
The following information would be interesting:
- Failure rate per function
- Dependency metrics per binding, per function _(similar to AI Dependency)_
- Function state _(enabled/disabled)_
- ... (18 votes)
We have enabled the App Insights Application Map to support this functionality. It will show failure rate and dependency metrics per Function App. Read more about it here https://azure.microsoft.com/en-us/blog/introducing-azure-functions-2-0/
Thanks for the feedback!
I want IP restriction in Azure Functions Proxies. My customer has a security policy requiring IP whitelisting. The current workarounds are to host on an App Service plan or use API Management. This feature is essential for enterprise customers. (17 votes)
This is now possible using the “IP Restrictions” functionality available under “Platform Features” > “Networking” > “IP Restrictions”
Keep in mind that this list is a whitelist.
Thanks for the feedback!
Hi, it's great that we can upload binaries and reference DLLs directly, and also reference NuGet packages in project.json: https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-csharp#package-management
But project.json was announced as legacy even for ASP.NET Core; in RC1 for VS2017 you will see that even ASP.NET Core no longer uses project.json.
We need a consistent, up-to-date experience in line with other products. (16 votes)
.NET class libraries are the best and most modern way to manage NuGet packages.
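For comparison, the class-library equivalent of a project.json dependency list is a PackageReference in the .csproj (package names and versions below are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <AzureFunctionsVersion>v2</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
    <!-- Replaces the project.json "dependencies" section -->
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.24" />
    <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
  </ItemGroup>
</Project>
```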
As a person who does ALM, I want the ability to package and deploy my Azure functions using MSDeploy packages so that I can integrate Azure functions into my (VSTS) build pipeline. (15 votes)
This works today! You can Web Deploy directly to the Function App. You can get your publish settings from the App Service Settings blade (Function App Settings > App Service settings > Export Publish Profile)
You can also upload a WebDeploy package via ARM. Here’s a sample: https://gist.github.com/christopheranderson/ec6ab6b1eae152fca5321f4044b33d1d#file-azuredeploy-json-L113
-Chris, Functions PM
Dedicated mode functions now have virtual network and hybrid connection support.
Thanks for your patience!
Please support Python-based Azure Functions running in a Linux Docker container.
Linux Docker containers are already supported, so we are halfway there, and Python is working/supported in the Windows preview.
Currently, if you try to create a simple Python-based Azure Function on the Linux preview, you get the error: HttpTriggerPy1: Object reference not set to an instance of an object.
The use case here is being able to run "enterprise" Python 2 environments with hard dependencies on Linux. I can verify Python runs correctly and can send an example Dockerfile if needed. (14 votes)
Running Python Functions in a Linux-based container is already supported today. More info here – https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-function-linux-custom-image
However, if you’re trying to run Python 2, extending the default image will not work since the language worker only supports 3.6.x or later. Instead, you’ll need to write your own language worker for the required version.
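A hedged Dockerfile sketch for extending the default Python image (the base image tag is illustrative – check the linked doc for the current one):

```dockerfile
# Official Python Functions base image; the tag may differ today.
FROM mcr.microsoft.com/azure-functions/python:2.0

# Copy the function app (host.json plus function folders) into the image.
COPY . /home/site/wwwroot

# Install third-party dependencies into the worker's environment.
RUN cd /home/site/wwwroot && pip install -r requirements.txt
```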
Currently only 32-bit functions are supported. (13 votes)
The option is now enabled in the UI.