Azure Functions
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in other Azure services, SaaS products, and on-premises systems. With Azure Functions, your applications scale based on demand and you pay only for the resources you consume. Azure Functions provides an intuitive, browser-based user interface that lets you create scheduled or triggered pieces of code implemented in a variety of programming languages.
-
console.log works as well as context.log in node.js
In a function written in node.js, console.log should log to the error log/console just as context.log does.
It would allow using the same debug output in code that is run both in standalone node.js and in Functions node.js. It is painful to have to alter the calls when moving code between the environments.
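In the meantime, a per-invocation shim can paper over the difference. This is a hypothetical workaround sketch, not official behavior; `sharedHelper` is a stand-in for your own code reused between standalone node.js and Functions:

```javascript
// Hypothetical workaround: redirect console.log to context.log for the
// duration of one invocation so shared code logs through the Functions
// pipeline instead of disappearing.
function run(context, req) {
    const originalLog = console.log;
    console.log = (...args) => context.log(...args); // route to Functions logger
    try {
        sharedHelper(); // shared code that calls console.log
    } finally {
        console.log = originalLog; // restore so other code is unaffected
    }
    context.done();
}

// Stand-in for code reused between standalone node.js and Functions.
function sharedHelper() {
    console.log('processing item');
}

module.exports = run;
```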
11 votes
We still really want to do this and are evaluating ways it may be possible with Node 12, but for now unplanned is still the proper status.
—Colby
-
Quartz Timer Feature
Please enhance Azure Functions or Azure in general that exposes an API against which we can post quartz like timer jobs.
Would be great to set a timer in Azure that can eventually trigger an Azure Function when executing.
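For the fixed-schedule part, Functions' built-in timer trigger already accepts a six-field CRON expression in function.json; the gap this request describes is a dynamic, Quartz-like API for posting jobs at runtime. A minimal scheduled binding (here firing at the top of every sixth hour; the name is arbitrary) looks like:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 */6 * * *"
    }
  ]
}
```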
8 votes
Still nothing planned, but interested in votes on this. – Jeff
-
Support for queue settings per function
As I was looking for a way to configure the visibility timeout for queue messages I found that the only configuration seemed to be at the host level. The problem is that my queues have different needs and one setting will not work optimally for all queues. I am looking for a way to configure each queue independently.
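For context, today's knobs live in host.json and apply to every queue the app listens to; a sketch of the current host-level settings (values hypothetical):

```json
{
  "queues": {
    "visibilityTimeout": "00:00:30",
    "batchSize": 16,
    "maxDequeueCount": 5
  }
}
```

The request is to allow the same settings per queue binding rather than once per host.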
16 votes
Good request, and thanks for the insight – keep the votes coming. No current plans on this.
– Jeff
-
37 votes
Apologies on the last message – we have started work for controlling checkpoints, not yet for resetting checkpoints; I updated the wrong issue. This is still planned though.
– Jeff
-
Folders for organizing Functions and a TreeView Function List in the Portal
I already have a Function App that has grown larger than works well in the existing Portal interface.
I would like to suggest adding support for folders (similar to solution folders in Visual Studio) that allow one to organize functions. And then a TreeView for the Function list in the Portal.
For example, I have several processes in my Function App. Each process may have one or more steps that are usually individual Functions. It would be great to put all the related Functions together in a folder.
30 votes
This remains unplanned.
—Colby
-
Search input/output bindings
I want to treat Azure Search documents as JSON from Function Apps.
First, input bindings – see the following pseudo function.json.
{
  "bindings": [
    {
      "name": "req",
      "type": "httpTrigger",
      "direction": "in"
    },
    {
      "name": "search_result",
      "type": "search",
      "direction": "in",
      "serviceName": "mybookshelf",
      "indexName": "book",
      "apiKey": "envname-SEARCH_ADMIN_KEY",
      "queryType": "full",
      "searchMode": "all",
      "searchFields": ["id", "title", "author"],
      "search": "title:{title} AND genre:{genre}"
    },
    {
      "name": "res",
      "type": "http",
      "direction": "out"
    }
  ]
}
When an HTTP request arrives, the contents of $search_result look like the following.
[
  …
  {
    "id": "e583e025-01f4-4288-8dec-be6723afe607",
    "title": "Dragon Ball vol.1",
    "author": "Akira TORIYAMA"
  },
  {
    "id": "bb0086f9-e924-450c-b615-f51fb730e339",
    "title": "Dragon Ball vol.2",
…
2 votes
This remains unplanned.
-
The dynamic tier should never run out of sockets
If you have too many connections, you can get SocketExceptions. The dynamic tier was meant to stop us from having to think about server instances, but with a connection limit, the dynamic tier is useless and we are back to the standard service plans.
37 votes
This is an awesome idea, and we’re exploring a few options to make it a reality.
However, the 600-connection limit per instance should be enough for most applications if you’re reusing or closing connections. If you truly need 600 open connections, you are likely to run into the 10-minute timeout per execution.
Even after we add this, you will still need to be mindful of your connection management. Keep the votes coming!
—Alex
-
Support etags for Azure Table Storage read/write operations
I use Python functions to read/write Azure Tables; unfortunately, the ETags are hidden. It would be quite useful to be able to access the ETags to ensure consistency of writes.
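The Table service REST API already supports optimistic concurrency via `If-Match`, so the ask is just to surface the ETag from reads. A hypothetical helper (not any SDK's actual surface) showing how an exposed ETag would be used:

```javascript
// Hypothetical helper: build request headers for a conditional Table Storage
// merge. With If-Match set to the ETag from a prior read, the service rejects
// the update if another writer changed the entity in between.
function buildConditionalMergeHeaders(etag) {
    if (!etag) {
        throw new Error('An ETag from a prior read is required');
    }
    return {
        Accept: 'application/json;odata=nometadata',
        'Content-Type': 'application/json',
        'If-Match': etag, // merge succeeds only if the entity is unchanged
    };
}

module.exports = buildConditionalMergeHeaders;
```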
2 votes
Yearly update: this is still accurate.
-Colby
-
Function App Service support for Azure B2C
If I build a web app using Azure B2C for authentication, I'd like to use it as the authentication provider for Azure Functions too. Although I can set up the same providers, using B2C would be cleaner and enable me to support users with local logins (traditional username and password) too.
42 votes
Just since it’s been a while, I wanted to reconfirm that this is planned.
-
Functions Summary/Status on Function App dashboard/home
It would be helpful to have a dashboard on the Function App home page to display summary/status data about all the Functions in the Function App. For example I currently have to go to each Function individually and click Monitor to see Recent Success or Error Count or to the Manage tab to see if a Function is Enabled.
The current "The faster way to functions" getting started content is really only relevant for new users.
9 votes
This work is planned with an upcoming rewrite of the Azure Functions portal experience. You can expect to see some of the improvements in the next few months.
—Colby
-
Better support for SharedAccessSignature for inputs and outputs
I have an Azure Function with an Azure Table as input, and I was trying to use a SharedAccessSignature in the connection string. I wanted to limit access to a specific table and provide no access to other Azure storage services. However, when I try to run the function, there are errors saying that no Blob endpoint is configured, no Queue endpoint is configured, etc.
I would like to have a connection string with only TableEndpoint and a SharedAccessSignature, and support signatures that limit access to a specific table or even a specific subset of a table (partition key range).…
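For illustration, the shape of the connection string this request is asking for (the account name and token are placeholders):

```
TableEndpoint=https://myaccount.table.core.windows.net/;SharedAccessSignature=<table-scoped-sas-token>
```

Today the runtime seems to expect the blob and queue endpoints to resolve as well, which is what produces the errors above.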
23 votes
Still currently unplanned, good request – keep the votes coming. – Jeff
-
Should support trigger for Azure Files
This would make hybrid integration scenarios trivial for customers with no IT; anybody can drop a file in a share.
66 votes
The below is still accurate, but updating to better reflect the product backlog. Keep the votes coming.
-Colby
-
Support for R
Are there any plans to support R scripts in the future?
80 votes
Thanks for your feedback. Currently, we don’t have any plans for R support in Functions. We will monitor the votes on this request to accordingly inform the priority on our backlog.
-
Sync code from blob storage
I find the current deployment model way too difficult to work with. Please consider supporting auto-sync from blob storage, similar to what aws lambda offers.
4 votes
Just pinging that this is still the proper state for this feature request.
-
Add max calls/per day|hour|minute configuration for throttling
Right now, any light DDoS attack that Azure does not recognize will affect me and my account. I may know that my service shouldn't receive more than 10000 calls per day, but I can't set up limits on incoming requests.
"Daily Usage Quota (GB-Sec)" is not a bad idea, but it's something internal and synthetic to me. Calls per day is a much more natural metric for users.
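For the HTTP case, a front door such as API Management can impose a cap today. A sketch of a rate-limiting policy fragment (the values and counter key are hypothetical):

```xml
<inbound>
    <base />
    <!-- Hypothetical cap: at most 10000 calls per caller per rolling day -->
    <rate-limit-by-key calls="10000" renewal-period="86400"
                       counter-key="@(context.Request.IpAddress)" />
</inbound>
```

Non-HTTP triggers have no equivalent front door, which is the harder half of this request.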
88 votes
Here’s the latest, as there seem to be two types of ask here and so two separate updates. We need comments on whether this issue should close to focus on one or the other:
1. I want to control how many calls my function can make to another API (third-party API rate limiting).
– In all plans we now have a way to specify the max instances. This can limit how far a function app can scale out: https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#limit-scale-out
2. I want to stop my function from triggering more than x times an hour.
– Nothing planned on this in the short term. Using API Management with throttles would be our recommendation for HTTP functions; nothing out of the box for non-HTTP triggers yet.
-
Autocomplete and intellisense on a portal
Hi,
it's cool that we have a source code editor, but without autocomplete and IntelliSense it's really hard to write and debug code directly in the portal.
A few years ago there was the project Monaco in the classic Azure portal. Could you reuse it, or add autocomplete some other way?
14 votes
This remains unplanned.
—Colby
-
Add Auth0 to list of identity providers and allow custom URL in 'URL Authorization Rules
[This is more an App Service issue, but there's no forum for that.]
It's important to me to have the ability to use a custom authentication UI, and Auth0 provides this along with its own database for user storage. Otherwise I have to roll my own. My users do not have any existing accounts (including email) and so need a simple username/password experience (as horrible as this might be for most of us).
A reasonable list of providers is available, but adding Auth0 will open up the possibilities. Auth0 is not only excellent but acts as a proxy to a very…
24 votes
This remains unplanned – please keep the votes coming!
-
Handle Storage Queue/Service Bus messages in batches
I want to be able to handle multiple messages per function call - we are using a third-party web service which works much more efficiently if the messages are passed to it in batches. I know we could use a timer to trigger the function and grab messages from the queue ourselves but if we do that the function won't scale out automatically.
65 votes
Leaving as started because we don’t have full docs out yet, but this is completed. We just need to document and provide samples.
-
SQL Azure trigger support
Azure functions are great, but a lot of application logic is driven by data in SQL Azure DBs. It would be great to have a trigger based on a SQL Azure row data.
110 votes
This is currently on our radar and is now planned work. Thanks for voting!
- Cary, Azure Functions
-
Output to Service Bus Queues Dynamically
I have an Azure Function that binds to an input Event Hub and an output Service Bus namespace. The output binding accepts the name of a queue. However, the event data coming from the Event Hub can potentially go to different queues; the determinant is in the message data. It would be nice to be able to tokenize the queue destination based on event data coming from the Event Hub.
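Binding expressions may get part of the way there: when the trigger payload is JSON, a `{property}` token in an output binding property can resolve from it, though coverage varies by binding. A hedged sketch assuming a `destination` property in the event body (all names and connection settings hypothetical):

```json
{
  "bindings": [
    {
      "name": "eventHubMessage",
      "type": "eventHubTrigger",
      "direction": "in",
      "path": "myeventhub",
      "connection": "EventHubConnection"
    },
    {
      "name": "outputMessage",
      "type": "serviceBus",
      "direction": "out",
      "queueName": "{destination}",
      "connection": "ServiceBusConnection"
    }
  ]
}
```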
7 votes
This remains unplanned.
—Colby