Good request, and thanks for the insight. Keep the votes coming. No current plans for this.
Great ask. Keep the votes coming. Nothing planned short term.
I will copy here some comments I made previously on GitHub (https://github.com/Azure/azure-functions-host/issues/2557) to provide some arguments in favor of this approach:
Using Azure Functions V2, I was able to have Razor Pages running properly inside a Functions host using the TestServer approach: http://www.luckenuik.net/hosting-your-aspnet-core-razor-pages-inside-azure-functions/
This is a proof of concept, but if we develop a production-grade equivalent of TestServer (and proper tooling to ease the job), we should be able to leverage the full power of MVC APIs/Pages inside Azure Functions.
I do understand that Azure Functions is meant to target multiple languages and that this is a .NET only approach, but I believe there is a need for such an approach. Obviously, something built-in Azure Functions would be preferable...
I know there are some people out there against running ASP.NET Core MVC inside Azure Functions, but the complete discussion around "it will take more memory, increase start-up time and add overhead" is meaningless in my view, at least for some scenarios:
- Startup time: we are already facing cold starts today; that's not going away, and adding a few milliseconds to initialize the MVC stack is irrelevant. With the Azure Functions Premium plan, these concerns will be even less relevant, I guess.
- Execution cost: no matter how much memory you are using, for some apps it is still cheaper than having an App Service running 24/7.
- Ramp-up cost: it is way cheaper to take the existing knowledge of developers and let them code as usual with MVC controllers instead of ramping up on Azure Functions.
- Reuse opportunity: a lot of work was done by different projects to extend the ASP.NET Core middleware pipeline with interesting features (e.g. authn/authz); it is a shame that we cannot easily reuse that work with Azure Functions.
- Complexity: if you want to run any business application on Azure Functions, you will face complex code that might take longer than 50 milliseconds. It's not true, in my view, that Azure Functions was designed only for small projects with quick "in and out" executions.
That would help minimize ramp-up time for developers coming from the Web API world.
This would also ease migration of some API based applications to Azure Functions.
Work has started both on moving capabilities that are only available in the CDN supplemental portal (e.g. the rules engine) into the Azure Portal and on providing APIs to support all of these features. This work will be done in multiple phases over the next several months.
What is the current status for this improvement? Configuring CDN rules manually is tedious and error prone.
Rule approval is currently automatic, with no delay. We will follow up on having the 4-hour approval message you see in the CDN supplemental portal updated to remove this confusion. While approval is automatic, it can currently take 90 minutes for updates to propagate to all CDN POPs. Work is under way to reduce this significantly in the next few months.
According to this issue, the approval is still a manual process: https://github.com/MicrosoftDocs/azure-docs/issues/13983#issuecomment-416395573
Yesterday, I posted an update to a rule against a Verizon Premium CDN that took over 8 hours to be approved.
A rule tester UI would enable us to test rules prior to submitting them. It could be something very simple: enter a URL plus request headers, run the rules, and report the returned response headers and the destination of the request at the source.
Currently, to figure out whether something I want to do would work as expected, I have to figure out the rule syntax, add the rule, wait between 10 minutes and 8 hours, test to see if it works, and repeat if it doesn't. A real pain.
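To make the idea concrete, here is a minimal sketch of what such a local rule tester could look like, written in Python. Everything here is hypothetical and illustrative only: the rule shape (a `match` predicate plus an `action` that rewrites the destination and adds response headers) is not the real CDN rules-engine syntax, just a stand-in for it.

```python
from urllib.parse import urlparse

# Hypothetical local CDN rule tester: each rule matches a request
# (URL + headers) and can rewrite the destination and/or add response
# headers. The rule shape and semantics are illustrative only.

def run_rules(rules, url, request_headers):
    """Apply rules in order; return (destination, response_headers)."""
    destination = url
    response_headers = {}
    for rule in rules:
        if rule["match"](destination, request_headers):
            destination, extra = rule["action"](destination, request_headers)
            response_headers.update(extra)
    return destination, response_headers

# Example rule: rewrite /old/* paths to /new/* and tag the response.
rules = [
    {
        "match": lambda url, hdrs: urlparse(url).path.startswith("/old/"),
        "action": lambda url, hdrs: (
            url.replace("/old/", "/new/", 1),
            {"X-Rule-Applied": "old-to-new"},
        ),
    }
]

dest, resp = run_rules(rules, "https://example.com/old/page.html", {})
print(dest)   # https://example.com/new/page.html
print(resp)   # {'X-Rule-Applied': 'old-to-new'}
```

Even something this simple would shorten the feedback loop from hours to seconds, which is the whole point of the request.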
The Verizon CDN approval is human driven: https://github.com/MicrosoftDocs/azure-docs/issues/13983#issuecomment-416395573
Thank you Rasmus for your feedback!
Something similar is in preview, at country level only: https://docs.microsoft.com/en-us/rest/api/maps/geolocation/getiptolocationpreview
You can now use the Azure CDN to access blobs with custom domains over HTTPS. See the following article for instructions on how to do so: https://docs.microsoft.com/en-us/azure/storage/storage-https-custom-domain-cdn. Having talked to a number of customers, we concluded that this solution addresses many scenarios where the need for HTTPS access to blobs with custom domains exists.
Native Azure Storage support for using SSL to access blobs at custom domains is still on our backlog. We would love to hear about your scenarios where using the Azure CDN is not an acceptable solution, either by posting on this thread or sending us an email at firstname.lastname@example.org.
Apologies for the confusion on our part.
This is currently not on our roadmap. However, we will review this item in the future as we prioritize future releases.
Thanks for your suggestion.
Support for creating a container in ARM templates is now complete. We have begun work to support Tables and Queues.
Here is a link to a sample template to create a container: https://azure.microsoft.com/en-us/resources/templates/101-storage-blob-container/
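For reference, a minimal template fragment for a container resource looks roughly like the following. This is a sketch based on the linked sample, not a copy of it: the account name, container name, and `apiVersion` are placeholders, and it assumes the storage account `mystorageaccount` already exists.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts/blobServices/containers",
      "apiVersion": "2019-06-01",
      "name": "mystorageaccount/default/mycontainer",
      "properties": {
        "publicAccess": "None"
      }
    }
  ]
}
```

The container is a child resource of the account's blob service, which is why the `name` takes the three-part `account/default/container` form.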
This is on our long term roadmap.
Work on AAD support for the control plane has started. We will update here as this feature moves toward general availability.
Thank you for your suggestion and votes.
Azure Cosmos DB now provides the ability to access the change feed (https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed). A common way to implement eventing patterns is to:
1. Store every version/change as a separate item.
2. Read the change feed to merge/consolidate changes and trigger appropriate actions downstream.
You can also expire old versions using TTL: https://docs.microsoft.com/en-us/azure/cosmos-db/time-to-live
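The consolidation step can be sketched as follows. This is a simplified Python illustration, not Cosmos DB SDK code: the item shape (`docId`, `version`) is hypothetical, and in practice the items would arrive from the change feed rather than a local list.

```python
# Minimal sketch of step 2: given change-feed items where every version of
# a document is stored as a separate item (step 1), merge them down to the
# latest version per logical document id. The "docId"/"version" fields are
# hypothetical; a real feed would come from the Cosmos DB SDK or a
# change-feed processor.

def consolidate(feed_items):
    """Keep only the highest-version item for each logical document id."""
    latest = {}
    for item in feed_items:
        current = latest.get(item["docId"])
        if current is None or item["version"] > current["version"]:
            latest[item["docId"]] = item
    return latest

feed = [
    {"docId": "order-1", "version": 1, "status": "created"},
    {"docId": "order-1", "version": 2, "status": "paid"},
    {"docId": "order-2", "version": 1, "status": "created"},
]

merged = consolidate(feed)
print(merged["order-1"]["status"])  # paid
```

Downstream actions would then be triggered from the consolidated view, while TTL (as noted above) expires the old per-version items.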
Nothing planned short term, but a great ask. This is an item that could potentially be community-contributed if there is interest in doing so earlier. Hoping we get this planned soon.
Moving this work item to unplanned, as it is clear that this request is now for a global throughput limit.
We do now offer the ability to limit your maximum instances in the Premium plan, which will allow you to avoid swamping downstream resources. https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#plan-and-sku-settings
We prioritized that above a max-calls-per-X limit, because our only mechanism to limit throughput is to outright deny requests above a threshold.
Keep the feedback coming!
Azure Functions Team