Thanks for your feedback. There is no current way to do this. We will consider it in our future planning cycles.
BTW, will something like this - https://azure.microsoft.com/en-us/blog/control-azure-data-lake-costs-using-log-analytics-to-create-service-alerts/ - suffice, or are you looking for a separate mechanism?
Service Principal authentication in ADF is already supported for Azure Data Lake Store (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-datalake-connector - search for "Service principal authentication"). Service Principal authentication in ADF for Azure Data Lake Analytics is being worked on and should be available shortly.
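For reference, an ADF linked service definition using service principal authentication against Azure Data Lake Store looks roughly like the sketch below; the placeholder account name, IDs, and key are illustrative only and should be replaced with your own values:

```json
{
  "name": "AzureDataLakeStoreLinkedService",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
      "servicePrincipalId": "<application-id>",
      "servicePrincipalKey": "<application-key>",
      "tenant": "<tenant-id>",
      "subscriptionId": "<subscription-id>",
      "resourceGroupName": "<resource-group>"
    }
  }
}
```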
Thanks for your suggestions. We had a few clarifying questions so that we can understand your requirements better.
What is your definition of a lambda architecture, please? My understanding is that this is done currently by having multiple readers fork data off the message broker (EventHubs, Kafka, etc.) to support a cold path and a hot path. How would supporting EventHubs as a stream input data type in Azure Data Lake make lambda architectures better?
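To make the multiple-readers point concrete, here is a toy sketch (plain Python, not the Azure SDK; the `Broker` class and its methods are invented for illustration) of how a hot path and a cold path can each keep their own offset into the same append-only log, the way independent consumer groups read from EventHubs or Kafka:

```python
# Toy model of a lambda architecture: one event log, two independent readers.
# The broker below stands in for EventHubs/Kafka; it is not a real client.

class Broker:
    """Minimal append-only log with offset-based reads."""

    def __init__(self):
        self.log = []

    def append(self, event):
        self.log.append(event)

    def read_from(self, offset):
        # Each consumer tracks its own offset, so the hot and cold
        # paths see the same events without interfering with each other.
        return self.log[offset:]


broker = Broker()
for value in [3, 1, 4, 1, 5]:
    broker.append(value)

# Hot path: low-latency running aggregate for dashboards/alerts.
hot_offset = 0
events = broker.read_from(hot_offset)
hot_total = sum(events)
hot_offset += len(events)

# Cold path: archive the raw events for later batch reprocessing.
cold_offset = 0
cold_store = list(broker.read_from(cold_offset))
cold_offset += len(cold_store)

print(hot_total)        # running sum seen by the hot path
print(len(cold_store))  # raw events retained by the cold path
```

Because each path owns its offset, adding a new consumer (say, a second batch job) never disturbs the existing ones; that independence is what makes the fork-from-the-broker pattern work.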
Also, you indicated that you perform data copy operations many times. Can you please explain where you do that, which stores you copy to, and what processing you do on each copy?