Sachin C Sheth

My feedback

  1. 2 votes
    1 comment  ·  Data Lake
    Sachin C Sheth commented  · 

    Thanks for your feedback. There is no current way to do this. We will consider it in our future planning cycles.

    BTW, would something like this - https://azure.microsoft.com/en-us/blog/control-azure-data-lake-costs-using-log-analytics-to-create-service-alerts/ - suffice, or are you looking for a separate mechanism?

  2. 2 votes
    2 comments  ·  Data Lake
    Sachin C Sheth commented  · 

    Service Principal authentication in ADF is already supported for Azure Data Lake Store (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-datalake-connector - search for "Service principal authentication"). Service Principal authentication in ADF for Azure Data Lake Analytics is being worked on and should be available shortly.
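
    For reference, an ADF linked service for Azure Data Lake Store using service principal authentication looks roughly like the sketch below. This is a hedged illustration based on my reading of the linked connector documentation - the placeholder values are yours to fill in, and the documentation remains the authoritative source for the exact schema:

    ```json
    {
      "name": "AzureDataLakeStoreLinkedService",
      "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
          "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
          "servicePrincipalId": "<service principal application id>",
          "servicePrincipalKey": "<service principal key>",
          "tenant": "<tenant id>"
        }
      }
    }
    ```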

  3. 25 votes

    under review  ·  4 comments  ·  Data Lake
    Sachin C Sheth commented  · 

    Hi,

    Thanks for your suggestions. I had a few clarifying questions, so that we can understand your requirements better.

    What is your description of a lambda architecture, please? My understanding is that this is done currently by having multiple readers fork data off from the message broker (EventHubs, Kafka, etc.) to support a cold path and a hot path. How would supporting EventHubs as a stream input data type in Azure Data Lake make lambda architectures better?
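
    The forking pattern described above can be sketched in plain Python. The `Broker` class here is hypothetical - a minimal in-memory stand-in for the consumer-group mechanism a broker like EventHubs or Kafka provides, where each consumer group reads a full, independent copy of the stream:

    ```python
    from queue import Queue

    class Broker:
        """Hypothetical in-memory stand-in for a message broker."""

        def __init__(self):
            self.consumer_groups = {}

        def subscribe(self, group):
            """Register a consumer group and return its private queue."""
            queue = Queue()
            self.consumer_groups[group] = queue
            return queue

        def publish(self, event):
            """Fan the event out to every registered consumer group."""
            for queue in self.consumer_groups.values():
                queue.put(event)

    broker = Broker()
    hot = broker.subscribe("hot-path")    # e.g. real-time alerting
    cold = broker.subscribe("cold-path")  # e.g. batch loads into a data lake

    for i in range(3):
        broker.publish({"reading": i})

    hot_events = [hot.get() for _ in range(3)]
    cold_events = [cold.get() for _ in range(3)]
    ```

    Both paths end up with the same three events, which is the point of the fork: the hot path can process them with low latency while the cold path lands them in durable storage.
    
    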

    Also, you indicated that you are performing data copy operations many times. Can you please explain where you do that, what stores you copy to, and what processing you do on each copy?

    Thanks,
    Sachin Sheth
    Program Manager
    Azure Data Lake
