Nick

My feedback

  1. 1 vote

    Thanks for the valid suggestion. Your feedback is now open for the user community to upvote, which allows us to prioritize your request against our existing feature backlog and gives us insight into the potential impact of implementing the suggested feature.

    Nick shared this idea
  2. 40 votes

    1 comment  ·  Azure Databricks
    Nick supported this idea
  3. 844 votes
    44 comments  ·  Data Factory
    Nick commented:

    This is very important for us: we want to run the SFTP upload on a self-hosted integration runtime because the server we upload to enforces IP whitelisting. I could build the upload in Databricks or Azure Functions, but we can't realistically whitelist the public IPs those services use. The upload needs to originate from our own IP address, which means a copy activity (see the first sketch after this list).

    Nick supported this idea
  4. 119 votes

    13 comments  ·  Azure Analysis Services

    Please see Kay’s comment below. Now that Analysis Services works on Power Query, this of course becomes much easier. However, there is still some work and testing needed to make each connector available, so we will likely evaluate which connectors are better suited to traditional BI and prioritize accordingly.

    Nick commented:

    I would dearly love to see the Azure Databricks Spark connector made available for Azure Analysis Services.

    The workaround at the moment is to create external PolyBase tables in Azure SQL Data Warehouse that read the underlying Parquet files (see the PolyBase sketch after this list). This just adds unneeded complexity and cost to our architecture, as we don't want or need to store any data in Azure SQL DW.

    I haven't seen much progress in Azure Analysis Services lately, so perhaps all the attention is elsewhere at the moment?

    Nick supported this idea
  5. 1,783 votes
    69 comments  ·  Data Factory

    We want to share the great news that ADF has been added to the list of “Trusted Azure services” for Azure Key Vault and Azure Storage (Blob & ADLS Gen2)! You can now enable “Allow trusted Microsoft services” on AKV and Azure Storage for better network security, and your ADF pipelines will continue to run. There are two caveats to pay attention to: (1) for ADF to be treated as one of the “Trusted Microsoft services”, you need to use MSI to authenticate to AKV or Azure Storage in the linked service definition (see the MSI sketch after this list), and (2) if you are running a Mapping Data Flow activity, “Trusted Azure services” is not yet supported for Data Flow, and we are working hard on it.

    What is coming up? Here are the additional enhancements we are making for better network security:
    - Static IP range for Azure Integration Runtime so that…

    Nick commented:

    Yes, we also have the case of an external SFTP site that requires whitelisting to connect.

    Nick supported this idea
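The comments and admin updates above describe concrete setups, so here are hedged sketches of each. First, for the comment on idea 3: a minimal sketch of an SFTP linked service routed through a self-hosted integration runtime, using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, runtime name, host, and credentials below are hypothetical placeholders, not values from this thread.

```python
# Sketch: register an SFTP linked service that connects via a
# self-hosted integration runtime, so the connection originates
# from our own (whitelisted) IP. All names here are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference,
    LinkedServiceResource,
    SecureString,
    SftpServerLinkedService,
)

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

sftp = SftpServerLinkedService(
    host="sftp.partner.example.com",  # hypothetical whitelisted endpoint
    port=22,
    authentication_type="Basic",
    user_name="upload-user",
    password=SecureString(value="<secret>"),
    # The key part: route traffic through the self-hosted IR instead
    # of the Azure integration runtime's shared public IPs.
    connect_via=IntegrationRuntimeReference(reference_name="MySelfHostedIR"),
)

client.linked_services.create_or_update(
    resource_group_name="my-rg",
    factory_name="my-adf",
    linked_service_name="SftpViaSelfHostedIR",
    linked_service=LinkedServiceResource(properties=sftp),
)
```

A copy activity whose sink uses this linked service then runs its data movement on the self-hosted IR, so the SFTP server only has to whitelist that machine's egress IP.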
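Second, for the workaround in idea 4: a rough sketch of the external PolyBase layer over Parquet, driven from Python via pyodbc. The server, database-scoped credential, lake path, and table schema are invented for illustration, and the exact DDL options depend on the SQL DW (dedicated SQL pool) configuration.

```python
# Sketch: the "external PolyBase tables over Parquet" workaround,
# executed against Azure SQL DW via pyodbc. Names are hypothetical,
# and the database-scoped credential is assumed to exist already.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=my-dw.database.windows.net;Database=mydw;"
    "Uid=loader;Pwd=<secret>"
)
conn.autocommit = True  # run PolyBase DDL outside a user transaction
cur = conn.cursor()

# External data source pointing at the lake where Databricks wrote Parquet.
cur.execute("""
CREATE EXTERNAL DATA SOURCE lake_ds WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://curated@mylake.dfs.core.windows.net',
    CREDENTIAL = lake_credential
)""")

# Parquet file format.
cur.execute("CREATE EXTERNAL FILE FORMAT parquet_ff WITH (FORMAT_TYPE = PARQUET)")

# External table reading the Parquet files in place. No data is copied
# into SQL DW; the table exists only so Analysis Services has something
# to connect to, which is the extra layer the comment objects to.
cur.execute("""
CREATE EXTERNAL TABLE dbo.sales_ext (
    sale_id BIGINT,
    amount  DECIMAL(18, 2),
    sold_at DATETIME2
) WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = lake_ds,
    FILE_FORMAT = parquet_ff
)""")
```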
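Finally, for the MSI caveat in the admin update on idea 5: a minimal sketch of linked services that authenticate with the factory's managed identity rather than stored keys, again with hypothetical names. Key Vault linked services always use the factory's managed identity; for Blob storage, supplying only the service endpoint (no connection string, account key, or SAS) is what selects MSI authentication.

```python
# Sketch: linked services using ADF's managed identity (MSI), the
# prerequisite for the "Trusted Microsoft services" firewall exception
# described in the admin update. All names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    AzureKeyVaultLinkedService,
    LinkedServiceResource,
)

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# Key Vault: only the vault URL is needed; ADF authenticates with MSI.
akv = AzureKeyVaultLinkedService(base_url="https://my-vault.vault.azure.net")

# Blob storage: the bare service endpoint (no key/SAS) implies MSI, so
# "Allow trusted Microsoft services" on the storage firewall applies.
blob = AzureBlobStorageLinkedService(
    service_endpoint="https://mystorageacct.blob.core.windows.net"
)

for name, props in [("AkvViaMsi", akv), ("BlobViaMsi", blob)]:
    client.linked_services.create_or_update(
        resource_group_name="my-rg",
        factory_name="my-adf",
        linked_service_name=name,
        linked_service=LinkedServiceResource(properties=props),
    )
```

The factory's managed identity still needs data-plane access (for example a Storage Blob Data Reader role assignment and a Key Vault access policy) before the firewall exception does anything useful.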
