Nick

My feedback

  1. 1 vote

    Thanks for the valid suggestion. Your feedback is now open for the user community to upvote, which allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

    Nick shared this idea  · 
  2. 34 votes

    1 comment  ·  Azure Databricks
    Nick supported this idea  · 
  3. 722 votes
    41 comments  ·  Data Factory
    Nick commented  · 

    This is very important for us, as we want to run the SFTP upload on a self-hosted integration runtime due to IP whitelisting on the server that we upload to. I can create a Databricks or Azure Functions upload, but we can't really whitelist the public IPs used by those services; we need the upload to come from our own IP address, which means a copy activity.
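
    As a rough sketch of the setup being asked for here (not something from the thread): assuming the azure-mgmt-datafactory Python SDK, an SFTP linked service can be bound to a self-hosted integration runtime via connect_via, so the copy activity's upload leaves from our own whitelistable IP. All resource names and credentials below are placeholders.

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.datafactory import DataFactoryManagementClient
        from azure.mgmt.datafactory.models import (
            IntegrationRuntimeReference,
            LinkedServiceResource,
            SecureString,
            SftpServerLinkedService,
        )

        # Placeholder subscription; authentication via azure-identity.
        client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

        sftp_ls = SftpServerLinkedService(
            host="sftp.partner.example.com",  # server that whitelists our IP
            port=22,
            authentication_type="Basic",
            user_name="upload-user",
            password=SecureString(value="<secret>"),
            # Bind the connection to a self-hosted IR so the copy activity's
            # outbound traffic originates from our own network rather than a
            # shared Azure public IP.
            connect_via=IntegrationRuntimeReference(reference_name="MySelfHostedIR"),
        )

        client.linked_services.create_or_update(
            "my-resource-group",
            "my-data-factory",
            "SftpViaSelfHostedIR",
            LinkedServiceResource(properties=sftp_ls),
        )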

    Nick supported this idea  · 
  4. 114 votes

    12 comments  ·  Azure Analysis Services

    Please see Kay’s comment below. Now that AS works on PQ, this of course becomes much easier. However, there is still some work and testing needed to make each connector available, so we will likely evaluate which connectors are better suited to traditional BI and prioritize accordingly.

    Nick commented  · 

    I would dearly love to see the Azure Databricks Spark connector available for Azure Analysis Services.

    The workaround at the moment is to create external PolyBase tables in Azure SQL Data Warehouse that read the underlying Parquet files. This just adds unneeded complexity and cost to our architecture, as we don't want/need to store any data in Azure SQL DW.

    I haven't seen much progress at all in Azure Analysis Services lately, so perhaps all the attention is elsewhere at the moment?
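
    For reference, a loose sketch of the PolyBase workaround described above, issued from Python via pyodbc; the server, lake location, and table definition are all placeholders, and depending on the setup a database-scoped credential for the storage account may also be needed.

        import pyodbc

        # Placeholder connection to the Azure SQL Data Warehouse instance.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=mydw-server.database.windows.net;DATABASE=mydw;"
            "UID=<user>;PWD=<password>",
            autocommit=True,
        )
        cur = conn.cursor()

        # External data source pointing at the lake folder Databricks writes Parquet to.
        cur.execute("""
            CREATE EXTERNAL DATA SOURCE LakeStore
            WITH (TYPE = HADOOP,
                  LOCATION = 'abfss://curated@mylake.dfs.core.windows.net')
        """)

        # Parquet file format used by the external table.
        cur.execute("CREATE EXTERNAL FILE FORMAT ParquetFormat WITH (FORMAT_TYPE = PARQUET)")

        # External table over the Parquet folder, which Analysis Services can then
        # read through SQL DW in the absence of a native Databricks connector.
        cur.execute("""
            CREATE EXTERNAL TABLE dbo.SalesExternal (
                SaleId INT,
                Amount DECIMAL(18, 2),
                SaleDate DATE
            )
            WITH (LOCATION = '/sales/',
                  DATA_SOURCE = LakeStore,
                  FILE_FORMAT = ParquetFormat)
        """)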

    Nick supported this idea  · 
  5. 1,510 votes
    57 comments  ·  Data Factory

    Thank you for your suggestion. We understand it is super important to be able to whitelist a specific list of IPs for ADF as part of firewall rules and avoid opening network access to everything. We are working on this with high priority. Once this is ready, we will also add ADF to the list of “Trusted Azure services” for Azure Storage and Azure SQL DB/DW.
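
    Purely as an illustration of the kind of firewall rule being discussed (not an official workaround): assuming the azure-mgmt-sql Python SDK, a server-level rule can open a specific IP range; the documentation-range addresses below are placeholders for whatever ADF or self-hosted IR outbound IPs need whitelisting.

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.sql import SqlManagementClient
        from azure.mgmt.sql.models import FirewallRule

        # Placeholder subscription and resource names.
        sql_client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

        # Open a specific IP range instead of allowing all Azure services;
        # substitute the real outbound addresses for the 203.0.113.0/24 placeholder.
        sql_client.firewall_rules.create_or_update(
            resource_group_name="my-resource-group",
            server_name="my-sql-server",
            firewall_rule_name="allow-adf-outbound",
            parameters=FirewallRule(
                start_ip_address="203.0.113.0",
                end_ip_address="203.0.113.255",
            ),
        )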

    Nick commented  · 

    Yes, we also have the case of an external SFTP site that requires whitelisting to connect.

    Nick supported this idea  · 
