Azure Databricks

Azure Databricks is an Apache Spark-based analytics platform optimized for the Microsoft Azure cloud services platform. Designed in collaboration with the founders of Apache Spark, Databricks is integrated with Azure to provide one-click setup, streamlined workflows, and an interactive workspace that enables collaboration between data scientists, data engineers, and business analysts.

We would love to hear any feedback you have for Azure Databricks.
For more details about Azure Databricks, see our documentation page.

  1. Tags should be inherited from the deployment resource group

    We've been trying to use tags in our environment to identify resources for billing purposes. We've put a policy in place that forces new resources to inherit tags from their resource group if none are given explicitly. Most Azure resources are handled fine, but we've discovered that the tags are not inherited by the resources Databricks creates in the managed resource group. Instead, we have to assign the tags to each Databricks cluster individually. I'd like new clusters to inherit the tags of either the managed resource group or the Databricks service itself.
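Until tag inheritance is supported, billing tags can be applied per cluster at creation time through the Clusters API `custom_tags` field. A minimal sketch (the tag values, cluster settings, and workspace endpoint are illustrative placeholders, not the poster's actual environment):

```python
# Workaround sketch: carry resource-group billing tags onto each new cluster
# via the Clusters API "custom_tags" field. Values here are placeholders.
COST_TAGS = {"CostCenter": "1234", "Environment": "prod"}

def cluster_spec_with_tags(name, tags):
    """Build a Clusters API 2.0 create payload that carries billing tags."""
    return {
        "cluster_name": name,
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
        "custom_tags": dict(tags),  # propagated to the managed VMs for billing
    }

# The payload would then be POSTed to {workspace-url}/api/2.0/clusters/create
# with an "Authorization: Bearer <token>" header.
spec = cluster_spec_with_tags("etl-cluster", COST_TAGS)
```

This only covers clusters created through the API; clusters created in the UI still need the tags set by hand, which is exactly the gap this idea asks to close.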

    1 vote · 0 comments
  2. Job creation automation

    Is there any way to create Azure Databricks jobs in an automated way? Currently I have to create the jobs manually in every environment.
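Jobs can be created programmatically through the Jobs REST API rather than the UI. A minimal sketch, assuming a notebook task on an existing cluster (the names, paths, and IDs below are placeholders):

```python
# Sketch: creating a Databricks job via the Jobs REST API instead of the UI.
# Job name, notebook path, and cluster ID below are placeholders.
import json

def job_payload(name, notebook_path, existing_cluster_id):
    """Build a Jobs API 2.0 create request body for a notebook task."""
    return {
        "name": name,
        "existing_cluster_id": existing_cluster_id,
        "notebook_task": {"notebook_path": notebook_path},
    }

# POST this body to {workspace-url}/api/2.0/jobs/create with an
# "Authorization: Bearer <token>" header; the response contains the job_id.
body = json.dumps(job_payload("nightly-etl", "/Repos/etl/main", "0301-123456-abc123"))
```

Because the payload is plain JSON, the same definition can be replayed against each environment's workspace URL from a deployment pipeline.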

    1 vote · 0 comments
  3. Show the row count for rows returned in notebooks

    The notebook shows the data returned but not the number of rows; otherwise I have to rerun the command with a select count(*) on it.

    This should be easy to implement.
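In the meantime, a small helper can print the total alongside the preview so the command does not have to be rerun with `count(*)`. A hedged sketch (`df` stands for any Spark DataFrame; `count()` is a second action, so caching first may help on large tables):

```python
# Workaround sketch: show a row count next to the preview instead of
# re-running the query with SELECT COUNT(*). Works with any object that
# exposes Spark-style limit()/collect()/count(), e.g. a Spark DataFrame.
def preview_with_count(df, n=10):
    rows = df.limit(n).collect()
    total = df.count()  # a second action; consider df.cache() for big tables
    print(f"{total} rows total (showing first {len(rows)})")
    return rows
```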

    6 votes · 0 comments
  4. Azure Databricks should be FedRAMP compliant across all US regions

    Azure Databricks seems to have every compliance certification other than FedRAMP. I would think it would not be difficult to become FedRAMP compliant, which would allow government data to be processed on the platform.

    4 votes · 0 comments
  5. Can we access Event Hubs in Databricks using a service principal?

    The Event Hubs documentation shows that it can be accessed using a service principal, and I can also access Event Hubs with a service principal through the Python client libraries.

    Can we access Event Hubs in Databricks using a service principal instead of SAS?
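One route is to install the `azure-identity` and `azure-eventhub` packages on the cluster and authenticate with a `ClientSecretCredential` rather than a SAS string. A hedged sketch, with the namespace, hub name, and all credentials as placeholders:

```python
# Sketch: Event Hubs access with an AAD service principal instead of SAS.
# Namespace and hub name are placeholders.
def producer_kwargs(namespace, hub):
    """Keyword arguments for EventHubProducerClient under AAD auth."""
    return {
        "fully_qualified_namespace": f"{namespace}.servicebus.windows.net",
        "eventhub_name": hub,
    }

# With azure-identity and azure-eventhub installed on the cluster,
# the wiring would look like:
#   from azure.identity import ClientSecretCredential
#   from azure.eventhub import EventHubProducerClient
#   cred = ClientSecretCredential(tenant_id, client_id, client_secret)
#   producer = EventHubProducerClient(credential=cred,
#                                     **producer_kwargs("myns", "myhub"))
```

The service principal needs an Event Hubs data-plane role (for example "Azure Event Hubs Data Sender") on the namespace for this to authorize.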

    1 vote · 0 comments
  6. Implement access token auto refresh when using credential passthrough

    When a cluster is configured with credential passthrough, we get an access-denied error after a notebook has been running for one hour, because the AD access token expires. It would be nice to have an access-token auto-refresh feature, so that an Azure Active Directory admin does not need to increase the AccessTokenLifetime for users.

    This feature is also cited in a comment here: https://feedback.azure.com/forums/909463-azure-databricks/suggestions/36879865-enable-azure-ad-credential-passthrough-to-adls-gen

    2 votes · 0 comments · Strong Feedback
  7. Security level when connecting Databricks to other Azure services

    Hello

    the [following page](https://docs.microsoft.com/fr-fr/azure/databricks/administration-guide/cloud-configurations/azure/vnet-inject) states :

    "VNet injection, enabling you to:
    Connect Azure Databricks to other Azure services (such as Azure Storage) in a more secure manner using service endpoints."

    Data Factory is now a 'Trusted Service' in the Azure Storage and Azure Key Vault firewalls: we can connect to those services as a 'Trusted Service' using the Data Factory managed identity and the firewall setting 'Allow trusted Microsoft services…'.

    Could you please explain why using "service endpoints is more secure" than using 'Trusted Service'?

    Reference: [Data Factory is now a 'Trusted Service' in Azure Storage and Azure Key…

    1 vote · 0 comments
  8. Disabling 'import library' options

    I want to disable all possible ways of installing libraries, covering init scripts, the UI, the REST API, and Conda.

    1 vote · 0 comments · Strong Feedback
  9. Timestamp for each command

    It would be very helpful to see the exact timestamps of when a command started and finished processing, not only the runtime in milliseconds.
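Until the UI exposes this, a small wrapper can record wall-clock start and finish times around a command. A minimal sketch:

```python
# Workaround sketch: capture start/finish timestamps around a command,
# since the notebook UI only shows the elapsed runtime.
from datetime import datetime, timezone

def timed(fn, *args, **kwargs):
    """Run fn, printing UTC start and finish timestamps around it."""
    start = datetime.now(timezone.utc)
    result = fn(*args, **kwargs)
    end = datetime.now(timezone.utc)
    print(f"started  {start.isoformat()}")
    print(f"finished {end.isoformat()}")
    return result
```

For example, `timed(lambda: df.count())` in a notebook cell prints both timestamps before returning the count.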

    2 votes · 0 comments · Strong Feedback
  10. Personal Access Token Security Improvements

    Most Databricks users end up needing to generate a Personal Access Token - which I am guessing is why Microsoft started to default that setting to ON.

    The problem is, from an Access Control perspective these tokens present a massive risk to any organization because there are no controls around them.

    These tokens allow direct access to everything the user has access to and all it takes to cause a major data breach is for one user to accidentally post one of these tokens on a public forum or GitHub.

    Here are a few specific issues:
    1. Even though conditional…

    1 vote · 0 comments · Strong Feedback
  11. Azure diagnostic logs are collected with up to a 24-hour delay, so alerts cannot be used

    As the documentation says:
    "On any given day, Azure Databricks delivers at least 99% of diagnostic logs within the first 24 hours, and the remaining 1% in no more than 72 hours."
    Refer: https://docs.microsoft.com/en-us/azure/databricks/administration-guide/account-settings/azure-diagnostic-logs#diagnostic-log-delivery

    In this case, if logs are sent to Log Analytics, log search alerts cannot be used to monitor those logs due to the unpredictable delay. This has been reported by multiple customers; we hope this can be improved.

    4 votes · 0 comments · Strong Feedback
  12. ARM template with Databricks inside a VNet can only be deployed once

    Cannot redeploy an ARM template for Databricks with VNet integration. More details here: https://github.com/Azure/azure-quickstart-templates/issues/6670

    Actually, I get the same error when issuing an az network nsg create command for an already existing NSG (used by Databricks).

    1 vote · 0 comments
  13. Azure databricks cluster creation via Terraform

    I need help deploying an Azure Databricks cluster using a Terraform template.
    With the azurerm_databricks_workspace resource I am able to launch the Databricks workspace, but I am having trouble with cluster creation and cannot find any module or argument for it.
    Please help me fix this.

    1 vote · 0 comments
  14. Python example for direct access using the storage account access key

    The docs at https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-datalake-gen2 show an example of using a storage account access key in Scala; is there an example in Python?
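The Scala snippet translates almost directly to PySpark: set the account key in the Spark conf, then read through the `abfss://` URI. A sketch (the storage account, container, and secret-scope names are placeholders):

```python
# Sketch: the PySpark equivalent of the Scala access-key example.
# Storage account, container, and secret scope names are placeholders.
def account_key_conf(storage_account):
    """Spark conf key holding an ADLS Gen2 storage-account access key."""
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

# In a notebook:
#   spark.conf.set(account_key_conf("mystorageacct"),
#                  dbutils.secrets.get(scope="my-scope", key="storage-key"))
#   df = spark.read.csv(
#       "abfss://mycontainer@mystorageacct.dfs.core.windows.net/path/to/data")
```

Storing the key in a secret scope (as shown) avoids embedding it in the notebook itself.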

    1 vote · 0 comments
  15. NCv3 in North Europe

    Tesla K80 cards are very slow and lack mixed-precision support; furthermore, they are incompatible with NVIDIA RAPIDS because they are so old.
    Google doesn't even give away K80s for free in Colab unless all the P100s are taken.
    Having only K80s available in North Europe is outrageous.
    We've had to shuffle data and models back and forth between North and West Europe to train models in West Europe. This is very cumbersome and wastes a lot of our time.
    Please add contemporary GPUs in North Europe.

    1 vote · 0 comments
  16. Support for enqueuing job clusters onto an instance pool

    Using instance pools to optimize the runtime of smaller, automated jobs is currently at odds with the built-in scheduling system.

    This is because an instance pool will simply reject a job for which it can't immediately procure the required amount of nodes.

    This is a proposal for an enqueuing behavior such that an automated job would instead wait (possibly with a configurable upper time limit) for resources to become available. The requirement would then be that either the minimum autoscaling configuration or the fixed number of nodes is satisfied at the time of job start.

    Having this functionality…
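In the meantime, the proposed wait-with-deadline behavior can be approximated client-side by retrying the submission with a poll interval. A sketch under the assumption that the submit call (standing in for the Jobs runs/submit endpoint) signals pool exhaustion with an exception; `PoolExhausted` is a hypothetical name for illustration:

```python
# Sketch of the proposed enqueuing behavior, implemented client-side:
# retry a run submission until the pool can satisfy it or a deadline passes.
# "submit" stands in for a call to the Jobs API; PoolExhausted is hypothetical.
import time

class PoolExhausted(Exception):
    """Raised by submit() when the instance pool has no free nodes."""

def submit_with_queue(submit, timeout_s=600, poll_s=30):
    """Call submit(), retrying every poll_s seconds until timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while True:
        try:
            return submit()
        except PoolExhausted:
            if time.monotonic() + poll_s > deadline:
                raise  # configurable upper time limit exceeded
            time.sleep(poll_s)
```

A built-in queue would of course be preferable, since a client-side loop cannot order competing jobs fairly.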

    1 vote · 0 comments · Strong Feedback
  17. Active Directory authentication enabled on SQL connectors (Azure Data Warehouse, SQL Server, etc.)

    The Spark SQL connector in Scala and Python should be able to connect using Active Directory authentication.
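One partial workaround is that the Microsoft SQL Server JDBC driver accepts `authentication=ActiveDirectoryPassword` in the connection URL, which Spark's generic JDBC reader can pass through. A hedged sketch (server and database names are placeholders, and the driver plus its ADAL/MSAL dependencies must be on the cluster):

```python
# Workaround sketch: AAD password authentication through the generic JDBC
# reader, using the SQL Server JDBC driver's URL property.
# Server and database names are placeholders.
def aad_jdbc_url(server, database):
    """JDBC URL requesting Active Directory password authentication."""
    return (f"jdbc:sqlserver://{server}.database.windows.net:1433;"
            f"database={database};authentication=ActiveDirectoryPassword")

# In a notebook:
#   df = (spark.read.format("jdbc")
#         .option("url", aad_jdbc_url("myserver", "mydb"))
#         .option("dbtable", "dbo.mytable")
#         .option("user", "user@contoso.com")
#         .option("password", pw)
#         .load())
```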

    2 votes · 0 comments
  18. Databricks Service principal per workspace for specific KeyVault access

    Databricks currently accesses Key Vault from the control plane and uses the same AzureDatabricks service principal for ALL Databricks workspaces in the tenant.

    At present, if you create a secret scope in workspace A on Key Vault A and a new secret scope in workspace B on Key Vault B, the Azure Databricks service principal will have access to both key vaults. Therefore, provided you are privileged enough to know the details (resource URI) of the key vaults, you can create a scope from your own Databricks workspace C and get access to all the keys!

    It should be possible to specify an…

    18 votes · 0 comments · Strong Feedback
  19. Please add a feature so that table access control can be used with R

    At the moment, table access control can be used with Python and SQL only.
    Please add the ability to use table access control with R as well.

    4 votes · 0 comments
  20. Allow categorizing Azure Databricks costs by cluster name

    Allow us to see how much each cluster is spending, so we can better manage the costs relative to specific activities.

    3 votes · 1 comment