Azure Databricks

Azure Databricks is an Apache Spark-based analytics platform optimized for the Microsoft Azure cloud services platform. Designed in collaboration with the founders of Apache Spark, Azure Databricks is integrated with Azure to provide one-click setup, streamlined workflows, and an interactive workspace that enables collaboration between data scientists, data engineers, and business analysts.

We would love to hear any feedback you have for Azure Databricks.
For more details about Azure Databricks, try our documentation page.

  1. New table rendering is much more difficult to read

    Whatever update was just made to change how tables are displayed in Databricks notebooks has been a big hit to quality of life. The text in the tables seems smaller and less bold, and it is far more difficult for me to read now.

    1 vote  ·  0 comments  ·  Strong Feedback
  2. Stop charging for Azure Databricks 14-Day free trial

    I received a large bill from Microsoft for June 2020, in which I was charged $1,096.40. As I did more cost analysis in the Azure portal, I found that I had been charged for the Azure Databricks trial.
    I followed Microsoft documentation to learn about Databricks, specifically the section "Create an Azure Databricks workspace and cluster" on this page: "https://docs.microsoft.com/en-us/learn/modules/describe-azure-databricks/3-create-workspace-cluster". As mentioned in that documentation, I signed up for the Azure Databricks free trial for 14 days, as stated on the above page: "Pricing Tier: Trial (Premium - 14 days Free DBUs). You must…

    1 vote  ·  0 comments
  3. H2O Flow or H2O Flow Equivalent in Azure

    The H2O Flow GUI works with Databricks on AWS but not on Azure. It auto-populates a huge number of interactive charts and metrics for post-model evaluation via a localhost URL. I would like to request that H2O Flow be enabled in Azure, and further request that Databricks add much more interactive, auto-generated content like H2O Flow does. This includes interactive ROC curves where you can traverse the confusion matrix by selecting any point on the curve, cross-validation data set scores, and variable importance.

    1 vote  ·  0 comments
  4. Allow tag modification in the managed resource group

    We have noticed that every managed resource group of every Databricks workspace has a deny assignment that prevents modifying the resource group's tags after creation. This often causes problems with governance and monitoring of the managed resources. Its purpose doesn't seem to be documented anywhere, and it doesn't seem like it can be bypassed.
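
    A minimal sketch of the kind of tag update the deny assignment blocks, assuming the azure-identity and azure-mgmt-resource packages; the subscription ID and managed resource group name are placeholders:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Merging a tag onto the Databricks-managed resource group like this is
    # what the deny assignment currently rejects with an authorization error.
    client.resource_groups.update(
        "databricks-rg-<workspace-name>-<suffix>",   # placeholder managed RG name
        {"tags": {"costCenter": "1234"}},
    )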

    1 vote  ·  0 comments
  5. Allow sandboxes so that new users can follow this through

    Allow sandboxes so that new users can follow this through

    1 vote  ·  0 comments
  6. Improvements needed in Databricks diagnostics settings

    Improvements are needed in the Databricks diagnostics settings:

    Log Analytics table name: DatabricksJobs

    1. Include the job name in the logs.
    2. Include the job run duration in the logs.

    Log Analytics table name: DatabricksClusters

    1. With respect to Databricks clusters, the logs are populated only if an operation is performed.
    2. To know the availability status of the cluster, it would be helpful to have some kind of availability log entry once every 5 minutes or once a minute.
    3. Along with the availability log, it would also be helpful to know the number of nodes that are currently running in the cluster. The existing log entry…
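
    For reference, a minimal sketch of pulling DatabricksJobs entries from Log Analytics today, assuming the azure-identity and azure-monitor-query packages; the workspace ID is a placeholder, and the projected columns reflect the current schema, which notably lacks the job name and run duration requested above:

    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())

    # DatabricksJobs currently exposes fields such as ActionName and
    # RequestParams, but no dedicated job-name or duration column.
    query = """
    DatabricksJobs
    | where TimeGenerated > ago(1d)
    | project TimeGenerated, ActionName, RequestParams, Response
    """

    response = client.query_workspace(
        workspace_id="<log-analytics-workspace-id>",   # placeholder
        query=query,
        timespan=timedelta(days=1),
    )

    for table in response.tables:
        for row in table.rows:
            print(row)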

    1 vote  ·  0 comments  ·  Strong Feedback
  7. A button to clear all failed commands listed on the side

    A button to clear all failed commands listed on the side

    1 vote  ·  0 comments
  8. Support library/wheel override

    I have one Databricks cluster set up to run jobs. Each time I submit a job, I specify the wheel file in the libraries argument passed to Databricks. It only works the first time; subsequent jobs receive a message like the following:
    20/06/16 02:47:05 WARN SparkContext: The jar /local_disk0/tmp/AI..-2.0.0-py3-none-any.whl has been added already. Overwriting of added jars is not supported in the current version.

    Since I often need to modify the above AI..-2.0.0-py3-none-any.whl, it would be great to allow overwriting a library with the same name for each job execution.
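
    A minimal sketch of the kind of job submission described above, assuming the Jobs API 2.0 runs/submit endpoint; the workspace URL, token, cluster ID, wheel path, and notebook path are placeholders:

    import requests

    payload = {
        "run_name": "wheel-job",
        "existing_cluster_id": "<cluster-id>",                   # placeholder
        "libraries": [{"whl": "dbfs:/<path-to-wheel>.whl"}],     # placeholder wheel path
        "notebook_task": {"notebook_path": "/Shared/run_job"},   # placeholder notebook
    }

    resp = requests.post(
        "https://<region>.azuredatabricks.net/api/2.0/jobs/runs/submit",
        headers={"Authorization": "Bearer <personal-access-token>"},
        json=payload,
    )
    print(resp.json())

    Until overwriting is supported, one workaround is to put a version or build number in the wheel filename so that each upload is registered under a new name.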

    1 vote  ·  0 comments
  9. Generalize AuthN passthrough for specific AuthZ endpoints

    Currently the credential passthrough features offer limited utility.

    Aim: generalize the feature so that an AD-authenticated user (presumably with a bearer token) can access a wide variety of Azure service endpoints with great admin configurability.

    Most endpoints should be part of the regular service, with some needing a premium subscription.

    There are many such endpoints. The current one we're encountering is the Azure Artifacts feed endpoint. With the ability to configure automatic AuthZ (bearer token -> access token), we get per-notebook installation of packages with per-user credentialing. This appears to align well with V7ML's addition of %pip and %conda magic…
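
    As an illustration, a minimal notebook-cell sketch of the manual approach this would replace: supplying Azure Artifacts credentials to %pip by hand. The package name, organization, project, feed, and personal access token are placeholders; with generalized passthrough the credential would instead be derived from the signed-in user's AAD bearer token.

    %pip install <my-internal-package> --index-url https://build:<personal-access-token>@pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/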

    1 vote  ·  0 comments
  10. Microsoft.DataBricks/workspaces/parameters.storageAccountName does not work.

    Microsoft.DataBricks/workspaces/parameters.storageAccountName is not working. Can you please bring this feature back? We would like to keep our naming convention standards.

    1 vote  ·  0 comments
  11. Canceling a command could show the output of the last run

    It would be nice if canceling a command before it starts to execute showed the output of the last run, as sometimes I forget to save it somewhere and then accidentally run the command again, and the output is cleared.

    1 vote  ·  0 comments
  12. Query all table ACLs for a principal, database etc.

    It would be really helpful to be able to retrieve all the explicit grants that a principal has.

    At present you can run:
    show grant <principal> on <data_object>

    In order to view all the permissions a user has, it would be good to have:
    show grant <principal>
    or
    show grant <principal> on *
    or
    show all grants for <principal>

    and, for the case where you would like to get the ACLs of a database and all objects beneath it, something like:

    show grant on DATABASE recursive
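
    A minimal sketch of the per-object workaround this would replace, assuming a notebook where spark and display are available and table ACLs are enabled; the principal and database names are placeholders, and the exact result columns depend on the ACL implementation in use:

    from functools import reduce

    principal = "`user@example.com`"   # placeholder principal
    database = "default"               # placeholder database

    # Collect grants on the database itself, then on every table beneath it.
    grants = [spark.sql(f"SHOW GRANT {principal} ON DATABASE {database}")]
    for row in spark.sql(f"SHOW TABLES IN {database}").collect():
        grants.append(
            spark.sql(f"SHOW GRANT {principal} ON TABLE {database}.{row.tableName}")
        )

    display(reduce(lambda a, b: a.union(b), grants))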

    1 vote  ·  0 comments
  13. Give clusters with RStudio installed the ability to record and store inactivity logs and to work with the auto-terminate option enabled

    Give clusters with RStudio installed the ability to record and store inactivity logs and to work with the auto-terminate option enabled.

    1 vote  ·  0 comments  ·  Strong Feedback
  14. Official PowerShell module for Azure Databricks

    An official PowerShell module for Databricks management would be very useful. There are some community-created modules, but they are not updated in a timely manner. Since PowerShell can run on both Windows and Linux, this could be a universal tool for interacting with the API, which could reduce development time for all CI/CD tasks.

    1 vote  ·  0 comments
  15. 1 vote  ·  0 comments
  16. Revision history disappears

    There seems to be a "perfect" window size where the revision history option disappears. When the window is small you get the small history icon; larger windows get the full button. But just between the two it disappears entirely.

    1 vote  ·  0 comments  ·  Strong Feedback
  17. Allow accessing secret versions when using a Key Vault-backed scope

    Currently the dbutils.secrets.get method only accepts the name of a secret scope and a secret. As Azure Key Vault has built-in secret version handling using version IDs, it would be useful to add an additional input such as 'version' for Key Vault-backed scopes so that a specific version of a secret can be accessed.

    Example:

    dbutils.secrets.get("kvscope", "secretname", "6a1d96ead4b3422d83eb52d582005c8a")
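
    For comparison, a minimal sketch of the current workaround of going to Key Vault directly with the azure-identity and azure-keyvault-secrets packages, whose get_secret already accepts a version; the vault URL is a placeholder, and this bypasses the secret scope entirely, which is what the proposed extra argument would avoid:

    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(
        vault_url="https://<vault-name>.vault.azure.net",   # placeholder vault
        credential=DefaultAzureCredential(),
    )

    # Retrieve a specific version of the secret, as requested above for
    # dbutils.secrets.get.
    secret = client.get_secret("secretname", version="6a1d96ead4b3422d83eb52d582005c8a")
    print(secret.name, secret.properties.version)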

    1 vote  ·  0 comments  ·  Strong Feedback
  18. Multiple clusters with the same name/configuration shouldn't be allowed

    Currently Databricks creates multiple clusters even when the name and configuration are the same.

    There should be a validation mechanism that does not allow a new cluster with the same name if the configuration is also the same.
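
    A minimal sketch of the client-side pre-check that such validation would replace, assuming the Clusters API 2.0 list endpoint; the workspace URL, token, and cluster name are placeholders:

    import requests

    base_url = "https://<region>.azuredatabricks.net"                # placeholder
    headers = {"Authorization": "Bearer <personal-access-token>"}    # placeholder

    new_cluster_name = "etl-cluster"   # placeholder name for the cluster to create

    existing = requests.get(f"{base_url}/api/2.0/clusters/list", headers=headers).json()
    names = {c["cluster_name"] for c in existing.get("clusters", [])}

    if new_cluster_name in names:
        raise ValueError(f"A cluster named {new_cluster_name!r} already exists")
    # Otherwise, proceed to POST /api/2.0/clusters/create with the desired configuration.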

    1 vote  ·  0 comments  ·  Strong Feedback
  19. Heiner Marketing Solutions

    20403 Encino Ledge #592691, San Antonio, TX 78259

    210-405-4499

    http://heinermarketingsolutions.com/

    info@heinermarketingsolutions.com

    We are a digital marketing company that specializes in helping local small business owners with their marketing needs by generating leads online. If you are looking for an SEO specialist who wants your business to grow, we are here to help make that dream a reality. We are here to help you dominate your local market. This is the next step to invest in your business.

    Business Hours M-F 8-5pm

    1 vote  ·  0 comments
  20. Storage Injection

    Storage Injection

    This feature request is for storage injection, whereby you can deploy a workspace's storage resources (e.g. DBFS) into your own Azure Storage account.

    Natural next step from VNet injection

    Storage injection is the natural next step from VNet injection, which Azure Databricks already supports. Just as VNet injection enables enterprise network customization by allowing you to deploy data plane network resources into your own Azure Virtual Network, storage injection would enable enterprise storage customization by allowing you to deploy data plane storage resources into your own Azure Storage Account.

    Large customers really want it

    Large customers--like…

    1 vote  ·  0 comments  ·  Strong Feedback