Update: Microsoft will be moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year and will move to first-party solutions for customer feedback.

Azure Databricks

Azure Databricks is an Apache Spark-based analytics platform optimized for the Microsoft Azure cloud services platform. Designed with the founders of Apache Spark, Databricks is integrated with Azure to provide one-click setup, streamlined workflows, and an interactive workspace that enables collaboration between data scientists, data engineers, and business analysts.

We would love to hear any feedback you have for Azure Databricks.
For more details about Azure Databricks, see our documentation page.

  1. Query Acceleration should support compressed data

    Query Acceleration currently supports CSV and JSON files, but not compressed versions of those files. It needs to support compressed data, or it won't be usable for big data stores.
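
    For context, a rough sketch of what Query Acceleration (quick query) looks like today on an uncompressed CSV blob via the azure-storage-blob Python SDK; pointing the same call at a gzip-compressed blob is what this request asks to support. The connection string, container, and blob name are placeholders.

      from azure.storage.blob import BlobClient, DelimitedTextDialect

      # Placeholders: connection string, container, and blob name.
      blob = BlobClient.from_connection_string(
          "<connection-string>", container_name="data", blob_name="events.csv"
      )

      # Works for a plain CSV today; the request is for the same call to accept
      # e.g. events.csv.gz.
      reader = blob.query_blob(
          "SELECT _1, _2 FROM BlobStorage",
          blob_format=DelimitedTextDialect(has_header=False),
      )
      print(reader.readall())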

    1 vote

    0 comments  ·  Strong Feedback
  2. Notebook isolation for R packages

    Having a separate environment per notebook, as is done in Python, would help: the notebook could load all the libraries installed on the cluster by default, but if users install or update libraries, the change would stay within the notebook's environment. Unfortunately, this is not the case for R notebooks. Installing a package can introduce breaking changes, so imagine your code is not compatible with version 2.0 of library A, but a colleague forces the installation of that version from their notebook; your notebook then no longer runs, because they installed version 2.0, not…

    1 vote

    0 comments  ·  Strong Feedback
  3. Track the user who changes a folder name

    It seems that Databricks cannot track the user who changes a folder name.

    We need a feature in the diagnostic/activity logs to capture this information.

    1 vote

    0 comments  ·  Strong Feedback
  4. Automated configuration of Table Access Control

    Reviewing all the documentation, including the Terraform provider documentation, I cannot see any way to automate the configuration of Table Access Control.

    Everything within Databricks really should be configurable via the API and CLI, and therefore via Terraform.

    It's frustrating to have to go into the Admin Console in the UI to change certain values.
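
    For illustration only, a hedged sketch of what such automation could look like against the existing workspace-conf REST endpoint; the setting key enableTableAccessControl is hypothetical (as far as we can tell, this toggle is not exposed today), and HOST and TOKEN are placeholders.

      import requests

      HOST = "https://<workspace>.azuredatabricks.net"
      TOKEN = "<personal-access-token>"

      resp = requests.patch(
          f"{HOST}/api/2.0/workspace-conf",
          headers={"Authorization": f"Bearer {TOKEN}"},
          json={"enableTableAccessControl": "true"},  # hypothetical key
      )
      resp.raise_for_status()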

    1 vote

    0 comments  ·  Strong Feedback
  5. Enable RStudio to use Credential Passthrough for ADLS

    RStudio on Azure Databricks is not currently compatible with clusters that have credential passthrough enabled; instead, authentication must be managed using either access keys or a service principal.

    Enabling credential passthrough would allow users to use the gold standard in ADLS authentication.
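
    For context, a minimal sketch of the service-principal workaround the post refers to, set from a Python cell so that sessions on the cluster can read ADLS Gen2 without passthrough. It assumes a Databricks notebook where spark and dbutils are predefined; the storage account, tenant, and secret-scope names are placeholders.

      # Placeholders throughout; secrets come from a Databricks secret scope.
      storage = "<storage-account>"
      tenant = "<tenant-id>"
      client_id = "<service-principal-app-id>"
      client_secret = dbutils.secrets.get("<scope>", "<key>")

      spark.conf.set(f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net", "OAuth")
      spark.conf.set(
          f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net",
          "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
      )
      spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net", client_id)
      spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}.dfs.core.windows.net", client_secret)
      spark.conf.set(
          f"fs.azure.account.oauth2.client.endpoint.{storage}.dfs.core.windows.net",
          f"https://login.microsoftonline.com/{tenant}/oauth2/token",
      )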

    1 vote

    0 comments  ·  Strong Feedback
  7. Kindly give table schema download option

    In Data -> Tables, please provide a table schema download option, so that we can export the columns and their data types as an XLS or CSV file.
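
    In the meantime, a rough notebook workaround along these lines is possible (the table name and output path are placeholders; it assumes a Databricks notebook where spark is predefined and /dbfs is the DBFS FUSE mount):

      import pandas as pd

      # Placeholder table name and output path.
      table_name = "default.my_table"
      schema_df = pd.DataFrame(spark.table(table_name).dtypes,
                               columns=["column", "data_type"])
      schema_df.to_csv("/dbfs/tmp/my_table_schema.csv", index=False)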

    1 vote

    0 comments  ·  Strong Feedback
  8. Provide an API to report how long an active cluster has been running

    We need an API that reports how long an active cluster has been running.
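
    A hedged approximation using the existing Clusters API: the sketch below assumes the start_time field reflects when the cluster last started, which only holds if it has not been restarted since; HOST, TOKEN, and the cluster ID are placeholders.

      import time
      import requests

      HOST = "https://<workspace>.azuredatabricks.net"
      TOKEN = "<personal-access-token>"

      info = requests.get(
          f"{HOST}/api/2.0/clusters/get",
          headers={"Authorization": f"Bearer {TOKEN}"},
          params={"cluster_id": "<cluster-id>"},
      ).json()

      # start_time is in epoch milliseconds; this only approximates "time running"
      # for the reasons noted above.
      uptime_hours = (time.time() * 1000 - info["start_time"]) / 3_600_000
      print(f"Approximate uptime: {uptime_hours:.1f} h")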

    1 vote

    0 comments  ·  Strong Feedback
  9. Tooltip not reflecting correctly

    While creating a Python job in Databricks, the information in the box next to Type, and the corresponding tooltip, says to choose either dbfs:/ or S3:/. Although it is understandable, at first glance it is also misleading.

    1 vote

    0 comments  ·  Strong Feedback
  12. Instance pools should have a hybrid mode between on-demand and spot

    Currently, instance pools can be set to either all on-demand or all spot instances.

    I would like my job clusters and interactive clusters to be able to use on-demand instances for the driver node and a set number of workers, with the remaining workers as spot instances.

    I cannot currently set my driver node to be on-demand and my worker nodes to be spot when acquiring them from a pool.

    If I use an interactive cluster, I can (must?) set my driver node to be on-demand and my worker nodes to be spot.
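
    For reference, a hedged sketch of the non-pool behavior described above, using the azure_attributes block of the Clusters API (field values are illustrative, not a recommendation): with first_on_demand set to 1, the driver runs on an on-demand VM and the workers fall back to spot. The request is for an equivalent hybrid setting on instance pools, where availability is currently chosen for the whole pool.

      # Illustrative cluster spec (not a pool definition).
      cluster_spec = {
          "cluster_name": "hybrid-example",
          "spark_version": "7.3.x-scala2.12",
          "node_type_id": "Standard_DS3_v2",
          "num_workers": 8,
          "azure_attributes": {
              "first_on_demand": 1,                        # driver on-demand
              "availability": "SPOT_WITH_FALLBACK_AZURE",  # workers on spot
              "spot_bid_max_price": -1,                    # pay up to on-demand price
          },
      }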

    4 votes

    0 comments  ·  Strong Feedback
  13. Enable ADLS passthrough and table access control together.

    Currently, on a high-concurrency cluster with ADLS passthrough enabled, table access control cannot be enabled at the same time. If databases/tables are created on top of storage containers, and the files in those containers have ACLs/RBAC applied, this works fine: users cannot access the underlying data if the RBAC/ACLs on the data files do not allow it. If a user runs a SELECT query on a table whose underlying data is in a container the user does not have access to, the query returns an error, as expected.

    Users can still view the tables and drop tables and databases not created…

    3 votes

    0 comments  ·  Strong Feedback
  15. Include Cluster Policies Permissions to force users to use a policy

    Currently we can only grant or withhold a user's permission to use a policy, but it would be useful to allow admins to force users who create interactive clusters to use a specific policy, for security or other reasons.
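
    As a rough approximation today (a sketch, not a full substitute for the requested feature): remove users' unrestricted cluster-creation entitlement and grant CAN_USE on only one policy via the Permissions API, so that any cluster they create has to conform to that policy. HOST, TOKEN, the policy ID, and the group name are placeholders.

      import requests

      HOST = "https://<workspace>.azuredatabricks.net"
      TOKEN = "<personal-access-token>"
      policy_id = "<cluster-policy-id>"

      resp = requests.put(
          f"{HOST}/api/2.0/permissions/cluster-policies/{policy_id}",
          headers={"Authorization": f"Bearer {TOKEN}"},
          json={"access_control_list": [
              {"group_name": "data-engineers", "permission_level": "CAN_USE"}
          ]},
      )
      resp.raise_for_status()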

    1 vote

    0 comments  ·  Strong Feedback
  16. mlflow package on R for Machine Learning Runtimes

    mlflow should be part of the R packages in the ML runtimes, but it's not. And perhaps mlflow_install() should already have been run/prepared.

    1 vote

    0 comments  ·  Strong Feedback
  17. Suggestion with a brief explanation

    I was wondering if there is any way to see the required syntax and/or documentation of a method. For example, after typing count( I would like to be able to press, say, Shift+Tab and see a popup window that explains the count() function and its parameters, if there are any, similar to what exists in other IDEs.
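
    As an interim workaround in a Python notebook cell, the standard help() built-in prints a method's docstring and signature inline (the DataFrame here is only an example):

      # Prints the documentation for DataFrame.count in the cell output,
      # rather than in a popup.
      df = spark.range(10)
      help(df.count)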

    1 vote

    0 comments  ·  Strong Feedback
  18. Integration with Azure virtual machines

    This is a scenario we faced with a client. The client has massive data (numerous files) on a virtual machine hosted in Azure. To process these files (to classify whether they contain PII), we decided to use Azure Databricks to run the code. We set up a file share on the Windows VM and tried to mount it in Databricks from the terminal. We then tried virtual network peering to connect the two networks (the Windows VM network and the Databricks network). This failed: we were unable to ping Databricks from the Windows VM; the pings reached only as far as the default gateway of Databricks. We thought…

    1 vote

    0 comments  ·  Strong Feedback
  19. Metadata catalog at schema, table, and column level

    We need to capture "business metadata" at the column level:
    * Description
    * Internal product family
    * etc.

    The attributes (Description, Internal product family, etc.) should be programmable, i.e. these should be fields that can be defined by the end user.

    This is particularly important for AT&T, which has > 90M columns of data across the company; we would like to document the data in place rather than in spreadsheets afterwards.

    The only way to do this today is to define our own convention for packing column attributes into the comment field, which is clumsy and not directly queryable.
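
    For illustration, the comment-field workaround described above might look like the following sketch (table, column, and attribute names are made up; it assumes a Spark 3 / Databricks runtime where ALTER COLUMN ... COMMENT is available). The JSON payload is opaque to the catalog, which is exactly the limitation being raised.

      import json

      # Pack ad-hoc business metadata into the column comment as a JSON string.
      meta = json.dumps({"description": "Customer account ID",
                         "internal_product_family": "Billing"})
      spark.sql(f"ALTER TABLE demo.accounts ALTER COLUMN account_id COMMENT '{meta}'")

      # Reading it back means scraping DESCRIBE output -- not directly queryable.
      comments = spark.sql("DESCRIBE TABLE demo.accounts").toPandas()
      print(comments[comments["col_name"] == "account_id"])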

    1 vote

    0 comments  ·  Strong Feedback
  20. Support Databricks Git Integration with Azure DevOps linked to a different AD

    According to the official docs, in order to enable Git in Databricks:

    The Azure DevOps Services organization must be linked to the same Azure AD tenant as Databricks.

    This is extremely limiting as Databricks workspaces are often deployed under client AD tenants.

    5 votes

    0 comments  ·  Strong Feedback