Azure Databricks

Azure Databricks is an Apache Spark-based analytics platform optimized for the Microsoft Azure cloud services platform. Designed with the founders of Apache Spark, Databricks is integrated with Azure to provide one-click setup, streamlined workflows, and an interactive workspace that enables collaboration between data scientists, data engineers, and business analysts.

We would love to hear any feedback you have for Azure Databricks.
For more details about Azure Databricks, see our documentation page.

  1. Connection to Azure Data Lake Analytics Tables/views

    We have business data objects (tables/views) created in ADLA and we want to call them from Databricks. Can we have a connection the same as for ADLS?

    2 votes
  2. Databricks should integrate with ServiceNow to analyze past product problems

    Databricks can be integrated with streaming data and product master data, giving both the master data and the performance data of a device. It should also integrate with ServiceNow so that it can query and auto-suggest problems: analyzing a device's telemetry to flag that there is a problem and suggesting solutions from past incidents.

    2 votes
  3. Provide support for mrsdeploy on Azure Databricks

    I wish to deploy an R model to Azure ML Server using mrsdeploy from Azure Databricks.

    2 votes
  4. Let a notebook attach and run against a terminated cluster while another cluster is running

    I connect my notebook to a cluster that is currently running and run some commands. Then I realize that I connected to the wrong cluster, so I attach the notebook to another cluster that is currently shut down and run the command again. I expect the shut-down cluster to auto-start and run my command, but instead it tells me that another cluster is currently running and the newly attached cluster is shut down. It asks if it should attach to the active cluster and run the command there. The two options for this question are "Cancel"…

    1 vote
  5. Multiple Cron schedules for one job

    Some simple schedules cannot be configured for a single job in Databricks due to the limitations of cron expressions, e.g. running a job every 40 minutes. Multiple schedules could provide such a frequency, but today that would require keeping duplicate copies of the same job with different schedules.
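    The 40-minute example repeats on a two-hour cycle, so it could hypothetically be covered by a pair of Quartz cron expressions (the format Databricks job schedules use) attached to the same job:

    ```
    # hypothetical pair of Quartz cron expressions covering "every 40 minutes"
    0 0,40 0/2 * * ?   # even hours at :00 and :40
    0 20 1/2 * * ?     # odd hours at :20
    ```

    With a single expression per job, achieving this today requires two otherwise identical jobs.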

    1 vote
  6. Moving multiple cells up/down together

    In Jupyter notebooks, you can select multiple cells and easily move them up or down together. This is currently not possible in Databricks.

    1 vote
  7. Enable Azure AD credential passthrough to ADLS Gen2 from PowerBI

    At present, the Power BI connector uses token authentication. It would be ideal if it used Azure AD authentication, and that identity was passed down to the underlying source (Data Lake Gen2).

    This is currently only available within the workspace using High Concurrency clusters, but we would like non-technical users to be able to use Power BI.

    1 vote
  8. Unable to install azure-eventhub PyPI package

    Hi,

    I tried to install the azure-eventhub PyPI package on a Databricks cluster but ended up with this error:

    Could not find a version that satisfies the requirement azure-eventhub (from versions: )
    No matching distribution found for azure-eventhub

    Does Databricks have restrictions on installing libraries from PyPI? I can see the package on PyPI, so why is Databricks unable to find it?

    1 vote
  9. Add diagnostic logging configuration to ARM template

    It is possible to add control plane diagnostic logging from Azure Databricks to Log Analytics, but this cannot be automated and must be done through the Portal. It would be great if this could be included in the ARM template or done via PowerShell.
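    The request is essentially for something like the following to work in a deployment template. This is a sketch of a `Microsoft.Insights/diagnosticSettings` extension resource scoped to the workspace; the parameter names and log categories are assumptions, not a confirmed template:

    ```json
    {
      "type": "Microsoft.Insights/diagnosticSettings",
      "apiVersion": "2021-05-01-preview",
      "scope": "[resourceId('Microsoft.Databricks/workspaces', parameters('workspaceName'))]",
      "name": "send-to-log-analytics",
      "properties": {
        "workspaceId": "[parameters('logAnalyticsWorkspaceId')]",
        "logs": [
          { "category": "clusters", "enabled": true },
          { "category": "jobs", "enabled": true }
        ]
      }
    }
    ```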

    1 vote
  10. Runtime versions

    Databricks needs to better test the compatibility between different runtime versions and the various packages.

    1 vote
  11. Support importing workspace libraries in the REST API

    Workspaces can contain notebooks, folders, and libraries. However, the REST API only supports importing notebooks and .dbc archives into a workspace. This feature request is to support importing libraries (e.g. .whl files) into a workspace.
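    For context, the existing Workspace Import call that this request asks to be extended to libraries looks roughly like the sketch below. The workspace URL, token, and target path are hypothetical placeholders:

    ```python
    import base64
    import json

    # Hypothetical workspace URL and personal access token (placeholders).
    host = "https://adb-1234567890123456.7.azuredatabricks.net"
    token = "dapiXXXXXXXX"

    notebook_source = b"print('hello from an imported notebook')"
    payload = {
        "path": "/Shared/imported_notebook",
        "format": "SOURCE",  # the Workspace API also accepts DBC archives
        "language": "PYTHON",
        "content": base64.b64encode(notebook_source).decode("ascii"),
        "overwrite": True,
    }

    # The actual request (not executed here) would be:
    # requests.post(f"{host}/api/2.0/workspace/import",
    #               headers={"Authorization": f"Bearer {token}"},
    #               json=payload)
    print(json.dumps(payload, indent=2))
    ```

    There is no equivalent `format` value for a .whl today, which is the gap this idea describes.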

    1 vote
  12. backquotes do not work as documented for SQL

    This documentation example, from: https://docs.microsoft.com/en-us/azure/databricks/getting-started/spark/dataframes#run-sql-queries

    select `State Code`, `2015 median sales price` from data_geo

    Fails on

    SELECT `CpuUtilization Average` FROM t09c0e51a

    as does this:

    asparquet = asparquet.withColumnRenamed("CpuUtilization Average", "CpuUtilizationAverage")
    as_parquet.createOrReplaceTempView('t09c0e51a')
    sqlContext.sql("SELECT CpuUtilizationAverage FROM t09c0e51a").take(1)

    org.apache.spark.sql.AnalysisException: Attribute name "CpuUtilization Average" contains invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.;
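    For reference, identifiers containing spaces must be wrapped in backquotes in Spark SQL, and the error above suggests the rename has to succeed before the column can be used at all. A sketch using the table name from the example above:

    ```sql
    -- Quote an identifier containing a space with backquotes:
    SELECT `CpuUtilization Average` FROM t09c0e51a;

    -- Or alias it to a name without invalid characters:
    SELECT `CpuUtilization Average` AS CpuUtilizationAverage FROM t09c0e51a;
    ```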

    1 vote
  13. Tags should be inherited from the deployment resource group

    We've been trying to use tags in our environment to identify resources for billing purposes. We've put a policy in place that forces new resources to inherit this tag from their resource group if one wasn't given explicitly. Most Azure resources are handled fine, but we've discovered that the tags don't get inherited by the resources Databricks creates in the managed resource group. Instead, we have to assign the tags individually to each Databricks cluster. I'd like new clusters to inherit the tags of either the managed resource group or the Databricks service itself.

    1 vote
  14. Job creation automation

    Is there any way to create Azure Databricks jobs in an automated way? Currently I have to create the jobs manually in every environment.
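    Jobs can be created through the Jobs REST API rather than the UI, which allows scripting the same job across environments. A sketch of such a call; the workspace URL, token, notebook path, and cluster sizing below are hypothetical:

    ```python
    import json

    # Hypothetical workspace URL and personal access token (placeholders).
    host = "https://adb-1234567890123456.7.azuredatabricks.net"
    token = "dapiXXXXXXXX"

    job_spec = {
        "name": "nightly-etl",
        "new_cluster": {
            "spark_version": "7.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        "notebook_task": {"notebook_path": "/Shared/etl"},
        "schedule": {
            "quartz_cron_expression": "0 0 2 * * ?",  # every day at 02:00
            "timezone_id": "UTC",
        },
    }

    # The actual request (not executed here) would be:
    # requests.post(f"{host}/api/2.0/jobs/create",
    #               headers={"Authorization": f"Bearer {token}"},
    #               json=job_spec)
    print(json.dumps(job_spec, indent=2))
    ```

    Running the same script against each environment's workspace URL avoids the manual re-creation described above.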

    1 vote
  15. Can we access Event hub in Databricks using service principal

    The Event Hubs documentation shows that we can access it using a service principal, and I also see that we can access Event Hubs with a service principal using the Python libraries.

    Can we access Event Hubs in Databricks using a service principal instead of SAS?

    1 vote
  16. Security level when connecting Databricks to other Azure services

    Hello,

    The [following page](https://docs.microsoft.com/fr-fr/azure/databricks/administration-guide/cloud-configurations/azure/vnet-inject) states:

    "VNet injection, enabling you to:
    Connect Azure Databricks to other Azure services (such as Azure Storage) in a more secure manner using service endpoints."

    Data Factory is now a 'Trusted Service' in the Azure Storage and Azure Key Vault firewalls, so we can connect to those services as a 'Trusted Service' using the Data Factory managed identity and the firewall setting 'Allow trusted Microsoft services…'.

    Could you please explain why using "service endpoints is more secure" than using a 'Trusted Service'?

    Reference : [Data Factory is now a 'Trusted Service' in Azure Storage and Azure Key…

    1 vote
  17. Disabling 'import library' options

    I want to disable all possible ways of installing libraries, covering init scripts, the UI, the REST API, and conda.

    1 vote
  18. Personal Access Token Security Improvements

    Most Databricks users end up needing to generate a Personal Access Token, which I am guessing is why Microsoft started defaulting that setting to ON.

    The problem is, from an Access Control perspective these tokens present a massive risk to any organization because there are no controls around them.

    These tokens allow direct access to everything the user has access to and all it takes to cause a major data breach is for one user to accidentally post one of these tokens on a public forum or GitHub.

    Here are a few specific issues:
    1. Even though conditional…

    1 vote
  19. Azure databricks cluster creation via Terraform

    I need help deploying an Azure Databricks cluster using a Terraform template.
    For some resources, like "azurerm_databricks_workspace", I am able to launch a Databricks workspace, but I am having trouble with Databricks cluster creation and cannot find any module/argument for it.
    Please help me fix this.
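    Cluster creation is not part of the azurerm provider; it lives in the separate Databricks Terraform provider. A sketch, assuming that provider is available, with hypothetical names and sizes:

    ```hcl
    resource "azurerm_databricks_workspace" "this" {
      name                = "example-workspace"
      resource_group_name = "example-rg"
      location            = "westeurope"
      sku                 = "standard"
    }

    provider "databricks" {
      azure_workspace_resource_id = azurerm_databricks_workspace.this.id
    }

    resource "databricks_cluster" "this" {
      cluster_name            = "example-cluster"
      spark_version           = "7.3.x-scala2.12"
      node_type_id            = "Standard_DS3_v2"
      autotermination_minutes = 30
      num_workers             = 2
    }
    ```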

    1 vote
  20. Python example for direct access using the storage account access key

    The docs at https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-datalake-gen2 show an example of using a storage account access key in Scala. Is there an example in Python?
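    The Scala pattern translates almost verbatim, since the Spark configuration key is the same in either language. A sketch with hypothetical account, container, and secret names:

    ```python
    # Accessing ADLS Gen2 with a storage account access key from a Databricks
    # notebook. The account and container names below are hypothetical.
    storage_account = "mystorageaccount"
    container = "mycontainer"
    conf_key = f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

    # In a notebook, `spark` and `dbutils` are predefined, so you would run:
    # spark.conf.set(conf_key, dbutils.secrets.get(scope="my-scope", key="storage-key"))
    # df = spark.read.csv(f"abfss://{container}@{storage_account}.dfs.core.windows.net/data.csv")
    print(conf_key)
    ```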

    1 vote