  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · First, install the Databricks Python SDK and configure authentication per the docs here: pip install databricks-sdk. Then you can use the approach below to print out secret …
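    A minimal sketch of the redaction workaround this kind of answer usually describes: Databricks replaces any cell output that exactly matches a secret value with [REDACTED], so inserting separators between the characters reveals it. The `reveal` helper is hypothetical; on a real cluster the input would come from dbutils.secrets.get(scope, key).

```python
def reveal(secret: str) -> str:
    # Insert a space between characters so the printed text no longer
    # matches the exact secret string the redactor looks for.
    return " ".join(secret)

# On a cluster: print(reveal(dbutils.secrets.get("my-scope", "my-key")))
print(reveal("hunter2"))  # h u n t e r 2
```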

  2. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · The decision to use a managed table or an external table depends on your use case and also on the existing setup of your delta lake, framework code, and workflows. Your …

  3. Converting SQL stored procedure into a Databricks Notebook: …

    Dec 5, 2023 · I'm trying to convert a SQL stored procedure into a Databricks notebook. One stored procedure has multiple IF statements combined with BEGIN/END statements. Based …
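    A sketch of the usual translation pattern, assuming a hypothetical procedure that branches on a load type: the IF/BEGIN/END control flow moves into Python, and each SQL statement becomes a spark.sql(...) call. `run_sql` stands in for spark.sql so the sketch is self-contained; table names are placeholders.

```python
def load(load_type: str, run_sql) -> None:
    if load_type == "full":  # was: IF @load_type = 'full' BEGIN ... END
        run_sql("TRUNCATE TABLE t")
        run_sql("INSERT INTO t SELECT * FROM staging")
    else:                    # was: ELSE BEGIN ... END
        run_sql("MERGE INTO t USING staging ON t.id = staging.id "
                "WHEN MATCHED THEN UPDATE SET *")

# Record the statements instead of executing them, to show the flow:
executed = []
load("full", executed.append)
```

In a notebook you would pass `spark.sql` as `run_sql` (or call it directly).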

  4. Create temp table in Azure Databricks and insert lots of rows

    Nov 28, 2022 · Asked 2 years, 11 months ago · Modified 10 months ago · Viewed 25k times
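    Databricks has no T-SQL-style session #temp tables; a common substitute is a temporary view, and for many rows a single multi-row VALUES statement is far faster than row-by-row inserts. This sketch only builds the SQL string (column names and view name are assumptions); on a real cluster it would be passed to spark.sql(...).

```python
def temp_view_sql(name: str, rows: list) -> str:
    # Render all rows into one VALUES clause instead of many INSERTs.
    values = ",\n  ".join(f"({r[0]}, '{r[1]}')" for r in rows)
    return (f"CREATE OR REPLACE TEMPORARY VIEW {name} AS\n"
            f"SELECT * FROM VALUES\n  {values}\n  AS t(id, label)")

sql = temp_view_sql("staging_tmp", [(1, "a"), (2, "b")])
# On a cluster: spark.sql(sql)
```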

  5. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
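    The commonly cited Python equivalent of that Scala call chains through dbutils.notebook.entry_point. Since dbutils only exists on a live cluster, the sketch below takes it as a parameter and demonstrates the shape of the chain with a stub; the path value is made up.

```python
from types import SimpleNamespace

def notebook_path(dbutils) -> str:
    # Python form of the Scala dbutils.notebook.getContext.notebookPath;
    # works only inside a Databricks notebook.
    return (dbutils.notebook.entry_point.getDbutils()
            .notebook().getContext().notebookPath().get())

# Stub mirroring the call chain, purely to illustrate it off-cluster:
_ctx = SimpleNamespace(notebookPath=lambda: SimpleNamespace(get=lambda: "/Users/me/etl"))
_nb = SimpleNamespace(getContext=lambda: _ctx)
_inner = SimpleNamespace(notebook=lambda: _nb)
stub_dbutils = SimpleNamespace(
    notebook=SimpleNamespace(entry_point=SimpleNamespace(getDbutils=lambda: _inner)))
```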

  6. List databricks secret scope and find referred keyvault in azure ...

    Jun 23, 2022 · How can we find the existing secret scopes in a Databricks workspace, and which Key Vault is referenced by a specific secret scope in Azure Databricks?
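    A sketch assuming the Secrets API list-scopes response shape: each scope reports a backend_type, and Key-Vault-backed scopes carry keyvault_metadata with the vault's resource_id and dns_name. The parsing below runs on a sample dict; on a workspace you would GET /api/2.0/secrets/scopes/list (or use the SDK) to obtain it.

```python
def keyvault_scopes(response: dict) -> dict:
    # Map each Key-Vault-backed scope name to its vault DNS name.
    return {
        s["name"]: s["keyvault_metadata"]["dns_name"]
        for s in response.get("scopes", [])
        if s.get("backend_type") == "AZURE_KEYVAULT"
    }

# Sample response (illustrative values only):
sample = {"scopes": [
    {"name": "kv-scope", "backend_type": "AZURE_KEYVAULT",
     "keyvault_metadata": {"resource_id": "/subscriptions/.../vaults/myvault",
                           "dns_name": "https://myvault.vault.azure.net/"}},
    {"name": "db-scope", "backend_type": "DATABRICKS"},
]}
```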

  7. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done …
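    Databricks does expose a REST path for this, the SQL Statement Execution API, which runs a statement against a SQL warehouse; whether it suits a low-latency operational workload is the judgment call in the question. A sketch of the request payload, with the warehouse id as a placeholder:

```python
def statement_payload(warehouse_id: str, sql: str) -> dict:
    # Body for POST /api/2.0/sql/statements on a Databricks workspace.
    return {
        "warehouse_id": warehouse_id,
        "statement": sql,
        "wait_timeout": "30s",  # wait synchronously up to 30 seconds
    }

payload = statement_payload("abc123", "SELECT * FROM gold.orders LIMIT 10")
```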

  8. how to get databricks job id at the run time - Stack Overflow

    Jun 9, 2025 · I am trying to get the job id and run id of a Databricks job dynamically and keep them in a table with the code below: run_id = self.spark.conf.get("spark.databricks.job.runId", "no_ru...
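    The pattern in the question reads the ids from the Spark conf with a default, so interactive runs (where the keys are absent) don't raise. The sketch takes a plain dict standing in for spark.conf; on a cluster you would call spark.conf.get(key, default) with the same keys.

```python
def job_identity(conf: dict) -> dict:
    # Spark conf keys set by the Databricks jobs runtime; the defaults
    # cover notebooks run interactively outside a job.
    return {
        "job_id": conf.get("spark.databricks.job.id", "no_job_id"),
        "run_id": conf.get("spark.databricks.job.runId", "no_run_id"),
    }

ids = job_identity({"spark.databricks.job.runId": "12345"})
```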

  9. Need for volumes in Databricks - Stack Overflow

    Sep 24, 2024 · Why do we need Volumes when we can access the location using external locations? The doc says that it is to add governance, but we can already govern using external …

  10. How to to trigger a Databricks job from another Databricks job?

    Aug 14, 2023 · Databricks is now rolling out new functionality, called "Job as a Task", that allows triggering another job as a task in a workflow. The documentation isn't updated yet, but you …
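    Before "Job as a Task", the usual fallback was calling the Jobs API run-now endpoint from the parent job. A sketch that only builds the request (host, token, and job id are placeholders), following the Jobs API 2.1 path and body:

```python
import json
import urllib.request

def run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    # POST /api/2.1/jobs/run-now triggers an existing job by id.
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = run_now_request("https://adb-123.azuredatabricks.net", "TOKEN", 42)
# On a workspace: urllib.request.urlopen(req)
```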