
Databricks Unity - Deployment - 1.9

Before deploying the generated artifacts, please make sure the preceding preparation steps are complete.

You should now have the Notebook files inside Databricks, ready for deployment.

Deploy the generated Artifacts

This section explains how to deploy the generated artifacts to the Databricks Unity target environment.

To deploy the generated Artifacts:

  • Open Databricks with the URL provided in the Azure Databricks Service resource:
  • Databricks opens:
  • Click on the Workspace menu on the left-hand side:
  • Expand the folders Workspace > Users > Your_user > artifacts:
  • Open the 500_Deploy_and_Load_Databricks.ipynb file, which contains four steps:
    1. %run ./LoadControl/XXX_Deployment_LoadControl: Deploy the load control structure
    2. %run ./Jupyter/XXX_Deployment: Deploy the code
    3. %run ./LoadControl/XXXX_MultiThreadingLoadExecution.ipynb: Load the data with multiple threads
    4. %run ./XXX_SelectResults.ipynb: Display the results
  • Execute step 1, then step 2 (see the sketch below).
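For orientation, the four cells of 500_Deploy_and_Load_Databricks.ipynb are laid out roughly as sketched below; each %run command must be the only command in its cell, and the XXX prefixes stand for the generated file names of your project:

    # Cell 1: deploy the load control structure
    %run ./LoadControl/XXX_Deployment_LoadControl

    # Cell 2: deploy the code
    %run ./Jupyter/XXX_Deployment

    # Cell 3: load the data with multiple threads (run after deployment)
    %run ./LoadControl/XXXX_MultiThreadingLoadExecution.ipynb

    # Cell 4: display the results (run after the load)
    %run ./XXX_SelectResults.ipynb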


  • Select the Personal Compute created during the target environment setup:

  • Click on the Attach and run button:

  • Once the deployment is done, verify the result (an optional notebook check is also sketched after this list):
    • Open Azure Storage Explorer
    • You should have a new folder docu_stage created in your Target folder inside the Target Storage Account:
    • Inside this folder, a Unity storage path and one folder per Target Model Object were created:
    • Each Target Model Object folder contains a _delta_log folder with the Delta transaction files needed to load the data later:
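If you prefer to verify from within Databricks instead of Azure Storage Explorer, a quick check can be run from a notebook cell with dbutils.fs.ls. This is only a sketch: the abfss URI below is a placeholder, so substitute your own container, storage account, and Target folder names.

    # Placeholder path: replace <container>, <storage_account>, and <target_folder>
    # with the values of your Target Storage Account.
    stage_path = "abfss://<container>@<storage_account>.dfs.core.windows.net/<target_folder>/docu_stage"

    # One folder per Target Model Object should be listed under docu_stage.
    for obj in dbutils.fs.ls(stage_path):
        print(obj.name)

    # Each Target Model Object folder should contain a _delta_log folder
    # with the Delta transaction files needed to load the data later.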

You can now load the data with the following load controls: