Before deploying the Generated Artifacts, please:
- Generate the Artifacts
- Download the generated Artifacts
- Replace the placeholders in the Generated Artifacts (a scripted approach is sketched below)
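If you prefer to script the placeholder replacement, here is a minimal sketch. It assumes the downloaded notebooks sit in a local artifacts folder and that the placeholder tokens look like <CATALOG_NAME>; both the folder name and the token names are illustrative, so use the placeholders your generated artifacts actually contain.

```python
# Minimal sketch (assumptions: local ./artifacts folder, illustrative
# placeholder tokens). Adjust to the placeholders your artifacts contain.
from pathlib import Path

# Placeholder -> value mapping; the names and values here are examples only.
replacements = {
    "<CATALOG_NAME>": "my_unity_catalog",
    "<SCHEMA_NAME>": "docu_stage",
    "<STORAGE_ACCOUNT_URL>": "abfss://target@mystorageaccount.dfs.core.windows.net",
}

artifacts_dir = Path("./artifacts")  # folder with the downloaded notebooks

for notebook in artifacts_dir.glob("*.ipynb"):
    text = notebook.read_text(encoding="utf-8")
    for placeholder, value in replacements.items():
        text = text.replace(placeholder, value)
    notebook.write_text(text, encoding="utf-8")
    print(f"Replaced placeholders in {notebook.name}")
```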
You should now have the Notebook files inside Databricks ready for deployment.
Deploy the generated Artifacts
We will explain how to deploy the generated artifacts using Databricks Unity as the target environment.
To deploy the generated Artifacts:
- Open Databricks with the URL provided in the Azure Databricks Service resource:
- Databricks opens:
- Click on the Workspace menu on the left-hand side:
- Expand the folders Workspace > Users > Your_user > artifacts:
- Open the 500_Deploy_and_Load_DataVault_Databricks.ipynb file, which contains four steps:
- %run ./XXX_Deployment.ipynb: Deploy the code
- %run XXXX_SimpleLoadExecution.ipynb: Load the data in a single thread
- %run XXXX_MultiThreadingLoadExecution.ipynb: Load the data with multiple threads (this pattern is illustrated below)
- %run ./XXX_SelectResults.ipynb: Display the results
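The content of these notebooks is generated for your model, but as background, multi-threaded loads in Databricks are commonly orchestrated by running one loader notebook per object in parallel with dbutils.notebook.run. The sketch below only illustrates that pattern; the notebook paths and the degree of parallelism are assumptions, and the generated XXXX_MultiThreadingLoadExecution notebook may be implemented differently.

```python
# Illustrative pattern only (not the generated notebook's actual code):
# run one loader notebook per Target Model Object in parallel.
# Runs inside a Databricks notebook cell, where dbutils is predefined.
from concurrent.futures import ThreadPoolExecutor

# Example loader notebook paths; these names are assumptions.
load_notebooks = [
    "./loads/load_hub_customer",
    "./loads/load_hub_product",
    "./loads/load_sat_customer_details",
]

def run_load(path):
    # Execute the loader notebook on the attached cluster; 0 = no timeout.
    return dbutils.notebook.run(path, 0)

# Load several objects concurrently instead of one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_load, load_notebooks))

print(results)
```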
- Execute step 1
- Select the Personal Compute cluster created during the target environment setup:
- Click on the Attach and run button:
- The deployment is complete:
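Optionally, you can also confirm from a notebook cell that the deployment created the expected objects in Unity Catalog. This is a sketch only: the catalog and schema names are assumptions, so use the values you entered when replacing the placeholders.

```python
# Optional verification in a Databricks notebook cell (spark is predefined).
# The catalog and schema names below are assumptions; use your own values.
catalog = "my_unity_catalog"
schema = "docu_stage"

# List the tables created by the deployment step in Unity Catalog.
tables = spark.sql(f"SHOW TABLES IN {catalog}.{schema}")
display(tables)
```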
- Open Azure Storage Explorer
- You should have a new folder docu_stage created in your Target folder inside the Target Storage Account:
- Inside this folder, a Unity storage path and one folder per Target Model Object were created:
- Each Target Model Object folder contains a delta_log folder with the files needed to load the data later:
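If you prefer to verify from a notebook instead of Azure Storage Explorer, you can list the created folders with dbutils.fs.ls, provided the cluster can access the storage account. The abfss path and the object folder name below are assumptions; substitute your container, storage account, and Target folder.

```python
# Optional check from a Databricks notebook (dbutils is predefined).
# The path below is an assumption; replace container, storage account
# and folder names with your own values.
stage_path = "abfss://target@mystorageaccount.dfs.core.windows.net/docu_stage"

# One entry per Target Model Object folder (plus the Unity storage path).
for entry in dbutils.fs.ls(stage_path):
    print(entry.name)

# Look inside one object folder (name assumed) to see its Delta log folder.
for entry in dbutils.fs.ls(f"{stage_path}/hub_customer"):
    print(entry.name)
```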
You can now load the data with the following load controls: