Azure Synapse - Stage Files - Target environment

This article proposes a possible target environment on Azure Synapse for the Microsoft Fabric Stage Files generator.

Installation and configuration of the target environment are not part of biGENIUS support.

Unfortunately, we cannot provide any help beyond the example in this article.

Many other configurations and installations are possible for a Spark target environment.

The Target Platform property should be set to the Azure Synapse value:

Setup environment

The Azure Synapse target environment requires at least the following Azure resources in your Azure Subscription:

  • A Resource Group: some help is available here
  • Inside the Resource Group:
    • 2 Storage Accounts: some help is available here
      • 1 for the Source Data
      • 1 for the Target Data Lake
    • An Apache Spark Pool: some help is available here
    • A Synapse Workspace: some help is available here

The example target environment in this article uses the following resources:

  • A Resource Group named bg-synapse
  • A Storage Account for our Source Data named bgsynapselandingzone1
  • A Storage Account for our Target Data Lake named bgsynapsedatalake1
  • An Apache Spark Pool named bgaasspark33v2
  • A Synapse Workspace named bgsynapseworkspace1
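The example resources above can also be provisioned with the Azure CLI instead of the Azure Portal. The sketch below assumes `az login` has already been done; the region, storage SKU, SQL admin credentials, and Spark pool sizing are illustrative assumptions, not requirements of the generator.

```shell
# Sketch: provision the example resources with the Azure CLI.
# Region, SKU, credentials, and pool sizing are assumptions -- adjust them.
LOCATION=westeurope
RG=bg-synapse

az group create --name "$RG" --location "$LOCATION"

# Source Storage Account (landing zone for the Parquet source files)
az storage account create --name bgsynapselandingzone1 \
  --resource-group "$RG" --location "$LOCATION" \
  --sku Standard_LRS --kind StorageV2

# Target Storage Account (Data Lake -- hierarchical namespace enabled)
az storage account create --name bgsynapsedatalake1 \
  --resource-group "$RG" --location "$LOCATION" \
  --sku Standard_LRS --kind StorageV2 --hns true

# Synapse Workspace, bound to the target Data Lake account
az synapse workspace create --name bgsynapseworkspace1 \
  --resource-group "$RG" --location "$LOCATION" \
  --storage-account bgsynapsedatalake1 \
  --file-system docu-datalake \
  --sql-admin-login-user sqladminuser \
  --sql-admin-login-password '<choose-a-strong-password>'

# Apache Spark Pool inside the workspace
az synapse spark pool create --name bgaasspark33v2 \
  --workspace-name bgsynapseworkspace1 --resource-group "$RG" \
  --spark-version 3.3 --node-count 3 --node-size Small
```

These commands require Azure authentication, so treat them as a starting point and verify each option against your subscription's policies.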

Tools to install

Please install the following tools:

  • Azure Storage Explorer: available here

Source Storage Account

Please upload the source Parquet files to the Source Storage Account:

  • Open Azure Storage Explorer
  • Connect to your Subscription
  • Open the Source Storage Account
  • Create one folder per Parquet source file

The folder name should be identical to the Parquet file name but in uppercase.

For this example, we have six Parquet source files, so we need six folders:
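The folder-naming rule above (folder name equals file name, uppercased, without the extension) can be checked with a small shell snippet. `CreditCard.parquet` is from this example; the second file name is illustrative only.

```shell
# Derive the uppercase folder name from each Parquet source file name.
# CreditCard.parquet is from the example; Currency.parquet is illustrative.
for f in CreditCard.parquet Currency.parquet; do
  base="${f%.parquet}"                                   # drop the extension
  folder=$(printf '%s' "$base" | tr '[:lower:]' '[:upper:]')  # uppercase it
  echo "$f -> $folder"
done
# CreditCard.parquet -> CREDITCARD
# Currency.parquet -> CURRENCY
```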

Upload the corresponding Parquet source file into each folder, for example, for the CreditCard folder:
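If you prefer scripting over Azure Storage Explorer, the same upload can be sketched with the Azure CLI. The container name `landing` is an assumption for illustration; adjust it to your Source Storage Account layout.

```shell
# Sketch: upload each Parquet file into its uppercase folder with the
# Azure CLI instead of Storage Explorer. The 'landing' container name
# is an assumption; 'az login' must have been done beforehand.
ACCOUNT=bgsynapselandingzone1
CONTAINER=landing

for f in *.parquet; do
  base="${f%.parquet}"
  folder=$(printf '%s' "$base" | tr '[:lower:]' '[:upper:]')
  az storage blob upload --auth-mode login \
    --account-name "$ACCOUNT" --container-name "$CONTAINER" \
    --file "$f" --name "$folder/$f"
done
```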

Target Storage Account

We have chosen to create a folder named docu-datalake in our Target Storage Account:

  • Open Azure Storage Explorer
  • Connect to your Subscription
  • Open the Target Storage Account
  • Create the folder (docu-datalake in this example)

For this example, we have one Target folder for our Data Lake:
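Creating the Data Lake folder can also be scripted. The sketch below assumes docu-datalake is created as a top-level ADLS Gen2 file system (container) on the target account; if it lives inside an existing container in your setup, create a directory there instead.

```shell
# Sketch: create the Data Lake file system (container) on the target
# Storage Account with the Azure CLI. Assumes 'az login' has been done
# and that docu-datalake is a top-level container (an assumption).
az storage fs create --name docu-datalake \
  --account-name bgsynapsedatalake1 --auth-mode login
```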

Upload Artifacts in Azure Synapse

Please now upload the generated Artifacts from the biGENIUS-X application to the Azure Synapse Workspace.

Please replace the placeholders in the artifacts before uploading them.

  • Click on the Synapse Workspace in Azure:
  • Then click on the Workspace web URL:
  • Azure Synapse Analytics is opened:
  • Click on the Develop menu on the left-hand side:
  • If you are in the live mode of Synapse, change it to the Azure DevOps Git mode:
  • Create a new branch named demo_synapse:

  • We have chosen to create a folder named artifacts:

In the file 500_Deploy_and_Load_DataVault_Fabric.ipynb, replace the names XXX_Deployment.ipynb, XXX_SimpleLoadexecution.ipynb, XXX_MultithreadingLoadExecution.ipynb, and XXX_SelectResults.ipynb with the names of your Helper files.

  • Commit all in the demo_synapse branch:
  • Make a pull request from your branch demo_synapse to the collaboration branch:
  • Open the collaboration branch and click on the Publish button:

You are now ready to deploy these artifacts and then load the data, based on the Generator you are using, with the following possible load controls: