Tag: adf

Make sure you use base64 encoding on a Key Vault Secret for an sFTP Azure Data Factory Linked Service Connection

A quick post on setting up an sFTP Linked Service Connection in Azure Data Factory such that it uses a Key Vault for the SSH Key.

A friend of mine had tried setting this up but was getting the following error when testing the new Linked Service:

“Invalid Sftp credential provided for ‘SshPublicKey’ authentication type. The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.”

The Linked Service was using a Key Vault to obtain the SSH Key to be used in the connection. The SSH Key had been uploaded as a Secret to the Key Vault using code similar to the following:

az keyvault secret set --name sshkey --vault-name akv-dev --file test.ssh --description "Test SSH Key"

After reading through the documentation for the az keyvault secret set command, I noticed that the default value for --encoding is utf-8, not base64.

We modified the az call to something like this:

az keyvault secret set --name sshkey --vault-name akv-dev --file test.ssh --description "Test SSH Key" --encoding base64

i.e. with the addition of the --encoding base64 option, and then it worked fine.
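For reference, a rough PowerShell equivalent of the corrected upload might look like the following. This is just a sketch (it assumes the Az.KeyVault module and that you are already signed in with Connect-AzAccount); it base64-encodes the key file explicitly before storing it, which is what the --encoding base64 flag does for you:

# Read the key file, base64-encode it and store it as a Key Vault secret
$bytes  = [System.IO.File]::ReadAllBytes((Resolve-Path "test.ssh").Path)
$b64    = [System.Convert]::ToBase64String($bytes)
$secure = ConvertTo-SecureString -String $b64 -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "akv-dev" -Name "sshkey" -SecretValue $secure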

Configure Linked Template Use for Azure Data Factory with Azure DevOps Deployment

I’ve finally worked out how to do this, so I thought I’d write a post on it since I can’t find any single resource that accurately covers it all. My apologies, in advance, if someone has already done this.

I’ve been using Azure Data Factory v2 for quite a while now and have it integrated with Azure DevOps for CI/CD between environments. I follow the standard approach, which is documented here, so I won’t repeat it.

I’ll assume that you have a Git-enabled source Data Factory and a non-Git-enabled target Data Factory, and that your main code branch is “master” and the publish branch is “adf_publish”.

As the documentation there says:

“If you’ve configured Git, the linked templates are generated and saved alongside the full Resource Manager templates in the adf_publish branch in a new folder called linkedTemplates”

…that happens when you publish the master branch from Azure Data Factory.

We can see the non-linked template files (ARMTemplateForFactory.json, ARMTemplateParametersForFactory.json) and linked template files (ArmTemplate_master.json, ArmTemplateParameters_master.json and ArmTemplate_0.json) in the picture below:

Note – this is a very small demonstration factory so there is only one linked template file (ArmTemplate_0.json); as the factory grows in size, additional consecutively numbered files will appear.

The question then is: how do you get Azure DevOps to use those linked template files instead of the non-linked ones sitting in the adf_publish branch root directory?

Supposedly you can just follow this link but unfortunately that document is a little out of date and no longer being updated. The document covers the deployment of a VNET with Network Security Group and makes no mention of Azure Data Factory but it still provides some useful pointers.

The document correctly points out that in order for the linked templates to be deployed they need to be accessible to Azure Resource Manager and the easiest way of doing that is by having the files in an Azure Storage Account – that article illustrates the use of “Storage (general purpose v1)” but I used a Gen 2 ADLS Storage Account instead and that worked fine.

My Gen 2 Storage Account “oramossadls2” looks like this:


I then created a Shared Access Signature for oramossadls2 Storage Account and copied the SAS token which I then put into a secret called StorageSASToken in an Azure Key Vault called akv-dev:
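If you prefer to script that part, a sketch of generating the SAS token and pushing it straight into the Key Vault might look like this (Az.Storage and Az.KeyVault modules assumed; the resource group name, permissions and expiry are placeholders to adjust):

# Generate an account SAS for the Blob service and store it in Key Vault
$key = (Get-AzStorageAccountKey -ResourceGroupName "<storage-rg>" -Name "oramossadls2")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "oramossadls2" -StorageAccountKey $key
$sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object -Permission "rl" -ExpiryTime (Get-Date).AddMonths(6) -Context $ctx
Set-AzKeyVaultSecret -VaultName "akv-dev" -Name "StorageSASToken" -SecretValue (ConvertTo-SecureString $sas -AsPlainText -Force)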

I created an Access Policy on this Key Vault to allow the Azure DevOps Service Principal to be able to read the Secret:
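That access policy could equally be created from PowerShell, along these lines (the application ID of the Azure DevOps service principal is a placeholder):

# Allow the DevOps service principal to read secrets from the vault
Set-AzKeyVaultAccessPolicy -VaultName "akv-dev" -ServicePrincipalName "<devops-sp-app-id>" -PermissionsToSecrets Get,List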

On my Gen 2 ADLS Storage Account I then created a Container called demo:

From the container properties the URL looks like:

https://oramossadls2.blob.core.windows.net/demo 

In Azure Storage Explorer, I granted the Service Principal of my Azure DevOps site access to the container so that it can access the files:
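If you would rather script that grant than use Storage Explorer, an RBAC role assignment scoped to the container is one option (the object ID, role and scope below are placeholders; pick the role that matches what the pipeline needs to do):

# Grant the DevOps service principal access to the demo container
New-AzRoleAssignment -ObjectId "<devops-sp-object-id>" -RoleDefinitionName "Storage Blob Data Contributor" -Scope "/subscriptions/<sub-id>/resourceGroups/<storage-rg>/providers/Microsoft.Storage/storageAccounts/oramossadls2/blobServices/default/containers/demo"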

In Azure DevOps I then created a Variable Group called akv-dev which brings in the StorageSASToken Secret from the akv-dev Azure Key Vault:

I created a second Variable Group called “Production-Static” in which I created some more variables for use later on:

Following this article from Kamil Nowinski I have a Build Pipeline “ADF-CI” in Azure DevOps which stores the ARM template files as artifacts ready for use on a Release.

Now for the bit that took me a while to work out…the Release Pipeline.

My release pipeline takes the output of the latest Pipeline Build (_ADF-CI) and the code from the “master” branch as artifacts, and has a single stage with five tasks:

The Production-Static and akv-dev Variable Groups are attached to the Pipeline:

The five tasks are:

The first step copies the files on the latest Build, attached as an artifact (_ADF-CI) to this Pipeline, to the ADLS Gen2 Storage Account that I’ve created:

The value of StorageAccountName from the “Production-Static” Variable Group is “oramossadls2” and blobContainerName is “demo”.

The YAML for step 1 is:

steps:
- task: AzureFileCopy@3
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_ADF-CI'
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    Destination: AzureBlob
    storage: '$(StorageAccountName)'
    ContainerName: '$(blobContainerName)'
    BlobPrefix: adf

When this step eventually runs the Storage Account will look like this:


The second step stops the ADF Triggers – if you don’t stop active triggers the deployment can fail. I use a PowerShell script to do this:

The YAML for step 2 looks like this:

steps:
- task: AzurePowerShell@4
  displayName: 'Stop ADF Triggers'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/SetADFTriggersState.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -State "Stop" -ReleaseIdentifier $(Release.Artifacts._ADF.BuildId)'
    azurePowerShellVersion: LatestVersion
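The post doesn’t include SetADFTriggersState.ps1 itself, but based on the arguments passed above a minimal sketch might look like this (Az.DataFactory module assumed; a fuller version would record which triggers were running before the stop, keyed by the release identifier, so that "StartPriorEnabled" only restarts those):

param(
    [Parameter(Mandatory)][string]$DataFactoryName,
    [Parameter(Mandatory)][string]$DataFactoryResourceGroupName,
    [Parameter(Mandatory)][ValidateSet("Stop","StartPriorEnabled")][string]$State,
    [string]$ReleaseIdentifier
)

# Look up all triggers in the target factory
$triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName

if ($State -eq "Stop") {
    # Stop anything currently running so the ARM deployment doesn't fail
    $triggers | Where-Object { $_.RuntimeState -eq "Started" } | ForEach-Object {
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }
}
else {
    # Simplification: restart all triggers; the real script presumably restarts only
    # those that were enabled prior to the deployment
    $triggers | ForEach-Object {
        Start-AzDataFactoryV2Trigger -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }
}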

The third step actually deploys the template to the target environment (in this case the oramoss-prod Data Factory in resource group "adf-prod-rg"):

It took a while to work this out. First I tried using a Template Location of "Linked Artifact", which lets you simply select the template/parameter files from the attached artifacts, but that doesn't work because "nested templates ALWAYS have to be deployed from url", according to this. So, we have to set Template Location to "URL of the file". We then have to specify the Template and Template Parameter file links as URLs made up of the Primary Blob Service Endpoint, the container, the Blob Prefix, the folder hierarchy, the filename and the Storage SAS token, i.e.

Template File:

https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplate_master.json$(StorageSASToken)

Parameters File:

https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplateParameters_master.json$(StorageSASToken)

Note – we are getting the Storage SAS Token from the attached Variable group akv-dev.

We then have to override some parameters:

-factoryName "oramoss-prod" -containerUri $(AzureBlobStorageURL)/$(blobContainerName)/adf/drop/linkedTemplates -containerSasToken $(StorageSASToken)

The factoryName needs to be overridden because we are moving the code from one Data Factory to the next.

In this article, it suggests that the parameter for the URI of the template files is called "templateBaseUrl", but this appears to have been renamed to "containerUri"; we set it to the Primary Blob Service Endpoint, Container Name and the directory where the template files are held.

The article also suggests the Storage SAS Token parameter is called "SASToken", but it now appears to be "containerSasToken".

The YAML for step 3 looks like:

steps:
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'Deploy ADF ARM Template'
  inputs:
    azureResourceManagerConnection: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    subscriptionId: 'xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
    resourceGroupName: 'adf-prod-rg'
    location: 'North Europe'
    templateLocation: 'URL of the file'
    csmFileLink: 'https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplate_master.json$(StorageSASToken)'
    csmParametersFileLink: 'https://oramossadls2.blob.core.windows.net/demo/adf/drop/linkedTemplates/ArmTemplateParameters_master.json$(StorageSASToken)'
    overrideParameters: '-factoryName "oramoss-prod" -containerUri $(AzureBlobStorageURL)/$(blobContainerName)/adf/drop/linkedTemplates -containerSasToken $(StorageSASToken)'
    deploymentName: 'oramoss-prod-deploy'

The fourth step removes orphaned resources. Because we push the ARM template to oramoss-prod incrementally, an element dropped from the oramoss-dev Data Factory would not automatically be removed from the oramoss-prod Data Factory, i.e. the element would be orphaned in oramoss-prod. This step removes any such elements it finds using a PowerShell script.

The YAML for step 4 looks like:

steps:
- task: AzurePowerShell@4
  displayName: 'Remove Orphans'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/RemoveOrphanedADFResources.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -armTemplate $(System.DefaultWorkingDirectory)/_ADF-CI/drop/ARMTemplateForFactory.json'
    azurePowerShellVersion: LatestVersion
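RemoveOrphanedADFResources.ps1 isn’t reproduced in the post either; the idea is to compare what is deployed in the target factory against what the exported ARM template contains and delete anything that no longer appears. A heavily simplified sketch (pipelines and datasets only; a real script would also cover linked services, triggers and so on, in dependency order):

param(
    [string]$DataFactoryName,
    [string]$DataFactoryResourceGroupName,
    [string]$armTemplate
)

# Resource names in the template look like "[concat(parameters('factoryName'), '/MyPipeline')]"
$template      = Get-Content $armTemplate -Raw | ConvertFrom-Json
$templateNames = $template.resources | ForEach-Object { ($_.name -split '/')[-1].TrimEnd("')]") }

# Remove deployed pipelines that no longer appear in the template
Get-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName |
    Where-Object { $templateNames -notcontains $_.Name } |
    ForEach-Object { Remove-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force }

# ...and datasets
Get-AzDataFactoryV2Dataset -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName |
    Where-Object { $templateNames -notcontains $_.Name } |
    ForEach-Object { Remove-AzDataFactoryV2Dataset -ResourceGroupName $DataFactoryResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force }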

The last step runs the same script as the second step but with State set to “StartPriorEnabled” instead of “Stop”.

The YAML for step 5 looks like:

steps:
- task: AzurePowerShell@4
  displayName: 'Start ADF Triggers'
  inputs:
    azureSubscription: 'Pay-As-You-Go (xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_ADF/powershell/SetADFTriggersState.ps1'
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -DataFactoryResourceGroupName $(DataFactoryResourceGroupName) -State "StartPriorEnabled" -ReleaseIdentifier $(Release.Artifacts._ADF.BuildId)'
    azurePowerShellVersion: LatestVersion

That’s it. Save the Release Pipeline and run it, and the output should look similar to this:

Helpful Links
