Copy data from one folder to another folder in an Azure Blob Storage


This template creates a version 2 data factory with a pipeline that copies data from one folder to another in Azure Blob Storage.

Here are a few important points about the template:


When you deploy this Azure Resource Manager template, a version 2 data factory is created with the following entities:

  • Azure Storage linked service
  • Azure Blob datasets (input and output)
  • Pipeline with a copy activity
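For orientation, the entities above correspond to ARM resources roughly like the following trimmed sketch. This is not the full template: resource names, folder paths, and property values are illustrative placeholders, and required elements such as the factory resource itself and `dependsOn` entries are omitted for brevity.

```json
{
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories/linkedservices",
      "name": "[concat(parameters('dataFactoryName'), '/StorageLinkedService')]",
      "properties": {
        "type": "AzureStorage",
        "typeProperties": {
          "connectionString": {
            "type": "SecureString",
            "value": "<storage account connection string>"
          }
        }
      }
    },
    {
      "type": "Microsoft.DataFactory/factories/datasets",
      "name": "[concat(parameters('dataFactoryName'), '/InputDataset')]",
      "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
          "referenceName": "StorageLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "folderPath": "<container>/<input folder>"
        }
      }
    },
    {
      "type": "Microsoft.DataFactory/factories/pipelines",
      "name": "[concat(parameters('dataFactoryName'), '/CopyPipeline')]",
      "properties": {
        "activities": [
          {
            "name": "CopyFromBlobToBlob",
            "type": "Copy",
            "inputs": [
              { "referenceName": "InputDataset", "type": "DatasetReference" }
            ],
            "outputs": [
              { "referenceName": "OutputDataset", "type": "DatasetReference" }
            ],
            "typeProperties": {
              "source": { "type": "BlobSource" },
              "sink": { "type": "BlobSink" }
            }
          }
        ]
      }
    }
  ]
}
```

The copy activity references the input and output datasets by name, and each dataset references the linked service, which is how the pipeline resolves where to read from and write to at run time.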

To get the name of the data factory

  1. Click the Deployment succeeded message.
  2. Click Go to resource group.
  3. Search for ADFTutorialResourceGroup0927<unique string>.

The following sections provide steps for running and monitoring the pipeline. For more information, see Quickstart: Create a data factory by using Azure PowerShell.

Run and monitor the pipeline

After you deploy the template, follow these steps to run and monitor the pipeline:

  1. Download runmonitor.ps1 to a folder on your machine.

  2. Launch Azure PowerShell.

  3. Run the following command to log in to Azure.

    Login-AzureRmAccount
  4. Switch to the folder where you copied the script file.

  5. Run the following command to run the script, specifying the names of your Azure resource group and data factory.

    .\runmonitor.ps1 -resourceGroupName "<name of your resource group>" -DataFactoryName "<name of your data factory>"
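Broadly, a script like runmonitor.ps1 can be sketched with the AzureRM.DataFactoryV2 cmdlets as follows. This is an illustrative outline, not the script's actual contents: the pipeline name and polling interval are assumptions, and the real script may differ.

```powershell
param(
    [string]$resourceGroupName,
    [string]$DataFactoryName
)

# Trigger a run of the copy pipeline.
# "CopyPipeline" is a placeholder; use the pipeline name defined in the template.
$runId = Invoke-AzureRmDataFactoryV2Pipeline -ResourceGroupName $resourceGroupName `
    -DataFactoryName $DataFactoryName -PipelineName "CopyPipeline"

# Poll the pipeline run until it leaves the InProgress state.
while ($true) {
    $run = Get-AzureRmDataFactoryV2PipelineRun -ResourceGroupName $resourceGroupName `
        -DataFactoryName $DataFactoryName -PipelineRunId $runId
    if ($run.Status -ne "InProgress") { break }
    Start-Sleep -Seconds 15
}

# Show the details of the copy activity run (rows read/written, duration, errors).
Get-AzureRmDataFactoryV2ActivityRun -ResourceGroupName $resourceGroupName `
    -DataFactoryName $DataFactoryName -PipelineRunId $runId `
    -RunStartedAfter (Get-Date).AddMinutes(-30) -RunStartedBefore (Get-Date)
```

The activity run output is where you confirm the copy succeeded; on success it reports the data read and written between the input and output folders.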