codegen onboard datafactory (Azure#1321)
qiaozha authored Jun 23, 2020
1 parent 8a801b9 commit 456cf12
Showing 73 changed files with 97,832 additions and 0 deletions.
2 changes: 2 additions & 0 deletions .github/CODEOWNERS
@@ -130,6 +130,8 @@

/src/costmanagement/ @haroldrandom

/src/datafactory/ @qiaozha

/src/blockchain/ @MyronFanQiu

/src/codespaces/ @derekbekoe
6 changes: 6 additions & 0 deletions scripts/ci/credscan/CredScanSuppressions.json
@@ -72,6 +72,12 @@
"src\\eventgrid\\azext_eventgrid\\tests\\latest\\recordings\\test_Partner_scenarios.yaml"
],
"_justification": "Found General Symmetric Key"
},
{
"file": [
"src\\datafactory\\azext_datafactory\\vendored_sdks\\datafactory\\models\\_data_factory_management_client_enums.py"
],
"_justification": "Found General Symmetric Key"
}
]
}
8 changes: 8 additions & 0 deletions src/datafactory/HISTORY.rst
@@ -0,0 +1,8 @@
.. :changelog:

Release History
===============

0.1.0
++++++
* Initial release.
220 changes: 220 additions & 0 deletions src/datafactory/README.md
@@ -0,0 +1,220 @@
# Azure CLI Datafactory Extension #
This is an extension for Azure Data Factory features.

### How to use ###
Install this extension using the CLI command below:
```
az extension add --name datafactory
```
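After installing, you can confirm that the extension is registered and browse its command groups:
```
az extension list --output table
az datafactory --help
```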

### Included Features
#### Factory:
Manage a data factory: [more info](https://docs.microsoft.com/en-us/azure/data-factory/introduction)
*Examples:*
```
az datafactory factory create \
--location location \
--name factoryName \
--resource-group groupName
az datafactory factory update \
--name factoryName \
--tags exampleTag="exampleValue" \
--resource-group groupName
```
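The generated command group also includes the usual read and delete operations; a quick sketch (verify the exact verbs with `az datafactory factory --help`):
```
az datafactory factory show \
--name factoryName \
--resource-group groupName
az datafactory factory list \
--resource-group groupName
az datafactory factory delete \
--name factoryName \
--resource-group groupName
```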

#### LinkedService:
Manage a linked service associated with the factory: [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-linked-services)
*Examples:*
```
az datafactory linked-service create \
--factory-name factoryName \
--properties @{propertiesJsonPath} \
--name linkedServiceName \
--resource-group groupName
```
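The `@{...JsonPath}` placeholders in these examples refer to a JSON file containing the resource's `properties` object, in the shape defined by the Data Factory REST API. A minimal sketch of such a file for an Azure Blob Storage linked service (illustrative only; the connection string is a placeholder):
```
cat > linkedService.json <<'EOF'
{
  "type": "AzureBlobStorage",
  "typeProperties": {
    "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
  }
}
EOF
az datafactory linked-service create \
--factory-name factoryName \
--resource-group groupName \
--name linkedServiceName \
--properties @linkedService.json
```
The same convention applies to the other JSON-valued arguments (`--properties`, `--pipeline`, `--activities`, `--parameters`, and so on) in the dataset, pipeline, and trigger examples below.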

#### Dataset
Manage a view of the data that you want to use in Data Factory: [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-datasets-linked-services)
*Examples:*
```
az datafactory dataset create \
--properties @{propertiesJsonPath} \
--name datasetName \
--factory-name factoryName \
--resource-group groupName
```

#### Pipeline
Use a pipeline to define a set of activities that operate on your datasets: [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipelines-activities)
*Examples:*
```
az datafactory pipeline create \
--factory-name factoryName \
--pipeline @{pipelineJsonPath} \
--name pipelineName \
--resource-group groupName
az datafactory pipeline update \
--factory-name factoryName \
--activities @{activitiesJsonPath} \
--parameters @{parametersJsonPath} \
--run-dimensions @{runDimensionJsonPath} \
--variables @{variableJsonPath} \
--name pipelineName \
--resource-group groupName
```

#### Pipeline-Run
You can manually execute your pipeline activities (on demand): [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers#manual-execution-on-demand)
*Examples:*
```
# --parameters are passed to the pipeline's activities
az datafactory pipeline create-run \
--factory-name factoryName \
--parameters @{parametersJsonPath} \
--name pipelineName \
--resource-group groupName
```
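The create-run call returns the identifier of the new run; a minimal sketch of capturing it in a shell variable (assuming the response exposes a `runId` field):
```
runId=$(az datafactory pipeline create-run \
--factory-name factoryName \
--name pipelineName \
--resource-group groupName \
--query runId --output tsv)
echo "$runId"
```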
Once you have the pipeline runId, you can cancel the execution if needed:
```
az datafactory pipeline-run cancel \
--factory-name factoryName \
--resource-group groupName \
--run-id runId
```
You can query pipeline runs by factory:
```
# --filters example: operand="PipelineName" operator="Equals" values="myPipeline"
az datafactory pipeline-run query-by-factory \
--factory-name factoryName \
--filters filterCondition \
--last-updated-after queryStartTime \
--last-updated-before queryEndTime \
--resource-group groupName
```
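To inspect a single run by its runId, the extension also generates a show command (a sketch; confirm with `az datafactory pipeline-run --help`):
```
az datafactory pipeline-run show \
--factory-name factoryName \
--resource-group groupName \
--run-id runId
```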
You can also query the activity runs by pipeline runId:
```
az datafactory activity-run query-by-pipeline-run \
--factory-name factoryName \
--last-updated-after queryStartTime \
--last-updated-before queryEndTime \
--resource-group groupName \
--run-id runId
```

#### Trigger
Triggers are another way to execute a pipeline run: [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers#trigger-execution)
*Examples:*
```
az datafactory trigger create \
--factory-name factoryName \
--resource-group groupName \
--properties @{propertiesJsonPath} \
--name triggerName
# start a trigger
az datafactory trigger start \
--factory-name factoryName \
--resource-group groupName \
--name triggerName
# stop a trigger
az datafactory trigger stop \
--factory-name factoryName \
--resource-group groupName \
--name triggerName
```

You can query trigger runs and rerun a trigger run if needed.
```
az datafactory trigger-run query-by-factory \
--factory-name factoryName \
--filters filterCondition \
--last-updated-after queryStartTime \
--last-updated-before queryEndTime \
--resource-group groupName
# Each trigger run has a triggerRunId. If the run is no longer in progress, you can rerun it. Note that rerun only applies to tumbling window triggers.
az datafactory trigger-run rerun \
--factory-name factoryName \
--resource-group groupName \
--run-id triggerRunId \
--trigger-name triggerName
```

#### Integration-Runtime
The integration runtime (IR) is the compute infrastructure used by Data Factory to provide data integration capabilities: [more info](https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime)
*Examples:*
```
az datafactory integration-runtime self-hosted create \
--factory-name factoryName \
--description description \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime managed create \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName \
--description description \
--type-properties-compute-properties @{computePropertiesJsonPath} \
--type-properties-ssis-properties @{ssisPropertiesJsonPath}
az datafactory integration-runtime update \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName \
--auto-update updateMode \
--update-delay-offset delayOffset
```
If it's a self-hosted IR, you need to go to the portal to set up the integration runtime nodes, but you can perform the following operations on it from the CLI:
```
az datafactory integration-runtime get-connection-info \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime get-status \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime list-auth-key \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime regenerate-auth-key \
--factory-name factoryName \
--name integrationRuntimeName \
--key-name keyName \
--resource-group groupName
az datafactory integration-runtime sync-credentials \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime upgrade \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime-node get-ip-address \
--factory-name factoryName \
--integration-runtime-name integrationRuntimeName \
--node-name nodeName \
--resource-group groupName
```
If it's a managed IR, you can perform the following operations on it:
```
az datafactory integration-runtime start \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
az datafactory integration-runtime stop \
--factory-name factoryName \
--name integrationRuntimeName \
--resource-group groupName
```
50 changes: 50 additions & 0 deletions src/datafactory/azext_datafactory/__init__.py
@@ -0,0 +1,50 @@
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------

from azure.cli.core import AzCommandsLoader
from azext_datafactory.generated._help import helps # pylint: disable=unused-import
try:
from azext_datafactory.manual._help import helps # pylint: disable=reimported
except ImportError:
pass


class DataFactoryManagementClientCommandsLoader(AzCommandsLoader):

def __init__(self, cli_ctx=None):
from azure.cli.core.commands import CliCommandType
from azext_datafactory.generated._client_factory import cf_datafactory_cl
datafactory_custom = CliCommandType(
operations_tmpl='azext_datafactory.custom#{}',
client_factory=cf_datafactory_cl)
parent = super(DataFactoryManagementClientCommandsLoader, self)
parent.__init__(cli_ctx=cli_ctx, custom_command_type=datafactory_custom)

def load_command_table(self, args):
from azext_datafactory.generated.commands import load_command_table
load_command_table(self, args)
try:
from azext_datafactory.manual.commands import load_command_table as load_command_table_manual
load_command_table_manual(self, args)
except ImportError:
pass
return self.command_table

def load_arguments(self, command):
from azext_datafactory.generated._params import load_arguments
load_arguments(self, command)
try:
from azext_datafactory.manual._params import load_arguments as load_arguments_manual
load_arguments_manual(self, command)
except ImportError:
pass


COMMAND_LOADER_CLS = DataFactoryManagementClientCommandsLoader
17 changes: 17 additions & 0 deletions src/datafactory/azext_datafactory/action.py
@@ -0,0 +1,17 @@
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=wildcard-import
# pylint: disable=unused-wildcard-import

from .generated.action import * # noqa: F403
try:
from .manual.action import * # noqa: F403
except ImportError:
pass
4 changes: 4 additions & 0 deletions src/datafactory/azext_datafactory/azext_metadata.json
@@ -0,0 +1,4 @@
{
"azext.isExperimental": true,
"azext.minCliCoreVersion": "2.3.1"
}
17 changes: 17 additions & 0 deletions src/datafactory/azext_datafactory/custom.py
@@ -0,0 +1,17 @@
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=wildcard-import
# pylint: disable=unused-wildcard-import

from .generated.custom import * # noqa: F403
try:
from .manual.custom import * # noqa: F403
except ImportError:
pass
12 changes: 12 additions & 0 deletions src/datafactory/azext_datafactory/generated/__init__.py
@@ -0,0 +1,12 @@
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------

__path__ = __import__('pkgutil').extend_path(__path__, __name__)