Update images, bug fixes, clean up code #1778

Merged (4 commits) on Aug 9, 2019
Changes from 1 commit
Update docker images and minor refactoring
carolynwang committed Aug 8, 2019
commit ce1ce460e55cd31803ebe741a495d93120704f0a
components/aws/sagemaker/batch_transform/component.yaml (1 addition, 1 deletion)
@@ -71,7 +71,7 @@ outputs:
   - {name: output_location, description: 'S3 URI of the transform job results.'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       batch_transform.py,
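
The same image pin change repeats in each component definition below. For context, a minimal sketch of how one of these YAML definitions is consumed from a pipeline; the load path is illustrative and assumes the script runs from the repository root:

from kfp import components

# Load the batch transform component from its YAML definition; the image
# field shown in the diff determines which container the resulting step runs in.
batch_transform_op = components.load_component_from_file(
    'components/aws/sagemaker/batch_transform/component.yaml')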
components/aws/sagemaker/common/_utils.py (3 additions, 3 deletions)
@@ -80,7 +80,7 @@ def create_training_job_request(args):
         request['AlgorithmSpecification']['TrainingImage'] = args['image']
         request['AlgorithmSpecification'].pop('AlgorithmName')
     else:
-        # TODO: determine if users can make custom algorithm resources that have the same name as built-in algorithm names
+        # TODO: Adjust this implementation to account for custom algorithm resource names that are the same as built-in algorithm names
         algo_name = args['algorithm_name'].lower().strip()
         if algo_name in built_in_algos.keys():
             request['AlgorithmSpecification']['TrainingImage'] = get_image_uri(args['region'], built_in_algos[algo_name])
@@ -114,7 +114,7 @@ def create_training_job_request(args):
     request['InputDataConfig'] = args['channels']
     # Max number of input channels/data locations is 20, but currently only 8 data location parameters are exposed separately.
     # Source: Input data configuration description in the SageMaker create training job form
-    for i in range(1, len(args['channels'] + 1)):
+    for i in range(1, len(args['channels']) + 1):
         if args['data_location_' + str(i)]:
             request['InputDataConfig'][i-1]['DataSource']['S3DataSource']['S3Uri'] = args['data_location_' + str(i)]
         else:
@@ -514,7 +514,7 @@ def create_hyperparameter_tuning_job_request(args):
     request['TrainingJobDefinition']['InputDataConfig'] = args['channels']
     # Max number of input channels/data locations is 20, but currently only 8 data location parameters are exposed separately.
     # Source: Input data configuration description in the SageMaker create hyperparameter tuning job form
-    for i in range(1, len(args['channels'] + 1)):
+    for i in range(1, len(args['channels']) + 1):
         if args['data_location_' + str(i)]:
             request['InputDataConfig'][i-1]['DataSource']['S3DataSource']['S3Uri'] = args['data_location_' + str(i)]
         else:
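
Both loop fixes move a misplaced closing parenthesis: in the old lines, args['channels'] + 1 tries to add an integer to a list, which raises a TypeError before len() is ever evaluated. A minimal standalone sketch of the corrected pattern, with hypothetical channel data standing in for args['channels']:

# Hypothetical channel list mirroring the shape of args['channels'].
channels = [{'ChannelName': 'train'}, {'ChannelName': 'test'}]

# Old form: len(channels + 1) -> TypeError: can only concatenate list (not "int") to list
# Fixed form: iterate 1..len(channels), matching the numbered data_location_1..data_location_8 parameters.
for i in range(1, len(channels) + 1):
    print('data_location_' + str(i), '->', channels[i - 1]['ChannelName'])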
components/aws/sagemaker/deploy/component.yaml (1 addition, 1 deletion)
@@ -76,7 +76,7 @@ outputs:
   - {name: endpoint_name, description: 'Endpoint name'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       deploy.py,
components/aws/sagemaker/ground_truth/component.yaml (1 addition, 1 deletion)
@@ -85,7 +85,7 @@ outputs:
   - {name: active_learning_model_arn, description: 'The ARN for the most recent Amazon SageMaker model trained as part of automated data labeling.'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       ground_truth.py,
components/aws/sagemaker/hyperparameter_tuning/component.yaml
@@ -125,7 +125,7 @@ outputs:
     description: 'The registry path of the Docker image that contains the training algorithm'
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       hyperparameter_tuning.py,
components/aws/sagemaker/model/component.yaml (1 addition, 1 deletion)
@@ -42,7 +42,7 @@ outputs:
   - {name: model_name, description: 'The model name Sagemaker created'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       create_model.py,
components/aws/sagemaker/train/component.yaml (1 addition, 1 deletion)
@@ -91,7 +91,7 @@ outputs:
   - {name: training_image, description: 'The registry path of the Docker image that contains the training algorithm'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       train.py,
components/aws/sagemaker/workteam/component.yaml (1 addition, 1 deletion)
@@ -24,7 +24,7 @@ outputs:
   - {name: workteam_arn, description: 'The ARN of the workteam.'}
 implementation:
   container:
-    image: carowang/kubeflow-pipeline-aws-sm:20190801-01
+    image: carowang/kubeflow-pipeline-aws-sm:latest
     command: ['python']
     args: [
       workteam.py,
@@ -6,9 +6,9 @@
 from kfp import gcp
 from kfp.aws import use_aws_secret

-emr_create_cluster_op = components.load_component_from_file('../../../components/aws/emr/create_cluster/component.yaml')
-emr_submit_spark_job_op = components.load_component_from_file('../../../components/aws/emr/submit_spark_job/component.yaml')
-emr_delete_cluster_op = components.load_component_from_file('../../../components/aws/emr/delete_cluster/component.yaml')
+emr_create_cluster_op = components.load_component_from_file('../../../../components/aws/emr/create_cluster/component.yaml')
+emr_submit_spark_job_op = components.load_component_from_file('../../../../components/aws/emr/submit_spark_job/component.yaml')
+emr_delete_cluster_op = components.load_component_from_file('../../../../components/aws/emr/delete_cluster/component.yaml')

 @dsl.pipeline(
     name='Titanic Suvival Prediction Pipeline',
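
The extra ../ suggests the sample now sits one directory deeper relative to the repository's components tree. A quick sanity-check sketch for such relative paths, resolving against the script's own location rather than the current working directory (the paths are taken from the diff; everything else is illustrative):

import os

component_paths = [
    '../../../../components/aws/emr/create_cluster/component.yaml',
    '../../../../components/aws/emr/submit_spark_job/component.yaml',
    '../../../../components/aws/emr/delete_cluster/component.yaml',
]
here = os.path.dirname(os.path.abspath(__file__))
for path in component_paths:
    resolved = os.path.normpath(os.path.join(here, path))
    # Fail fast with the resolved absolute path if a component definition is missing.
    assert os.path.exists(resolved), 'missing component definition: ' + resolved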