cherry-pick from kubeflow/pipelines#10550 #33
Conversation
(cherry-picked from kubeflow#10550)

* fix(Backend + SDK): Add missing optional field to SecretAsVolume and ConfigMapAsVolume.
* Update after rebase.
* Update after merge.
* Updates after merge with master branch.

Signed-off-by: Revital Sur <eres@il.ibm.com>
A set of new images has been built to help with testing out this PR.

An OCP cluster where you are logged in as cluster admin is required. The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator (DSPO). Check here for more information on using the DSPO. To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named `dspa.pr-33.yaml`:

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-33
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-33"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-33"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-33"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-33"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-33"
  mlmd:
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-33"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'
```

Then run the following:

```shell
cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/33/head
git checkout -b pullrequest d834f774f3104d8dc3345ada7bf47bd50ddc5431
oc apply -f dspa.pr-33.yaml
```

More instructions here on how to deploy and test a Data Science Pipelines Application.
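Before applying the manifest, it can be worth a quick sanity check that every PR-built image in the saved file carries the expected tag. The following is a minimal, stdlib-only Python sketch of such a check; `pr_images` is a hypothetical helper, not part of the DSPO tooling, and the inline manifest is an abbreviated excerpt for illustration:

```python
import re

# Hypothetical helper: scan a DSPA manifest for image references and keep
# only the ones built from this PR (i.e. tagged "pr-33"). Pinned upstream
# images such as "latest" or versioned registry tags are filtered out.
def pr_images(manifest_text, tag="pr-33"):
    images = re.findall(r'image:\s*["\']?([^"\'\n]+)', manifest_text)
    return [img for img in images if img.endswith(":" + tag)]

# Abbreviated excerpt of the DSPA manifest above, for illustration only.
manifest = '''
apiServer:
  image: "quay.io/opendatahub/ds-pipelines-api-server:pr-33"
mlmd:
  grpc:
    image: "quay.io/opendatahub/mlmd-grpc-server:latest"
'''

print(pr_images(manifest))
# ['quay.io/opendatahub/ds-pipelines-api-server:pr-33']
```

Run against the full `dspa.pr-33.yaml`, this should list the api-server, driver, launcher, persistenceagent, scheduledworkflow, and frontend images, while leaving the mlmd, envoy, and minio images untouched.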
Deployed the changes and made sure resources came up correctly.
Compared the changes to the mentioned PR and ensured the commit was cherry-picked correctly.
/lgtm
/approve
[APPROVALNOTIFIER] This PR is APPROVED

This pull request has been approved by: amadhusu, HumairAK. The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing `/approve` in a comment.
Merged commit 3efe0f0 into opendatahub-io:master
Description of your changes:
Cherry pick from upstream.
fix(Backend + SDK): Add missing optional field to SecretAsVolume and ConfigMapAsVolume.
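To illustrate what the cherry-picked fix adds, here is a hedged sketch in plain Python dataclasses. These are NOT the actual KFP protobuf messages or SDK code; the class and field names merely mirror the upstream message names to show the shape of the change, under the assumption that `optional` follows the usual Kubernetes volume-source semantics:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative mirrors of the platform-spec messages touched by the fix.
# These are not the real KFP classes; they only sketch the message shape.

@dataclass
class SecretAsVolume:
    secret_name: str
    mount_path: str
    # The previously missing field: when True, the pod may start even if
    # the referenced Secret does not exist (Kubernetes "optional" semantics).
    optional: Optional[bool] = None

@dataclass
class ConfigMapAsVolume:
    config_map_name: str
    mount_path: str
    optional: Optional[bool] = None

vol = SecretAsVolume(secret_name="my-secret", mount_path="/mnt/secret", optional=True)
print(vol.optional)
# True
```

Leaving `optional` unset (`None`) preserves the previous behavior, which is why the field could be added without breaking existing pipelines.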
Testing instructions:
Checklist: