
cherry-pick from kubeflow/pipelines#10550 #33

Merged
merged 1 commit into from
Apr 16, 2024

Conversation


@rimolive rimolive commented Apr 9, 2024

Description of your changes:
Cherry pick from upstream.

fix(Backend + SDK): Add missing optional field to SecretAsVolume and ConfigMapAsVolume.

Testing instructions:

  • Deploy all components
  • Check if all pods are up and running
  • As a bonus, run iris-pipeline and see if it finishes successfully
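
The pod check above can be scripted. A minimal sketch, assuming `oc` is logged in to the cluster with the DSPA namespace selected; `pods_not_ready` is a hypothetical helper name, not part of this PR:

```python
import json

def pods_not_ready(pods: dict) -> list:
    """Names of pods whose phase is not Running/Succeeded, given `oc get pods -o json` output."""
    return [
        pod["metadata"]["name"]
        for pod in pods.get("items", [])
        if pod.get("status", {}).get("phase") not in ("Running", "Succeeded")
    ]

# Usage against a live cluster:
#   oc get pods -o json > pods.json
#   then: pods_not_ready(json.load(open("pods.json")))
# An empty list means all pods are up and running.
```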

Checklist:

fix(Backend + SDK): Add missing optional field to SecretAsVolume and ConfigMapAsVolume. (kubeflow#10550)

* fix(Backend + SDK): Add missing optional field to SecretAsVolume and ConfigMapAsVolume.

Signed-off-by: Revital Sur <eres@il.ibm.com>

* Update after rebase.

Signed-off-by: Revital Sur <eres@il.ibm.com>

* Update after rebase.

Signed-off-by: Revital Sur <eres@il.ibm.com>

* Update after merge.

Signed-off-by: Revital Sur <eres@il.ibm.com>

* Updates after merge with master branch.

Signed-off-by: Revital Sur <eres@il.ibm.com>

---------

Signed-off-by: Revital Sur <eres@il.ibm.com>
@dsp-developers

A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-33
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-33
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-33
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-33
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-33
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-33

@dsp-developers

An OCP cluster where you are logged in as cluster admin is required.

The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named dspa.pr-33.yaml:

apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-33
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-33"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-33"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-33"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-33"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-33"
  mlmd:  
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component 
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-33"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'

Then run the following:

cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/33/head
git checkout -b pullrequest d834f774f3104d8dc3345ada7bf47bd50ddc5431
oc apply -f dspa.pr-33.yaml

More instructions here on how to deploy and test a Data Science Pipelines Application.
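
The DSPA manifest above follows a fixed pattern keyed on the PR image tag, so it can be generated rather than hand-edited. A sketch under that assumption, with field names copied from the YAML above; the output is JSON, which `oc apply -f` also accepts:

```python
import json

QUAY = "quay.io/opendatahub"

def dspa_manifest(tag: str) -> dict:
    """Build a DataSciencePipelinesApplication manifest for a set of PR-tagged images."""
    def img(component: str) -> str:
        return f"{QUAY}/ds-pipelines-{component}:{tag}"

    return {
        "apiVersion": "datasciencepipelinesapplications.opendatahub.io/v1alpha1",
        "kind": "DataSciencePipelinesApplication",
        "metadata": {"name": tag},
        "spec": {
            "dspVersion": "v2",
            "apiServer": {
                "image": img("api-server"),
                "argoDriverImage": img("driver"),
                "argoLauncherImage": img("launcher"),
            },
            "persistenceAgent": {"image": img("persistenceagent")},
            "scheduledWorkflow": {"image": img("scheduledworkflow")},
            "mlmd": {
                "deploy": True,  # optional component
                "grpc": {"image": f"{QUAY}/mlmd-grpc-server:latest"},
                "envoy": {"image": "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"},
            },
            "mlpipelineUI": {
                "deploy": True,  # optional component
                "image": img("frontend"),
            },
            "objectStorage": {
                "minio": {
                    "deploy": True,
                    "image": f"{QUAY}/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance",
                }
            },
        },
    }

# Write the manifest and apply it:
#   python gen_dspa.py > dspa.pr-33.json && oc apply -f dspa.pr-33.json
print(json.dumps(dspa_manifest("pr-33"), indent=2))
```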

@rimolive rimolive changed the title UPSTREAM: <carry>: feat: cherry-pick from kubeflow/pipelines#10550 cherry-pick from kubeflow/pipelines#10550 Apr 11, 2024

@amadhusu amadhusu left a comment

Awesome! Everything works as it should.

  • Check if all pods are up and running

Screenshot from 2024-04-13 02-38-09

  • As a bonus, run iris-pipeline and see if it finishes successfully

Screenshot from 2024-04-13 02-38-24

@DharmitD DharmitD (Member) left a comment

Deployed the changes and made sure the resources came up correctly.
Compared the changes to the referenced upstream PR and ensured the commit was cherry-picked correctly.

/lgtm

@HumairAK

/approve


openshift-ci bot commented Apr 16, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: amadhusu, HumairAK

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-merge-bot openshift-merge-bot bot merged commit 3efe0f0 into opendatahub-io:master Apr 16, 2024
3 checks passed