Artifact representation in Pipeline #1003
Comments
A good approach that also follows Argo would be to model artifact inputs/outputs the way Argo does. See the Argo example of how artifact inputs/outputs and artifact passing are described: https://github.com/argoproj/argo/blob/master/examples/artifact-passing.yaml. Adding artifact production support to the DSL compiler is easy.
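For reference, here is a minimal sketch of what that Argo-style producer/consumer pattern can look like in the KFP Python DSL, using `ContainerOp` and `file_outputs`; the op names, images, and file paths are illustrative only, not from this issue.

```python
import kfp
from kfp import dsl


def produce_op():
    # Writes /tmp/message.txt; registering it under file_outputs lets
    # downstream steps reference it via .outputs['message'].
    return dsl.ContainerOp(
        name='produce-message',
        image='alpine:3.9',
        command=['sh', '-c', 'echo hello world > /tmp/message.txt'],
        file_outputs={'message': '/tmp/message.txt'},
    )


def consume_op(message):
    # Consumes the producer's output, mirroring Argo's artifact-passing example.
    return dsl.ContainerOp(
        name='consume-message',
        image='alpine:3.9',
        command=['echo'],
        arguments=[message],
    )


@dsl.pipeline(name='artifact-passing', description='Sketch of passing an output between steps.')
def artifact_passing_pipeline():
    producer = produce_op()
    consume_op(producer.outputs['message'])


if __name__ == '__main__':
    # Compiles to an Argo workflow YAML, analogous to artifact-passing.yaml.
    kfp.compiler.Compiler().compile(artifact_passing_pipeline, 'artifact_passing.yaml')
```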
@animeshsingh would you mind giving us more context on your use case for a typed artifact object? We are working on supporting artifact metadata tracking (including location, type, and runtime data) in a metadata store. For consuming artifacts inside a pipeline, we are still open to suggestions.
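For context on the metadata-store side, the following is a hedged sketch of recording an artifact's location, type, and runtime data, assuming the store resembles ML Metadata (MLMD); the `Model` type, its properties, and the URI are illustrative, not the actual KFP schema.

```python
from ml_metadata.metadata_store import metadata_store
from ml_metadata.proto import metadata_store_pb2

# Connect to a local SQLite-backed metadata store (illustrative config).
config = metadata_store_pb2.ConnectionConfig()
config.sqlite.filename_uri = '/tmp/mlmd.sqlite'
config.sqlite.connection_mode = metadata_store_pb2.SqliteMetadataSourceConfig.READWRITE_OPENCREATE
store = metadata_store.MetadataStore(config)

# Register a hypothetical 'Model' artifact type with a couple of properties.
model_type = metadata_store_pb2.ArtifactType()
model_type.name = 'Model'
model_type.properties['framework'] = metadata_store_pb2.STRING
model_type.properties['version'] = metadata_store_pb2.INT
model_type_id = store.put_artifact_type(model_type)

# Record one artifact instance: its location (URI), type, and runtime metadata.
model = metadata_store_pb2.Artifact()
model.type_id = model_type_id
model.uri = 'gs://my-bucket/models/example/1'  # illustrative location
model.properties['framework'].string_value = 'tensorflow'
model.properties['version'].int_value = 1
[model_id] = store.put_artifacts([model])
```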
@animeshsingh same as hongye, more context is welcome. Is this issue about defining custom artifact types, about artifact passing, or both? Thanks
Are there any thoughts on how to define custom artifacts in pipelines that different steps can consume? For example, if we need to bring in a model as a typed artifact object which a pipeline can consume, it should be definable in the DSL.
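To make the ask concrete, here is a hedged sketch of what a typed model artifact in the DSL could look like. `ModelArtifact` is a hypothetical wrapper, not part of the current KFP SDK; the pipeline wiring uses the existing `ContainerOp`/`file_outputs` mechanism, and the images and paths are illustrative.

```python
from kfp import dsl


class ModelArtifact(object):
    """Hypothetical typed artifact: a model identified by a URI plus metadata.

    Today the DSL only passes strings between steps; a typed object like this
    would let the compiler validate types and record them in a metadata store.
    """

    def __init__(self, uri, framework=None, version=None):
        self.uri = uri
        self.framework = framework
        self.version = version


def train_op():
    # The training step writes the trained model's URI to a file, which is
    # exposed as an output named 'model'.
    return dsl.ContainerOp(
        name='train',
        image='gcr.io/my-project/train:latest',  # illustrative image
        file_outputs={'model': '/tmp/model_uri.txt'},
    )


def serve_op(model_uri):
    # Consumes the model by URI; with a typed DSL this could instead accept a
    # ModelArtifact and check the artifact type at compile time.
    return dsl.ContainerOp(
        name='serve',
        image='gcr.io/my-project/serve:latest',  # illustrative image
        command=['serve', '--model-uri'],
        arguments=[model_uri],
    )


@dsl.pipeline(name='typed-artifact-sketch', description='Illustrative only.')
def typed_artifact_pipeline():
    train = train_op()
    serve_op(train.outputs['model'])
```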