
No way in the UI to upload a pipeline from the cluster itself #495

Closed
lakshmanok opened this issue Dec 6, 2018 · 6 comments

@lakshmanok
Contributor

So, I used the Jupyter notebook to create a pipeline.tar.gz. Unfortunately, there is no way to upload it using the UI.

(I can of course do it from Python, but the UI needs to allow uploading a file from the cluster).
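
For context, a minimal sketch of that Python route, assuming the `kfp` SDK is installed in the notebook and the notebook pod can reach the pipeline API server; the host URL and pipeline name below are placeholders:

```python
# Minimal sketch: upload the compiled package via the KFP SDK instead of the UI.
# Assumes `pip install kfp`; the host URL and names are placeholders.
import kfp

client = kfp.Client(host='http://ml-pipeline.kubeflow.svc.cluster.local:8888')
client.upload_pipeline(
    pipeline_package_path='pipeline.tar.gz',  # the package compiled in the notebook
    pipeline_name='my-pipeline',              # placeholder display name
)
```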

@yebrahim
Contributor

yebrahim commented Dec 7, 2018

I get the feeling of a disconnect between the notebook and our UI, but I don't think there's much we can do there, unfortunately. The two UIs run in separate pods with potentially different authorization roles.

I think a better approach would be to explore adding a custom Jupyter widget that interacts with the Pipelines backend. It could handle uploading pipelines, submitting runs, and perhaps even linking the currently open notebook to a run on the backend, providing a link to view it in the Pipelines UI. I can see this becoming very handy.
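
As a rough illustration of the widget idea (hypothetical, not an existing Pipelines component): a minimal sketch using `ipywidgets`, assuming `kfp` and `ipywidgets` are installed in the notebook and the backend is reachable; the host URL is a placeholder:

```python
# Hypothetical sketch of the proposed widget, not an existing component.
# Assumes `pip install kfp ipywidgets`; the host URL is a placeholder.
import ipywidgets as widgets
from IPython.display import display
import kfp

client = kfp.Client(host='http://ml-pipeline.kubeflow.svc.cluster.local:8888')

path_box = widgets.Text(value='pipeline.tar.gz', description='Package:')
upload_btn = widgets.Button(description='Upload pipeline')
out = widgets.Output()

def _on_click(_):
    # Upload the package at the given path and report success in the output area.
    with out:
        client.upload_pipeline(path_box.value)
        print('Uploaded', path_box.value)

upload_btn.on_click(_on_click)
display(path_box, upload_btn, out)
```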

@lakshmanok
Contributor Author

A quicker option might be to allow the user to specify a gs://.../ or http://.../ URL to load the pipeline from. This would also be useful to load directly from an AI Hub listing.
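
Until such a feature lands, a rough equivalent from Python, assuming an http(s) source and the `kfp` SDK; the URLs here are placeholders, and a gs:// source would need the GCS client library or gsutil instead of urllib:

```python
# Sketch: fetch a pipeline package from an http(s) URL and upload it via the
# KFP SDK. URLs and names are placeholders; a gs:// source would need the
# GCS client library or gsutil instead of urllib.
import urllib.request
import kfp

url = 'https://example.com/pipeline.tar.gz'  # placeholder source URL
local_path, _ = urllib.request.urlretrieve(url, 'pipeline.tar.gz')

client = kfp.Client(host='http://ml-pipeline.kubeflow.svc.cluster.local:8888')  # placeholder host
client.upload_pipeline(local_path, pipeline_name='imported-pipeline')
```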

@yebrahim
Contributor

yebrahim commented Dec 8, 2018

Agreed, we're planning to add an "import by URL" feature in the upload dialog.

@yebrahim
Contributor

/assign @rileyjbauer

#554 adds the ability to import by URL. Closing this since it's not tracking anything else. We can continue the discussion if @lakshmanok feels otherwise.

/close

@k8s-ci-robot
Contributor

@yebrahim: Closing this issue.

In response to this:

/assign @rileyjbauer

#554 adds the ability to import by URL. Closing this since it's not tracking anything else. We can continue the discussion if @lakshmanok feels otherwise.

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

