UI says "Successfully created new Run", but it isn't in the list #308
Comments
I am not sure I followed the description.
Resolving as it has been idle for several months.
Linchin pushed a commit to Linchin/pipelines that referenced this issue on Apr 11, 2023:
…ng cluster (kubeflow#308)
HumairAK pushed a commit to red-hat-data-services/data-science-pipelines that referenced this issue on Mar 11, 2024:
* fix outputfile formatting
* ignore python 3.5 test with unsortable characters
* Add python 3.5 error comments
* lint trailing whitespace
We need to keep the UI and the backend synchronized: the UI should not report a successfully created run unless the backend actually persisted it.
The backend probably also needs to verify uploaded pipelines, e.g. using `argo lint`.
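A minimal sketch of what such a server-side check could look like, assuming the Argo CLI is available on the backend host. The `pipeline_is_valid` helper and its use of `subprocess` are illustrative assumptions, not the actual KFP backend code; only the `argo lint` command itself comes from the comment above.

```python
import subprocess

def pipeline_is_valid(path: str) -> bool:
    """Run `argo lint` on an uploaded workflow spec and report
    whether it passed. A missing argo binary or a timeout is
    treated as a validation failure rather than a crash."""
    try:
        result = subprocess.run(
            ["argo", "lint", path],
            capture_output=True,
            text=True,
            timeout=30,
        )
    except (FileNotFoundError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0
```

A backend could call this before reporting success to the UI, so "Successfully created new Run" is only shown for specs that actually lint cleanly.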