Automate compile version & upload to S3 #179
Initial idea coming from readthedocs/readthedocs-ops#1155, showing how we might automate this process.
Avoid uploading the .tar.gz to production S3, since we will be managing this via CircleCI and an AWS orb. See readthedocs/readthedocs-docker-images#179. We need to keep the upload part for the development environment, though.
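A minimal sketch of what "skip the upload in CI, keep it for development" could look like, based on the "Skip uploading when running from inside CIRCLECI" commit below; the variable names, bucket name, and MinIO endpoint are assumptions, not the actual script:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical sketch: only upload the compiled artifact from a developer's
# machine. Inside CircleCI the AWS S3 orb handles the upload as a separate
# step, so the script itself must not push to production S3.
ARTIFACT="$1"  # e.g. python-3.10.4.tar.gz

if [[ -n "${CIRCLECI:-}" ]]; then
    echo "Running inside CircleCI; skipping upload (handled by the AWS orb)."
    exit 0
fi

# Local development: push to the MinIO instance instead of real S3.
aws --endpoint-url "${AWS_ENDPOINT_URL:-http://localhost:9000}" \
    s3 cp "$ARTIFACT" "s3://${AWS_BUILD_TOOLS_BUCKET:-build-tools-dev}/"
```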
It seems I'm pretty close to having something working already. Some of the jobs succeed and I can see the artifacts uploaded into the bucket. We can polish the script a little more, but we need to keep in mind that it has to stay compatible with local development, since we upload the artifacts to MinIO when working locally.
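For reference, the same AWS CLI can be pointed at a local MinIO instance by overriding the endpoint, which is what makes a single script workable for both environments; the bucket name and port here are illustrative assumptions:

```bash
# Hypothetical: list the artifacts that landed in the local MinIO bucket.
aws --endpoint-url http://localhost:9000 \
    s3 ls "s3://build-tools-dev/" --recursive
```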
This could end up in weird production states. For example, when some checks pass and others fail, we would update only some versions in production. Note that this already happened while testing this PR.
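One way to avoid these mixed states, purely a sketch and not what this PR implements, would be to build every version first and only upload when all builds succeeded; `build.sh`, `upload.sh`, and the version list are hypothetical:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical versions list; the real matrix lives in the CircleCI config.
VERSIONS=("python-3.8.13" "python-3.9.12" "python-3.10.4")
failed=0

for version in "${VERSIONS[@]}"; do
    ./build.sh "$version" || failed=1
done

# Upload nothing unless every build passed, so production is never left
# with only some versions updated.
if [[ "$failed" -eq 0 ]]; then
    for version in "${VERSIONS[@]}"; do
        ./upload.sh "${version}.tar.gz"
    done
else
    echo "At least one build failed; skipping all uploads." >&2
    exit 1
fi
```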
Another thing that I don't have solved yet is how to make …
I decided to avoid using … Instead, I'm now using an approach similar to the one commented in https://github.com/readthedocs/readthedocs-ops/issues/1155#issuecomment-1082615972.
This makes better use of resources and should be way faster. The downside: it's bashy 😄
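In spirit, that approach looks something like the loop below, compiling several tool versions from a single job instead of spinning up one job per version; the tool/version pairs and the script path are illustrative assumptions, though the `TOOL-VERSION` naming matches the "join TOOL and VERSION with a `-`" commit in this PR:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical: compile several tool versions in one long-lived job
# rather than paying container start-up costs per version.
TOOLS=("python-3.8.13" "python-3.9.12" "python-3.10.4" "nodejs-16.14.2")

for tool_version in "${TOOLS[@]}"; do
    tool="${tool_version%%-*}"     # e.g. "python"
    version="${tool_version#*-}"   # e.g. "3.8.13"
    echo "Compiling ${tool} ${version}..."
    ./scripts/compile_version_upload_s3.sh "$tool" "$version"
done
```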
Looks great! This seems much better than the process we had before. The last couple of pieces look to be just switching from your branches to `main`. Safe to do that now?
@agjohnson done! We should merge readthedocs/readthedocs.org#9098 first and then this branch. After that, all the .tar.gz files will be uploaded to the S3 -dev buckets. Then we can update the environment variables to the -prod buckets and re-trigger the CircleCI job for the final test.
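The dev-to-prod switch then stays a pure configuration change rather than a code change; conceptually, and with variable and bucket names that are assumptions only:

```bash
# Hypothetical: the script only reads the bucket from the environment, so
# promoting from -dev to -prod is just a CircleCI environment-variable edit.
: "${AWS_BUILD_TOOLS_BUCKET:=readthedocs-build-tools-dev}"

aws s3 cp "python-3.10.4.tar.gz" "s3://${AWS_BUILD_TOOLS_BUCKET}/"
```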
🤖📦🛫
Failure is just on docs. Imma merge this.
Bah, wrong PR somehow. Oops.
* Build: do not upload `build.tool` to production S3

  Avoid uploading the .tar.gz to production S3 since we will be managing this via CircleCI and an AWS orb. See readthedocs/readthedocs-docker-images#179. We need to keep the upload part for the development environment, though.

* Build: join TOOL and VERSION with a `-` to simplify the CircleCI routine
* Do not delete the .tar.gz from the local host when on production
* Remove old documentation

  We are not going to run this command in production by hand anymore. However, in case it's required, I'm linking to the issue that explains how to do it.

* Skip uploading when running from inside CIRCLECI
* Split the docker image name to keep just the name without the date
* Revert "Split the docker image name to keep just the name without the date"

  This reverts commit e08a7de.

* Receive arguments as previously
* Comment about pinning transitive dependencies
* Build Python 3.6.15 using `clang`
I cancelled the running build (which had pulled down the script from …). Looks good!