
Migrate away from google.com gcp project kubernetes-ci-images #1460

Closed
spiffxp opened this issue Dec 3, 2020 · 15 comments
Assignees: spiffxp, neolit123
Labels: area/artifacts, area/release-eng, priority/important-soon, sig/cluster-lifecycle, sig/release, sig/testing
Milestone: v1.22


spiffxp commented Dec 3, 2020

Part of umbrella issue to migrate away from google.com gcp projects: #1469

Part of umbrella to migrate kubernetes e2e test images/registries to community-owned infrastructure: #1458

Pulling this out of kubernetes/test-infra#19483, which covered running ci-kubernetes-build on community-owned infrastructure.

The community-owned job now writes artifacts to the locations below (a sketch of consuming them follows the list):

  • gs://k8s-release-dev instead of gs://kubernetes-release-dev
  • gcr.io/k8s-staging-ci-images instead of gcr.io/kubernetes-ci-images
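For anyone switching consumers over, here is a minimal sketch of pulling a CI build from the new locations; the object layout and image tag format are assumptions based on how the old bucket/registry were laid out, so verify them before relying on this:

# Resolve the latest community CI build (version markers live under ci/ in the bucket):
VERSION="$(gsutil cat gs://k8s-release-dev/ci/latest.txt)"

# Fetch a binary for that build (path layout assumed to mirror gs://kubernetes-release-dev):
gsutil cp "gs://k8s-release-dev/ci/${VERSION}/bin/linux/amd64/kubeadm" .

# Pull the matching CI image; '+' is not a valid tag character, so it is replaced with '_':
docker pull "gcr.io/k8s-staging-ci-images/kube-apiserver:${VERSION//+/_}"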

We need to migrate away from gcr.io/kubernetes-ci-images, which will involve:

  • announcing the deprecation of gcr.io/kubernetes-ci-images
  • migrating jobs/subprojects to use gcr.io/k8s-staging-ci-images instead

During the deprecation window, we will need to either:

  • continue running a google.com job that publishes to gcr.io/kubernetes-ci-images
  • set up a job that syncs from gcr.io/k8s-staging-ci-images to gcr.io/kubernetes-ci-images (a sketch of such a sync follows this list)
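For the second option, a minimal sketch of what such a sync could look like, using gcrane from go-containerregistry; the real job would more likely be a Prow periodic, so the exact tooling here is an assumption:

# Recursively copy all repositories and tags from the community staging registry to the
# legacy google.com-owned registry (requires push access to gcr.io/kubernetes-ci-images):
gcrane cp -r gcr.io/k8s-staging-ci-images gcr.io/kubernetes-ci-images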

Once the window has passed, we can:

  • delete the old job
  • delete the old project

Potential impact (ref: kubernetes/test-infra#19483 (comment))

  • anyone downstream who consumes gcr.io/kubernetes-ci-images
  • clusters using kubeadm without a custom image repository and with a ci/foo version (see the kubeadm sketch after this list)
    • may impact cluster-api and its implementations
    • may impact kinder
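For the kubeadm case, the exposure can be avoided by pointing kubeadm at the community registry explicitly rather than relying on its default; a minimal sketch, with illustrative flag values:

# Select the community-owned CI registry and a CI version label explicitly, so the cluster
# keeps working regardless of when gcr.io/kubernetes-ci-images is turned down:
kubeadm init \
  --kubernetes-version ci/latest-1.20 \
  --image-repository gcr.io/k8s-staging-ci-images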

I'm starting this as an umbrella issue here. It may require a KEP, or it may be better homed in kubernetes/kubeadm.


spiffxp commented Dec 3, 2020

Per kubernetes/test-infra#19483 (comment)


spiffxp commented Dec 3, 2020

/area artifacts
/area release-eng
/sig release
/sig testing
/sig cluster-lifecycle
/wg k8s-infra

k8s-ci-robot added the area/artifacts, area/release-eng, sig/release, sig/testing, sig/cluster-lifecycle, and wg/k8s-infra labels on Dec 3, 2020

spiffxp commented Jan 21, 2021

/milestone v1.21
/assign @spiffxp @neolit123

neolit123 commented:

just logged a cluster-api ticket (the required changes appear to be minimal and non-blocking):
kubernetes-sigs/cluster-api#4103

for kubeadm, the GCR change is done:
kubernetes/kubeadm#2356
(and cherry picked to the support skew)

just merged the GCS change for kinder too:
kubernetes/kubeadm#2355

spiffxp added the priority/important-soon label on Jan 22, 2021
neolit123 commented:

it seems we aren't publishing the latest artifacts to the new GCS?
kubernetes/kubeadm#2380 (comment)

looks like the problem is the new bucket we switched to here.

https://console.cloud.google.com/storage/browser/_details/k8s-release-dev/ci/latest-1.20.txt resolves to v1.20.0-beta.2.96+98bc258bf5516b

the latest version reported by https://dl.k8s.io/ci/latest-1.20.txt is the valid one:
v1.20.3-rc.0.15+18194169ac684f
but this version and its artifacts are missing from the new bucket.
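A quick way to reproduce the mismatch (a sketch, assuming the marker objects are publicly readable):

# Version marker in the new community bucket:
gsutil cat gs://k8s-release-dev/ci/latest-1.20.txt

# Version marker served by dl.k8s.io, which is currently backed by the old bucket:
curl -sSL https://dl.k8s.io/ci/latest-1.20.txt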


spiffxp commented Feb 24, 2021

I did a quick comparison of build parity between kubernetes-release-dev (which currently backs dl.k8s.io) and k8s-release-dev here: kubernetes/test-infra#20964 (comment)

There are times when one of the build jobs flakes and the other doesn't. If the flake happens on the job writing to gs://k8s-release-dev, that bucket can briefly lag behind what dl.k8s.io advertises, so you may occasionally hit the mismatch above.
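A rough way to spot-check parity between the two buckets (a sketch; listing the whole ci/ prefix can be slow, and the prefix layouts are assumed to match):

# List the CI builds present in each bucket and diff the two sets:
gsutil ls gs://kubernetes-release-dev/ci/ | sed 's|.*/ci/||' | sort > old-bucket.txt
gsutil ls gs://k8s-release-dev/ci/ | sed 's|.*/ci/||' | sort > new-bucket.txt
diff old-bucket.txt new-bucket.txt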


BobyMCbobs commented Mar 29, 2021

There appear to be a few images missing from k8s-staging-ci-images; the comparison is below, and a sketch for reproducing it follows the table:

KUBERNETES_CI_IMAGES                                            K8S_STAGING_CI_IMAGES
bazel							      <
cloud-controller-manager				      <
cloud-controller-manager-amd64				      <
cloud-controller-manager-arm				      <
cloud-controller-manager-arm64				      <
cloud-controller-manager-ppc64le			      <
cloud-controller-manager-s390x				      <
conformance							conformance
conformance-amd64						conformance-amd64
conformance-arm							conformance-arm
conformance-arm64						conformance-arm64
conformance-ppc64le						conformance-ppc64le
conformance-s390x						conformance-s390x
hyperkube							hyperkube
hyperkube-amd64							hyperkube-amd64
hyperkube-arm							hyperkube-arm
hyperkube-arm64							hyperkube-arm64
hyperkube-ppc64le						hyperkube-ppc64le
hyperkube-s390x							hyperkube-s390x
kube-aggregator						      <
kube-aggregator-amd64					      <
kube-aggregator-arm					      <
kube-aggregator-arm64					      <
kube-aggregator-ppc64le					      <
kube-aggregator-s390x					      <
kube-apiserver							kube-apiserver
kube-apiserver-amd64						kube-apiserver-amd64
kube-apiserver-arm						kube-apiserver-arm
kube-apiserver-arm64						kube-apiserver-arm64
kube-apiserver-ppc64le						kube-apiserver-ppc64le
kube-apiserver-s390x						kube-apiserver-s390x
kube-controller-manager						kube-controller-manager
kube-controller-manager-amd64					kube-controller-manager-amd64
kube-controller-manager-arm					kube-controller-manager-arm
kube-controller-manager-arm64					kube-controller-manager-arm64
kube-controller-manager-ppc64le					kube-controller-manager-ppc64le
kube-controller-manager-s390x					kube-controller-manager-s390x
kube-proxy							kube-proxy
kube-proxy-amd64						kube-proxy-amd64
kube-proxy-arm							kube-proxy-arm
kube-proxy-arm64						kube-proxy-arm64
kube-proxy-ppc64le						kube-proxy-ppc64le
kube-proxy-s390x						kube-proxy-s390x
kube-scheduler							kube-scheduler
kube-scheduler-amd64						kube-scheduler-amd64
kube-scheduler-arm						kube-scheduler-arm
kube-scheduler-arm64						kube-scheduler-arm64
kube-scheduler-ppc64le						kube-scheduler-ppc64le
kube-scheduler-s390x						kube-scheduler-s390x
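The comparison above can be reproduced with something like the following (a sketch; it assumes read access to both registries):

# List the image repositories in each registry and diff them side by side:
gcloud container images list --repository=gcr.io/kubernetes-ci-images \
  --format='value(name)' | sed 's|.*/||' | sort > kubernetes-ci-images.txt
gcloud container images list --repository=gcr.io/k8s-staging-ci-images \
  --format='value(name)' | sed 's|.*/||' | sort > k8s-staging-ci-images.txt
diff --side-by-side kubernetes-ci-images.txt k8s-staging-ci-images.txt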

Also, according to https://cs.k8s.io/?q=kubernetes-ci-images&i=nope&files=&excludeFiles=&repos=, a handful of repos still reference them (ignoring documentation-related hits), all with only minimal references.


ameukam commented Mar 31, 2021

About the images:

  • kube-aggregator has not been published since 1.12
  • cloud-controller-manager has not been published since 1.16

So we don't really need to migrate them, but we should make sure they can still be consumed; that's why you find references to kubernetes-ci-images in k/release.

IMO, we should announce a maintenance window before we make changes in other repos outside of k/test-infra.


ameukam commented Apr 15, 2021

/milestone v1.22

k8s-ci-robot modified the milestones: v1.21 → v1.22 on Apr 15, 2021
fejta-bot commented:

Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

Send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale

k8s-ci-robot added the lifecycle/stale label on Jul 14, 2021

spiffxp commented Jul 15, 2021

/remove-lifecycle stale
This is being worked on in concert with the gs://kubernetes-release-dev migration.

k8s-ci-robot removed the lifecycle/stale label on Jul 15, 2021

spiffxp commented Jul 27, 2021

I intend to call this done once the deprecation window is announced for #2318, so I'm keeping this in v1.22.

There are still some repos that reference this, but it's mostly in docs, and most of them are historical docs that probably shouldn't be updated: https://cs.k8s.io/?q=kubernetes-ci-images&i=nope&files=&excludeFiles=&repos=


spiffxp commented Jul 29, 2021

I've updated #2318 to track removing any remaining references to kubernetes-ci-images across the kubernetes project that aren't necessary or historically archived.

I will close this once I've opened a follow-up issue to turn down this project after the deprecation window.


spiffxp commented Aug 4, 2021

/close
Followup issue to close (milestoned to v1.25) is #2417

k8s-ci-robot commented:

@spiffxp: Closing this issue.

In response to this:

/close
Followup issue to close (milestoned to v1.25) is #2417

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
