Opt-in client kubernetes version success tracking / ability to test each version of kubernetes w/ each provider #1585

@lknite

Description

Is your feature request related to a problem? Please describe.

If I want to build using Kubernetes v1.29.6, I can't just specify that version; I have to specify several version variables because the Debian package may be named differently.

Describe the solution you'd like

If someone specifies the needed versions and is able to build v1.29.6 successfully, why not register those values online somewhere? Then others could specify just the version and rely on the referenced success of others.

Describe alternatives you've considered

export VERSION=v1.31.1

# example values:
#KUBERNETES_RPM_VERSION=1.29.6
#KUBERNETES_SEMVER=v1.29.6
#KUBERNETES_SERIES=v1.29
#KUBERNETES_DEB_VERSION=1.29.6-1.1

# generate the needed values from the provided version
KUBERNETES_RPM_VERSION="${VERSION#v}"                        # strip the leading "v"
KUBERNETES_SEMVER="$VERSION"
KUBERNETES_SERIES="$(echo "$VERSION" | cut -d '.' -f 1-2)"   # e.g. v1.31
KUBERNETES_DEB_VERSION="${VERSION#v}-1.1"
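As a sanity check, running the same derivation against v1.29.6 should reproduce the commented example values:

```shell
# re-derive the values for v1.29.6; the results should match the
# example values listed above
VERSION=v1.29.6
KUBERNETES_RPM_VERSION="${VERSION#v}"
KUBERNETES_SEMVER="$VERSION"
KUBERNETES_SERIES="$(echo "$VERSION" | cut -d '.' -f 1-2)"
KUBERNETES_DEB_VERSION="${VERSION#v}-1.1"
echo "$KUBERNETES_RPM_VERSION $KUBERNETES_SEMVER $KUBERNETES_SERIES $KUBERNETES_DEB_VERSION"
```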

Additional context

I'm planning to set up automation that watches for new Kubernetes releases and then automatically generates the needed images using image-builder, storing the results locally. I will do my best to fully automate this process, working out the needed versions using strategies like the one above, and perhaps with additional steps to look up the packages directly.

Each time the image build is successful, I could provide the versions I used back to the image-builder project, if only there were a place to upload them.

Really, the image-builder project should probably be building new images each time Kubernetes releases a new version and testing them for success by deploying a cluster. If that happened, then again the working versions could be published via some sort of programmatically accessible page.

So, whether I provide the values, many volunteers provide the values, or the image-builder project itself provides them, there still needs to be a place to store those values.

Potential storage

Maybe something like:
image-builder/images/capi/packer/proxmox/known.json

{
  "v1.29.6": {
    "rpm_version": "1.29.6",
    "semver": "v1.29.6",
    "series": "v1.29",
    "deb_version": "1.29.6-1.1",
    "success": [
      {
        "count": 25,
        "proxmox_version": "8.2.7",
        "packer_version": "...",
        "other_involved_versions": "..."
      }
    ]
  }
}

Instead of a file in a git repo, it might make more sense to have it be a REST API: a POST that listens for success metrics, and a GET that returns known values.
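The consumer side would look the same either way, since both a file in a git repo and a GET endpoint hand back the known.json layout above. A minimal sketch (the inline file stands in for whatever a real client would fetch, and a real client would likely use jq rather than grep):

```shell
# stand-in for fetching known.json from the repo or a hypothetical GET endpoint
cat > known.json <<'EOF'
{
  "v1.29.6": {
    "rpm_version": "1.29.6",
    "semver": "v1.29.6",
    "series": "v1.29",
    "deb_version": "1.29.6-1.1"
  }
}
EOF

# pull out the deb_version field; grep keeps the sketch dependency-free
KUBERNETES_DEB_VERSION="$(grep -o '"deb_version": "[^"]*"' known.json | cut -d '"' -f 4)"
echo "$KUBERNETES_DEB_VERSION"
```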

If there were an opt-in that explained the type of data gathered and how it lets us know which versions of things work together, and how that helps us all have a better experience overall, I suspect there would be a lot of participation... but it would only take a little to be successful.

Maybe at first it could be volunteer-generated versions, but once testing is fully automated, the image-builder project would take over delivering those versions, making the data more reliable. Users could choose whether or not to use the public version data.

This could also make for a nice stats page maybe, if lots of people participate.

Maybe it becomes its own project used by this one, "image-builder-metrics"? Or just a couple more paths on the existing documentation website URL?

Alternatively

Another way to think of all this is as a TestResultsPage: essentially list out all the different combinations of variables (within reason) and then, after testing, mark each combination as to whether those versions resulted in success. This feature suggests we could get some of that data from users, since such thorough testing is not yet part of the image-builder project. However, once a list of test combinations exists, that creates the possibility for folks to put together solutions to test all of those combinations. So it's probably as good a starting point as any to begin with a list of what is to be tested and a way to update that list.
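Generating the combination list itself is trivial; a sketch, where the version and provider lists are purely illustrative placeholders:

```shell
# enumerate (kubernetes version x provider) combinations to be tested,
# each starting with a status of "untested"; the lists are placeholders
for k in v1.29.6 v1.30.2 v1.31.1; do
  for p in proxmox qemu vsphere; do
    echo "$k,$p,untested"
  done
done > combinations.csv
cat combinations.csv
```

Automated testing (or user-submitted results) would then flip each row's status to success or failure.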

/kind feature

Metadata

Labels: kind/feature (categorizes issue or PR as related to a new feature), lifecycle/rotten (denotes an issue or PR that has aged beyond stale and will be auto-closed)