Revisiting criteria and process for Alpha/Beta/GA #4000
/sig architecture pm release
cc: @kubernetes/sig-pm @kubernetes/sig-release @kubernetes/release-team
/cc
Issues go stale after 90d of inactivity. If this issue is safe to close now please do so with /close. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/remove-lifecycle stale
New policy for avoiding perma-betas has merged: kubernetes/enhancements#1266. Along these lines we need:
We need to make sure that if we raise the bar for beta, we do not push things to be left in alpha, which is worse. In the PRR survey we found that a shocking 57% of orgs with 101-1000 nodes under management had enabled an alpha feature in production.
/remove-sig pm
Adding discussion notes raised over various Release Team (1.19) and Enhancements subproject meetings since June 2020.

Process improvements and clarification needs:
- General
- Alpha-related
- Beta-related
Hey, this issue is marked as Frozen, but has had no activity since 2020. I'm going to close it if I don't hear from anyone in a week.
As part of the ongoing effort to improve consistency, quality and reliability, we have been adding new gates along the feature development path, such as writing a KEP, API review, production readiness review, conformance test requirements, etc. Because of this, we need to revisit our guidance for graduation criteria and perhaps the feature release process. The KEP template has a section for graduation criteria, and we have this set of guidance, but that's embedded in the doc about changing APIs and needs a broader focus.
As for the feature release process, we have the KEP process. A quick check, though, shows our current process is not working as well as I think we'd like. Either that, or there's a lot less going on than we think. Of about 133 KEPs, only 13 show as actually "implemented".
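For illustration, a check along those lines can be scripted against a checkout of kubernetes/enhancements. The sketch below is not the exact command used for the count above; it assumes the layout where each KEP directory contains a kep.yaml with a top-level `status` field (provisional, implementable, implemented, etc.) and simply tallies statuses:

```python
#!/usr/bin/env python3
"""Rough sketch: count KEPs by status in a local checkout of
kubernetes/enhancements, assuming each KEP directory holds a
kep.yaml with a top-level `status` field."""

from collections import Counter
from pathlib import Path

import yaml  # PyYAML

counts = Counter()
for kep in Path("keps").rglob("kep.yaml"):
    try:
        meta = yaml.safe_load(kep.read_text())
    except yaml.YAMLError:
        continue  # skip malformed metadata rather than failing the scan
    if isinstance(meta, dict):
        counts[meta.get("status", "unknown")] += 1

for status, n in counts.most_common():
    print(f"{status}: {n}")
```

A tally like this makes the gap visible at a glance: any large "provisional" or "implementable" bucket for features that have already shipped is a sign the metadata isn't being updated at promotion time.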
When KEPs are left in the "provisional" or "implementable" state, even though at least some have certainly been released in at least alpha or beta, it makes me wonder whether we are properly reviewing the KEP graduation criteria as we merge/promote things. Maybe we are, but it's not clear, and I worry that we're not being very effective at reviewing the features and making sure that all the "i"s are dotted and "t"s are crossed.
So, the discussion I am trying to start is this:
As a starting point, some of the criteria for graduation at different levels could include:
- Real-world usage statistics