Multistage Docker Build Cache #3831
This seems like a reasonable idea - but also not a high priority for the team at the moment. If you or someone else is willing to make a design proposal, I'd be eager to check it out: https://github.com/GoogleContainerTools/skaffold/tree/master/docs/design_proposals Help wanted!
I would love to contribute to it, but I don't know Go and am currently not in a position to learn it. Hoping there are other awesome Go devs who could?
Happy to jump on Zoom calls and the like to talk it through.
@no1melman Please join skaffold-users@. We have a bi-weekly community office hours. More information here https://github.com/GoogleContainerTools/skaffold#community If that does not work, please let us know and we can schedule something as per your convenience. |
@no1melman just FYI, our next office hours is Wednesday 06/24 at 9:30 AM PST!
Yup I should be there. |
@no1melman would be great if you could provide a sample project that we can test this out on! |
Is the only way to support a multistage Docker build cache, as discussed here, to have separate docker builds in the skaffold build section like below (below being the skaffold representation of what is in that link)?

There is no "auto" feature that scans your Dockerfile first and caches the necessary stages...

This seems bad to me because of tagging: if you have multiple CD pipelines running, they can't all read and write to `helloworld:compile-stage`. But if we left it up to the tag policy, how do you get that tag injected into the `runtime-image` stage?
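For context, a minimal sketch of what the "separate builds" workaround might look like in skaffold.yaml, assuming a single Dockerfile with hypothetical stage names `compile-stage` and `runtime-image` (the image names, stage names, and apiVersion here are illustrative, not from the original thread):

```yaml
apiVersion: skaffold/v2beta5
kind: Config
build:
  artifacts:
    # Hypothetical: build the compile stage as its own artifact so its
    # layers are pushed and can be reused as a cache source.
    - image: helloworld-compile
      docker:
        dockerfile: Dockerfile
        target: compile-stage
    # Final runtime image, built from the same Dockerfile, pulling the
    # compile stage image as a layer cache.
    - image: helloworld
      docker:
        dockerfile: Dockerfile
        target: runtime-image
        cacheFrom:
          - helloworld-compile
```

The tagging concern above applies directly to this sketch: `cacheFrom` references a fixed image name, so concurrent pipelines writing different tags of `helloworld-compile` would not automatically see each other's cache, and skaffold's tag policy output is not injected into the `cacheFrom` list.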