
ci: Linux builds growing for each new build #153

Open
FredrikNoren opened this issue Feb 23, 2023 · 11 comments
Assignees
Labels
meta:good first issue Good for newcomers

Comments

@FredrikNoren
Contributor

Something strange is going on with the CI; it seems like the Linux build cache gets bigger and bigger with each new build. I haven't been able to figure out why yet. Windows and Mac seem to work fine.

@FredrikNoren FredrikNoren added the meta:good first issue Good for newcomers label Feb 23, 2023
@philpax
Contributor

philpax commented Feb 23, 2023

Do we have any information about the contents of the cache?

@FredrikNoren
Contributor Author

No — I was trying to see if there's a way to download them, but couldn't find one; I think GitHub doesn't allow it. I was playing around with saving it to artifacts here, but haven't looked at it yet: #125
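For reference, the artifacts workaround mentioned here would look something like this as a GitHub Actions step (a sketch only — the step name and artifact name are illustrative, and #125 may do it differently):

```yaml
# Upload the cache listing as a downloadable artifact, since GitHub
# does not expose actions/cache entries for direct download.
- name: Upload cache listing
  uses: actions/upload-artifact@v3
  with:
    name: cache-log
    path: cache.log
```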

@philpax
Contributor

philpax commented Feb 23, 2023

I'm about to bust the Linux cache to allow #156 to build, so we'll need to collect data again.

@daniellavoie
Contributor

I've customized the build on my own fork to generate a tree listing of the target directory.

https://github.com/daniellavoie/Ambient/blob/debug-cache/cache.log

I'm currently running another build and will git diff to see if consecutive builds create new entries. Meanwhile, the output should still help figure out if there is any suspicious file out there.
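For anyone reproducing this, the flow is roughly the following (a sketch only — daniellavoie's fork uses `tree`; `du` is shown here as a portable equivalent, and the `previous-cache.log` file name is illustrative):

```shell
# Size-annotated listing of the cargo build cache, largest entries first
# (daniellavoie's fork uses `tree --du -h target` for the same purpose).
du -ah target 2>/dev/null | sort -rh | head -40 > cache.log

# Diff against the listing saved from the previous build to spot what grew.
# --no-index lets git diff compare two files outside the index.
git diff --no-index previous-cache.log cache.log || true
```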

@daniellavoie
Contributor

Just found out about tree --du -h. I'll rerun the same flow with this command to get a better idea of what's what.

@daniellavoie
Contributor

daniellavoie commented Feb 23, 2023

A git diff between two builds on the same commit doesn't show much. Running again with tree --du -h.

git diff
diff --git a/cache.log b/cache.log
index 354e187..b049d80 100644
--- a/cache.log
+++ b/cache.log
@@ -761,7 +761,7 @@ target
 │   │   │   │   ├── 643449dc0c65ac0e-ScNPhaseCore.o
 │   │   │   │   ├── 643449dc0c65ac0e-ScPhysics.o
 │   │   │   │   ├── 643449dc0c65ac0e-ScRigidCore.o
-│   │   │   │   ├── 643449dc0c65ac0e-ScRigidSim.o
+│��  │   │   │   ├── 643449dc0c65ac0e-ScRigidSim.o
 │   │   │   │   ├── 643449dc0c65ac0e-ScScene.o
 │   │   │   │   ├── 643449dc0c65ac0e-ScShapeCore.o
 │   │   │   │   ├── 643449dc0c65ac0e-ScShapeInteraction.o
@@ -1077,7 +1077,7 @@ target
 │   │   │   ├── out
 │   │   │   ├── output
 │   │   │   ├── root-output
-│   │   │   ��── stderr
+│   │   │   └── stderr
 │   │   ├── proc-macro-error-attr-c0559f9ae7e5fd5b
 │   │   │   ├── build-script-build
 │   │   │   ├── build_script_build-c0559f9ae7e5fd5b

@daniellavoie
Contributor

This version provides directory space usage, if that helps: https://github.com/daniellavoie/Ambient/blob/6ae44f6d8ff5333fef73854632b995df3a1b13a2/cache.log

Running again to see if there is an incremental behaviour.

@daniellavoie
Contributor

Last update: no size difference. Someone from the core team should review the cache output from my previous comment and check whether the pulled dependencies make sense. Cheers!

@FredrikNoren
Contributor Author

@daniellavoie Ah this is awesome! Hm I can't tell if anything is wrong from looking at that file, but it's usually quite noticeable when the build cache starts growing; it grows with like 1gb per build. I'm still not sure exactly what triggers it though, sometimes it seems like even just doc changes will make it grow.

@daniellavoie
Contributor

So after a few builds, we end up with 5 GB of deps. Here's a git diff:

diff.log

Not sure what the right approach is to clean the duplicate deps.
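One possible cleanup (an assumption on my part, not something this repo currently does) is to prune artifacts the current build didn't touch before saving the cache, so superseded dependency versions stop accumulating:

```shell
# Option A: the third-party cargo-sweep tool:
#   cargo install cargo-sweep
#   cargo sweep --time 7      # remove target/ files unused for 7 days
#
# Option B: a dependency-free approximation with find — delete build
# artifacts whose mtime is older than 7 days.
find target -type f -mtime +7 -delete 2>/dev/null || true
```

The mtime heuristic is cruder than cargo-sweep (which checks what the current toolchain actually uses), but it needs no extra tooling on the runner.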

@FredrikNoren
Contributor Author

@daniellavoie Hm, interesting. Could you open an issue on the cargo repo? Our build is fairly standard, so I imagine this must be a problem for more people.
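If it helps with such a report, here's one quick way to quantify the duplication (a sketch; it assumes the `<name>-<metadata hash>.rlib` naming convention cargo uses in `target/debug/deps`):

```shell
# Count how many distinct copies of each dependency sit in target/debug/deps.
# Crates rebuilt with a different metadata hash appear as extra files, so
# counts above 1 indicate accumulated duplicates.
ls target/debug/deps/*.rlib 2>/dev/null \
  | sed 's|.*/||; s|-[0-9a-f]*\.rlib$||' \
  | sort | uniq -c | sort -rn | head
```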
