Add sample caching #7058
Conversation
Thank you @messmerd. I closed my own implementation in favor of this one, so it's not just an alternative. We can go with this. You probably did it better than me, judging by a quick glance at the code 🤣, though I'll take a deeper look soon. You could rename the PR now if you want.
@sakertooth Thanks for the review! I'll address it later today.
I'm thinking of starting this up again on my end given all the differences, but honestly, I don't know. It's just that I don't like the idea of sacrificing good design for performance (not until we have measurements), and I want there to be some tests, but the current design seems hard to test. Identifying the actual audio data in the cache with UUIDs and retrieving resources that way makes a lot of sense. It gives us the benefit that it works for any kind of data, and the dependency on the file system is greatly reduced. It's testable because I can cache any kind of audio data I want; it doesn't have to be a file exactly.
My thoughts, hopefully everything I say below is easy to understand:

Not sure if I mentioned this already, but my philosophy with …

Speaking of coupling, it might be simpler for the cache to be in its own class, outside …

The eviction ideas I was pondering were a misunderstanding on my end. We call this feature a "cache", and when you think of a cache, eviction naturally comes to mind. However, while we can continue to call it a cache, it seems best to implement the feature as centralized storage, where any client can ask for a sample by its file path and it will return the cached buffer (or load the buffer in, either because it was changed on the filesystem or because it does not exist in the cache yet). And as centralized storage, there is little reason to worry about the size of the cache.

Base64 data works as both its identification (…
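For illustration only, the "centralized storage" idea described above could be sketched roughly like this. The names (`SampleCache`, `fetch`, `AudioData`) are hypothetical and not the PR's actual API, and filesystem-change invalidation is omitted for brevity:

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>
#include <vector>

// Hypothetical sketch: clients ask for a sample by a key (e.g. file path)
// and receive a shared, immutable buffer. None of these names are LMMS API.
using AudioData = std::vector<float>;

class SampleCache
{
public:
	// Returns the cached buffer for `key`, invoking `load` on a cache miss.
	template <typename Loader>
	std::shared_ptr<const AudioData> fetch(const std::string& key, Loader load)
	{
		if (auto it = m_entries.find(key); it != m_entries.end())
		{
			if (auto cached = it->second.lock()) { return cached; }
		}
		auto loaded = std::make_shared<const AudioData>(load());
		m_entries[key] = loaded;
		return loaded;
	}

private:
	// weak_ptr gives a form of "immediate eviction": the buffer is freed as
	// soon as no client holds it, with no separate eviction pass needed.
	std::map<std::string, std::weak_ptr<const AudioData>> m_entries;
};
```

With `weak_ptr` entries, the "size of the cache" concern mostly disappears: the cache never keeps a buffer alive on its own, so memory usage tracks actual use.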
@sakertooth I agree with your points about decoupling
This PR is my effort at a sample cache.
I originally made it to test some eviction strategy ideas that differed from those of #6703, but then I polished it up, and now it's here for feedback.
Unfortunately I wasn't able to implement postponed eviction as I had hoped, but I'm pretty satisfied with the immediate eviction I settled on and with the ease of use of the sample cache.
Features:
- A `SampleBuffer` cache is built into `SampleLoader`, and the choice of whether to use sample caching is made when calling the sample loader. By default, caching is on.
- `std::shared_ptr<const SampleBuffer>` is used to do this.
- The `SampleBuffer` class enforces its creation through either the provided `create()` methods or through the `SampleLoader`.
There was a discussion on Discord about the nuances of how samples should be identified, hashed, invalidated, what should happen upon filesystem changes, how different samples with the same name on disk should be treated, project files, project bundles, etc.
This PR does not address any of that. It just keeps what is hopefully exactly the same behavior we have currently, only now with massive improvements to memory usage, along with faster load times for projects that use a lot of samples. The work of improving LMMS's behavior for all the different edge cases we can imagine could still happen in the future, but I don't think it has to happen here.