Fix playlist import race condition on large libraries (#127)Import playlists issue #192
Add this suggestion to a batch that can be applied as a single commit.
This suggestion is invalid because no changes were made to the code.
Suggestions cannot be applied while the pull request is closed.
Suggestions cannot be applied while viewing a subset of changes.
Only one suggestion per line can be applied in a batch.
Add this suggestion to a batch that can be applied as a single commit.
Applying suggestions on deleted lines is not supported.
You must change the existing code in this line in order to create a valid suggestion.
Outdated suggestions cannot be applied.
This suggestion has been applied or marked resolved.
Suggestions cannot be applied from pending reviews.
Suggestions cannot be applied on multi-line comments.
Suggestions cannot be applied while the pull request is queued to merge.
Suggestion cannot be applied right now. Please check back later.
Summary
Fixes #127 - Playlist imports were failing on large libraries due to a race condition between database writes and StateFlow cache updates.
This PR implements a clean architectural solution by ensuring write operations fully complete (including cache synchronization) before returning.
Root Cause
When importing large music libraries (7000+ songs):
StateFlowcache inLocalSongRepositoryupdated asynchronously via Room'sInvalidationTracker.firstOrNull()on the StateFlowSolution
Made write operations synchronous with cache updates:
insertUpdateAndDelete()inLocalSongRepositoryto wait for the StateFlow cache to synchronize before returningsongsRelay.first { it != null }to suspend until the cache reflects database changesWhy this approach:
Testing
The fix ensures that:
insertUpdateAndDelete()returns, the StateFlow cache is synchronizedgetSongs().firstOrNull()return fresh dataChanges
LocalSongRepository.kt:71-84
insertUpdateAndDelete()No changes to:
SongRepositoryinterface (maintains clean abstraction)MediaImporterlogic (no workarounds needed)