[BUG] Intermittent issue - Single process app getting stuck with no exception in Linq based query in Shared Engine Read-only connection #2204

Open
ghost opened this issue Jun 21, 2022 · 0 comments
ghost commented Jun 21, 2022

Version
LiteDB 5.0.12, .NET 4.7.2, Win11 Pro 64bit.

Describe the bug
When running a LINQ Find on ILiteDatabase's FileStorage object, I'm intermittently seeing the call get stuck waiting on another thread. I'm using a read-only connection. Multiple threads access the same file, but the majority of them open it in read-only mode; writes are not very frequent.

Once this happens, killing the app and re-opening it leads to the same hang, and it stays reproducible after that. I'm guessing the file handle might not be released?

The LiteDatabase instance is created with a "using" statement, but the LiteFileInfo objects extracted from FileStorage via LINQ are kept alive by MyDataModel. Is that causing the shared engine to keep a lock and block new threads from accessing the file? Or is there something else wrong in the way I'm accessing LiteDB? Given that this call is synchronous, is there a way to add a timeout so a thread isn't stuck indefinitely? ILiteDatabase has a Timeout property, but according to the docs it applies to transactions, so I'm not sure it would help here.
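For reference, this is roughly how I would try to pass a timeout through the connection string if it applied to this wait (same names as in the repro below). Timeout is a documented ConnectionString property, but I haven't verified that it affects the shared-engine wait shown in the dump, so treat this as a sketch:

// Sketch only: ConnectionString.Timeout is documented for transactions;
// whether it applies to the shared-engine wait is an open question.
var connectionString = new ConnectionString
{
    Filename = MyCacheFileName,
    Connection = ConnectionType.Shared,
    ReadOnly = true,
    Timeout = TimeSpan.FromSeconds(10),
};

using var liteDb = new LiteDatabase(connectionString, bsonMapper);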

I found a couple of similar, but not identical, issues: #1839 and #1538. I'm not sure whether they share the same underlying cause.

Code to Reproduce

// Connection string keys and values are case-insensitive.
const string MyCacheFileName = "...";

public IEnumerable<MyDataModel> GetCache(string filter, string expectedValue)
{
    var bsonMapper = ....;
    using ILiteDatabase liteDb = new LiteDatabase(
        new ConnectionString
        {
            Filename = MyCacheFileName,
            Connection = ConnectionType.Shared,
            ReadOnly = true,
        }, bsonMapper);

    if (liteDb == null)
    {
        return Enumerable.Empty<MyDataModel>();
    }

    IEnumerable<MyDataModel> entries = liteDb.FileStorage?
        .Find(f => f.Metadata["xyz"] == filter) // ************************* STUCK HERE *******************
        .Where(f => ....)
        .Select(f => new MyDataModel(f));

    return entries;
}

// Caller
var cachedValues = GetCache(filter: "filter1", expectedValue: "val1");
if (!cachedValues.Any())
{
    cachedValues = GetCache(filter: "filter2", expectedValue: "val1");
}

// MyDataModel
public class MyDataModel
{
    public MyDataModel(LiteFileInfo<string> fileInfo)
    {
        this.FileInfo = fileInfo;
    }

    public LiteFileInfo<string> FileInfo { get; }
}
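Side note on the code above: the query is lazy, so the Find/Where/Select actually execute after GetCache returns, i.e. after the using block has disposed the database. In case that contributes to the problem, this is the variant I plan to test, which materializes the results before disposal (a sketch only; whether it changes anything is an assumption):

// Sketch: force the query to run inside the using block so nothing is
// enumerated after the LiteDatabase instance has been disposed.
public IEnumerable<MyDataModel> GetCacheEager(string filter, string expectedValue)
{
    var bsonMapper = ....;
    using var liteDb = new LiteDatabase(
        new ConnectionString
        {
            Filename = MyCacheFileName,
            Connection = ConnectionType.Shared,
            ReadOnly = true,
        }, bsonMapper);

    return liteDb.FileStorage
        .Find(f => f.Metadata["xyz"] == filter)
        .Select(f => new MyDataModel(f))
        .ToList(); // ToList() runs the query before Dispose()
}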

Expected behavior
The query doesn't get stuck and control returns to the caller, which is what happens whenever the bug doesn't occur.

Screenshots/Stacktrace
I found an interesting exception that I think may be causing this. It looks like LiteDB fails while disposing the read-only database because the WAL checkpoint tries to write to a stream that was opened without write access. This looks like the same problem as #2200.

I still don't know why it happens only sometimes rather than every time, though.

Exception: Stream does not support writing.;    
at System.IO.FileStream.WriteSpan(ReadOnlySpan`1) + 0x4b9
at System.IO.FileStream.Write(Byte[], Int32, Int32) + 0x13a     
at LiteDB.Engine.DiskService.Write(IEnumerable`1, FileOrigin) + 0x371     
at LiteDB.Engine.WalIndexService.CheckpointInternal() + 0x406     
at LiteDB.Engine.WalIndexService.TryCheckpoint() + 0x59     
at LiteDB.Engine.LiteEngine.Dispose(Boolean) + 0x4b     
at LiteDB.SharedEngine.Dispose() + 0x1e     
at LiteDB.LiteDatabase.Dispose() + 0x21     
at MyClass.GetCache(String, String) + 0x7a9 

The following stack is from a Windows process dump where the stuck thread is shown waiting:

# RetAddr               : Args to Child                                                           : Call Site
00 00007ffc`aa64cbc0     : 00000000`00000000 00000001`00000000 00000257`65e01c58 00007ffc`64a32c56 : ntdll!ZwWaitForMultipleObjects+0x14 [minkernel\ntdll\daytona\objfre\amd64\usrstubs.asm @ 907] 
01 00007ffc`649b7b5f     : 00000000`00000000 00000000`00000000 00000000`00000000 00000000`ffffffff : KERNELBASE!WaitForMultipleObjectsEx+0xf0 [minkernel\kernelbase\synch.c @ 1551] 
02 00007ffc`649b73a9     : 00000257`66fed740 00007ffc`741a2702 00000257`66fee6c8 00000257`66fd7644 : SharedLibrary!$23___Interop::kernel32_dll.WaitForMultipleObjectsEx+0x7f
03 (Inline Function)     : --------`-------- --------`-------- --------`-------- --------`-------- : SharedLibrary!Interop::mincore::WaitForMultipleObjectsEx+0x16 [WaitForMultipleObjectsEx @ 16707565] 
04 00007ffc`649b72fa     : 00000257`00000002 00000000`0001a57f 0000000d`65e01001 00007ffc`648d3221 : SharedLibrary!System::Threading::WaitHandle.WaitForMultipleObjectsIgnoringSyncContext+0x99 [f:\dd\ndp\fxcore\CoreRT\src\System.Private.CoreLib\src\System\Threading\WaitHandle.Windows.cs @ 67] 
05 00007ffc`649b71fa     : 00000000`00001710 00000257`66fd7808 00000257`66fcd8f8 00000000`00000001 : SharedLibrary!System::Threading::WaitHandle.WaitForSingleObject+0x7a [f:\dd\ndp\fxcore\CoreRT\src\System.Private.CoreLib\src\System\Threading\WaitHandle.Windows.cs @ 29] 
06 00007ffc`649b7196     : 000000c3`6bfff380 00007ffc`649b8277 00000257`65e51160 00007ffc`16054438 : SharedLibrary!System::Threading::WaitHandle.WaitOneCore+0xa [f:\dd\ndp\fxcore\CoreRT\src\System.Private.CoreLib\src\System\Threading\WaitHandle.Windows.cs @ 106] 
07 00007ffc`64a20a7f     : 00000257`66fd6f30 00007ffc`19594fa8 00000257`66fee500 00007ffc`19594fa8 : SharedLibrary!System::Threading::WaitHandle.WaitOneCore+0x66 [f:\dd\ndp\fxcore\CoreRT\src\System.Private.CoreLib\src\System\Threading\WaitHandle.cs @ 126] 
08 00007ffc`1b6d1394     : 00000257`66fd6f30 00007ffc`19594fa8 00000257`66fcdc88 00007ffc`16054438 : SharedLibrary!System::Threading::WaitHandle.WaitOne+0xf [f:\dd\ndp\fxcore\CoreRT\src\System.Private.CoreLib\src\System\Threading\WaitHandle.cs @ 135] 
09 00007ffc`1b6d20e5     : 00000257`6676a3c8 00000000`00000000 00000257`66fd7878 00000257`66fee5b8 : MyApp!$25_LiteDB::SharedEngine.OpenDatabase+0x44
0a 00007ffc`19c7c6ca     : 00000000`00000004 00007ffc`64b106e9 00000000`00000000 00007ffc`19c7c687 : MyApp!$25_LiteDB::SharedEngine.Query+0x15
0b 00007ffc`19c16914     : 00007ffc`16e6c749 00007ffc`1622acf0 00000257`66fee628 00000257`66fee660 : MyApp!$25_LiteDB::LiteQueryable$1<System::__Canon>.ExecuteReader+0x2a
0c 00007ffc`648c40d8     : 00000257`6676e2f8 00007ffc`648c4227 00000257`665e9780 00000257`66fd7808 : MyApp!$25_LiteDB::LiteQueryable$1<System::__Canon>::<ToDocuments>d__26.MoveNext+0x44
0d 00007ffc`19d3f07f     : 00000257`66fec758 00007ffc`64a18aa9 00007ffc`16ab5bb8 00000257`66fd7808 : SharedLibrary!$4_System::Linq::Enumerable::SelectEnumerableIterator$2<System::__Canon,System::__Canon>.MoveNext+0x68 [C:\A\1\5\s\corefx\src\System.Linq\src\System\Linq\Select.cs @ 135] 
0e 00007ffc`648c36c8     : 00000257`6676e100 00007ffc`648c4227 00000257`66fcdc88 00007ffc`64bed0d4 : MyApp!$25_LiteDB::LiteStorage$1<System::__Canon>::<Find>d__5.MoveNext+0xff
0f 00007ffc`19fee4ee     : 00000257`66fee488 00007ffc`64b106e9 00007ffc`647445d0 00000257`66fee458 : SharedLibrary!$4_System::Linq::Enumerable::WhereSelectEnumerableIterator$2<System::__Canon,System::__Canon>.MoveNext+0x68 [C:\A\1\5\s\corefx\src\System.Linq\src\System\Linq\Where.cs @ 390] 
10 00007ffc`1b6ddd0b     : 00007ffc`176a6d80 00000257`82cb41d8 00007ffc`193fe93c 00007ffc`647445d0 : MyApp!$160_System::Linq::Enumerable.Any<System.__Canon>+0x4e
11 00007ffc`1ad4e0e9     : 0000f574`17b0771d 00000000`00000020 00000000`0000ffff 00000000`00000001 : MyApp!$7_MyCode::MyClass.GetCache+0x6cb [C:\w\117\s\MyApp\.....\MyClass.cs @ 96] 
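The wait is inside SharedEngine.OpenDatabase, so my working theory is that the thread is blocked on whatever cross-process synchronization the shared engine uses to guard the data file (a named mutex, I assume). With a plain wait and no timeout, the thread hangs forever if the current owner never releases. A sketch of the suspected mechanism, not LiteDB's actual code, with an invented mutex name:

using System;
using System.Threading;

// Sketch of the suspected mechanism, not LiteDB internals; the mutex name is invented.
var mutex = new Mutex(initiallyOwned: false, name: @"Global\MyCacheFile.Mutex");

// WaitOne() with no timeout blocks indefinitely while another process or
// thread holds the mutex; WaitOne(timeout) would at least fail loudly.
if (!mutex.WaitOne(TimeSpan.FromSeconds(10)))
{
    throw new TimeoutException("Could not acquire the shared database mutex.");
}

try
{
    // ... access the shared database file ...
}
finally
{
    mutex.ReleaseMutex();
}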

Additional context
N/A

ghost added the bug label Jun 21, 2022