
Memory<T> and large memory mapped files #24805

@hexawyz

Description

I'm currently experimenting with OwnedMemory<T> and Memory<T> in an existing project that I'm trying to improve, and I ran into an issue with OwnedMemory<T> and Memory<T> being limited to int.MaxValue elements.

Scenario

I have a relatively big (> 2GB) data file that I want to map fully into memory (e.g. a database file). My API exposes methods that return subsets of this big memory-mapped file, e.g.

public ReadOnlyMemory<byte> GetBytes(int something)
{
    // …
    // Memory<byte> converts implicitly to ReadOnlyMemory<byte>.
    return mainMemory.Slice(start, length);
}

Wrapping the MemoryMappedFile and its associated MemoryMappedViewAccessor in an OwnedMemory<byte> seemed like a good idea, since most of the tricky logic would then be handled by the framework.
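
For reference, here is a minimal sketch of such a wrapper, written against the shipped MemoryManager<T> type (the successor of OwnedMemory<T>). The class name is illustrative, and it assumes the view fits in int.MaxValue bytes, which is exactly the limit at issue here:

using System;
using System.Buffers;
using System.IO.MemoryMappedFiles;

// Hypothetical wrapper: exposes a MemoryMappedViewAccessor as Memory<byte>.
// Only works for views of at most int.MaxValue bytes.
public sealed unsafe class MemoryMappedMemoryManager : MemoryManager<byte>
{
    private readonly MemoryMappedViewAccessor _accessor;
    private readonly byte* _pointer;
    private readonly int _length;

    public MemoryMappedMemoryManager(MemoryMappedViewAccessor accessor, int length)
    {
        _accessor = accessor;
        _length = length;
        byte* pointer = null;
        // Acquire a stable pointer to the mapped view for the lifetime of this manager.
        accessor.SafeMemoryMappedViewHandle.AcquirePointer(ref pointer);
        _pointer = pointer;
    }

    public override Span<byte> GetSpan() => new Span<byte>(_pointer, _length);

    // The mapping never moves, so pinning only needs to hand out the pointer.
    public override MemoryHandle Pin(int elementIndex = 0) => new MemoryHandle(_pointer + elementIndex);

    public override void Unpin() { }

    protected override void Dispose(bool disposing)
    {
        _accessor.SafeMemoryMappedViewHandle.ReleasePointer();
        if (disposing) _accessor.Dispose();
    }
}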

Problem

The memory block that I want to wrap is bigger than 2GB and cannot currently be represented by a single Memory<T> instance.
Since Memory<T> can only wrap a T[], a string, or an OwnedMemory<T>, it seems that having to give up on the straightforward OwnedMemory<T> implementation also means having to give up on Memory<T> entirely.

(In this specific case, Span<T> being limited to 2GB would not be a problem, because the sliced memory blocks my API returns would always be much smaller than that.)

Possible solutions with the currently proposed API

  • Not using Memory<T> at all, and implementing a much-simplified version of OwnedMemory<T>/Memory<T> that fits my use case
  • Keeping many overlapping instances of OwnedMemory<T> around and using the one that best fits the current request (see the sketch after this list)
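
To illustrate the second workaround, here is a hedged sketch that covers the file with overlapping windows of at most int.MaxValue bytes and picks whichever window fully contains a requested range. WindowSize, Overlap, and all type and member names are illustrative, and a requested slice must be small enough to fit inside the overlap:

using System;
using System.Collections.Generic;
using System.IO;
using System.IO.MemoryMappedFiles;

// Hypothetical helper: covers a > 2GB file with overlapping windows so that any
// sufficiently small range [start, start + length) lies entirely inside one window.
public sealed class WindowedMapping : IDisposable
{
    private const long WindowSize = int.MaxValue;    // the most a single Memory<byte> could cover
    private const long Overlap = 256L * 1024 * 1024; // requested slices must fit within this overlap

    private readonly MemoryMappedFile _file;
    private readonly List<(long Offset, MemoryMappedViewAccessor View)> _windows =
        new List<(long, MemoryMappedViewAccessor)>();

    public WindowedMapping(string path)
    {
        long fileLength = new FileInfo(path).Length;
        _file = MemoryMappedFile.CreateFromFile(path, FileMode.Open);
        // Step by (WindowSize - Overlap) so that consecutive windows overlap.
        for (long offset = 0; offset < fileLength; offset += WindowSize - Overlap)
        {
            long size = Math.Min(WindowSize, fileLength - offset);
            _windows.Add((offset, _file.CreateViewAccessor(offset, size, MemoryMappedFileAccess.Read)));
        }
    }

    // Returns a window fully containing the range, plus the range's offset within it.
    public (MemoryMappedViewAccessor View, int RelativeStart) Locate(long start, int length)
    {
        foreach (var (offset, view) in _windows)
        {
            if (start >= offset && start + length <= offset + view.Capacity)
                return (view, (int)(start - offset));
        }
        throw new ArgumentOutOfRangeException(nameof(start), "Range does not fit in any window.");
    }

    public void Dispose()
    {
        foreach (var (_, view) in _windows)
            view.Dispose();
        _file.Dispose();
    }
}

Each window could then be wrapped in its own OwnedMemory<byte>, at the cost of keeping several accessors alive and routing every request through a lookup like Locate above.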

Question

Would it be possible to improve the framework so that such large memory blocks can be worked with easily? (Maybe by implementing something like a BigMemory<T>?)
