Object Pooling

James Courtney edited this page Nov 7, 2024 · 5 revisions

!!Deprecated!!

Object Pools have been removed from FlatSharp starting in version 7.8. It's possible the feature will be reintroduced in the future, but currently there are no plans. This page is preserved for historical reasons.

Object Pooling is an experimental feature in FlatSharp 7. The intent of Object Pooling is to reduce the number of allocations in environments that are sensitive to Garbage Collection. Object Pooling is opt-in, meaning it has to be specified when the FlatSharp compiler is invoked, and it affects the entire schema.

Object Pooling...

  • Is an advanced feature. If you don't know if you need it, you probably don't need it.
  • Is effective at reducing allocations.
  • Is generally slower than allocate-and-forget (in .NET 7). Sometimes much slower.
  • Requires users to be much more careful about tracking object references. It introduces an entire class of errors, such as use-after-free. Thread safety may also be a concern.
  • Only works on the FlatSharp deserialize paths (i.e., anything that FlatSharp allocates for you). If you create an object of your own to serialize, you are still responsible for pooling it yourself.
  • Is not pluggable or extensible, for performance reasons. FlatSharp comes with a built-in object pool based on ConcurrentQueue<T>.
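The built-in pool is not extensible, but its behavior can be pictured roughly as follows. This is a simplified sketch, not FlatSharp's actual implementation; the `SketchPool` type and its members are hypothetical and exist only to illustrate the ConcurrentQueue<T> idea.

```csharp
using System.Collections.Concurrent;

// A simplified, hypothetical sketch of a ConcurrentQueue<T>-based object pool.
// FlatSharp's real pool is internal and not pluggable; this only illustrates the idea.
public static class SketchPool<T> where T : class, new()
{
    private static readonly ConcurrentQueue<T> queue = new();

    // Soft limit on retained objects per type (mirrors FlatSharp.ObjectPool.MaxToRetain).
    public static int MaxToRetain { get; set; } = 100;

    public static T Rent()
    {
        // Reuse a pooled instance if one is available; otherwise allocate.
        return queue.TryDequeue(out T? item) ? item : new T();
    }

    public static void Return(T item)
    {
        // Soft limit: Count and Enqueue are not atomic together, so the pool
        // may briefly exceed MaxToRetain under concurrency.
        if (queue.Count < MaxToRetain)
        {
            queue.Enqueue(item);
        }
    }
}
```

Note that a real pool must also reset or re-initialize pooled objects, which is the part of FlatSharp's implementation that makes use-after-return dangerous.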

How to Enable

From CSProj File: Add <FlatSharpPoolable>true</FlatSharpPoolable> to your property group.

From FlatSharp.Compiler: Use the --gen-poolable true option.
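For the csproj route, the property goes in an ordinary property group. A minimal fragment (assuming the rest of the FlatSharp compiler package setup is already in place):

```xml
<!-- Enables generation of IPoolableObject implementations for the entire schema. -->
<PropertyGroup>
  <FlatSharpPoolable>true</FlatSharpPoolable>
</PropertyGroup>
```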

Changes to Generated Code

Object Pooling adds a new interface to most generated objects: IPoolableObject.

// An object that can be returned to a pool.
public interface IPoolableObject
{
    // Returns this object to its pool.
    void ReturnToPool(bool unsafeForce = false);
}

FlatSharp implements IPoolableObject on all deserialized types. The implementation of ReturnToPool is slightly different depending upon the deserialization mode:

| Mode          | Root Node Behavior              | Child Node Behavior        |
| ------------- | ------------------------------- | -------------------------- |
| Lazy          | Object is returned to pool      | Object is returned to pool |
| Progressive   | Object tree is returned to pool | No-op                      |
| Greedy        | Object tree is returned to pool | No-op                      |
| GreedyMutable | Object tree is returned to pool | No-op                      |

In Lazy mode, objects must be returned to the pool as you finish using them. This is because Lazy reconstitutes objects each time they are accessed, so there are no internal links. In other "non-Lazy" modes, FlatSharp maintains Parent --> Child pointers so that nodes do not need to be re-parsed each time. This means that only a single call from the Root node is necessary to return the entire object tree to the pool.

Unions are also implemented differently with Object Pooling enabled. FlatSharp has traditionally used C# structs (value types) for Unions to avoid extra allocations. However, when Pooling is enabled, Unions are generated as reference types that implement IPoolableObject.

FlatSharp's vector classes also implement IPoolableObject. However, since the static type of a vector is IList<T>, which cannot be modified to require IPoolableObject, you must test for the interface at runtime.
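In practice this means a runtime type test, sketched here (`table.Items` stands in for any deserialized vector):

```csharp
// The static type of the vector is IList<Item>, but the deserialized instance
// also implements IPoolableObject, so a pattern match (or cast) is required.
if (table.Items is IPoolableObject poolableVector)
{
    poolableVector.ReturnToPool();
}
```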

Sample Code

table Item
{
    A : int;
}

table MyTable (fs_serializer)
{
    Items : [ Item ];
}

// Configure the size of the object pool. Note that this is a soft limit, so FlatSharp
// may exceed it from time to time. The limit applies to each distinct type of object
// returned to the pool.
FlatSharp.ObjectPool.MaxToRetain = 100;

// Lazy requires all objects to be returned individually. Note that this code is *correct* for other modes as well
// since all but the last call to ReturnToPool will be no-ops in non-lazy modes.
{
   var table = MyTable.Serializer.Parse(buffer, FlatBufferDeserializationOption.Lazy);
   var items = table.Items;
   int sum = 0;
   foreach (var item in items)
   {
       sum += item.A;
       item.ReturnToPool();
   }
   
   (items as IPoolableObject)?.ReturnToPool();
   table.ReturnToPool();
}

// Non-Lazy Modes are easier:
{
   var table = MyTable.Serializer.Parse(buffer, FlatBufferDeserializationOption.Progressive);
   var items = table.Items;
   int sum = 0;
   foreach (var item in items)
   {
       sum += item.A;
   }

   table.ReturnToPool();
}

Considerations

Object Pools require more care from the programmer than FlatSharp's default usage. FlatSharp does guarantee that the same object is not added to the pool more than once at the same time. However, the auto-generated implementations of IPoolableObject are not thread safe. Here are some potential bugs:

  • Force Return: The IPoolableObject interface accepts an optional boolean parameter (unsafeForce). User code should never set this parameter. It is reserved for FlatSharp's internal usage. Any use of ReturnToPool(true) may leave your deserialized objects in an undefined state.

  • Double-Return: While this won't result in the object being placed in the pool twice, once the object has been returned to the pool, it is immediately eligible to be used by a different deserialization operation (possibly on a different thread). So performing a second call to .ReturnToPool might lead to unexpected results on an entirely different thread. It could also be a harmless no-op depending upon the timing and the size of the object pool.

  • Use-After-Return: After an object is returned to the Object Pool, its internal fields are reset to their default values. It stays in this state while in the pool. Those fields are re-initialized when the object is retrieved from the pool. Depending upon the timing, use-after-return could result in incorrect data (since that object now points to a completely different FlatBuffer!) or simply an exception.

These failure modes will be familiar to any C++ developer, but may be unfamiliar to developers who primarily work with C#.
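As an illustration, a use-after-return bug might look like the following sketch, which reuses the sample schema above:

```csharp
var table = MyTable.Serializer.Parse(buffer, FlatBufferDeserializationOption.Lazy);
var item = table.Items[0];

int a = item.A;      // OK: the object is still live.
item.ReturnToPool(); // Object goes back to the pool; its internal fields are reset.

// BUG: use-after-return. Depending on timing, this may read default values,
// read data from a completely different FlatBuffer (if another deserialization
// operation has already rented the object), or throw an exception.
int b = item.A;
```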

Support Policy

Given the potential for user error and the experimental nature of Object Pools, they are supported on a best-effort basis. If you require support, be prepared to distill the reproduction of the issue down to a minimal amount of code. Further, Object Pools may be substantially altered or removed from future versions of FlatSharp, depending upon feedback and effectiveness.