Background and motivation
We have two popular and well-known memory cache libraries in .NET: System.Runtime.Caching and Microsoft.Extensions.Caching.Memory. As already described in this issue, there is room for improvement in their public APIs. There are also efficiency problems that hit at high scale. Our goal was therefore to implement a memory cache that is as efficient as possible for high-concurrency, high-traffic servers. During the process we found a few additional opportunities to do slightly better than the existing libraries:
- Our implementation is fully generic for both keys and values, offering greater convenience and potentially improving memory efficiency by avoiding unnecessary boxing and type conversions.
- Our implementation delivers noticeable performance improvements, with reduced latency and greater efficiency across operations.
- Both System.Runtime.Caching and Microsoft.Extensions.Caching.Memory use mechanisms that may involve background threads or timers for managing expiration. In contrast, our implementation maintains state directly within the set and get methods, avoiding thread pool overhead. Since it doesn't implement IDisposable, there's also no risk of memory leaks from incorrect disposal, unlike System.Runtime.Caching, which allocates timers on construction.
- The API design of the existing libraries can lead to inefficiencies by requiring heap-allocated objects, potentially increasing memory pressure at runtime.
- The existing libraries emit no metrics by default, even though metrics are important for monitoring in distributed systems.
The RCache implementation is based on the open-source BitFaster library.
The benchmark below stores pairs of <int, int> (the best case for RCache, since neither keys nor values are boxed).
The latency includes the optional cost of maintaining telemetry state on the RCache side.
Cache library | Remove | TryGetMiss | TryGetHit | GetOrSet | GetOrSet Dynamic TTL | Set | SetDynamicTTL |
---|---|---|---|---|---|---|---|
RCache | 6.5 ns | 10.0 ns | 11.4 ns | 14.7 ns | 15.6 ns | 28.9 ns | 32.2 ns |
Microsoft.Extensions.Caching.Memory | 59.3 ns | 39.0 ns | 48.4 ns | 52.1 ns | 44.9 ns | 125.5 ns | 130.2 ns |
System.Runtime.Caching | 59 ns | 54.8 ns | 107.0 ns | 175.8 ns | 239.7 ns | 1192.5 ns | 1216.1 ns |
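For context, these numbers come from micro-benchmarks of roughly the following shape. This is only a sketch of the RCache GetOrSet hit path, assuming a BenchmarkDotNet harness and the builder API proposed below; the cache name, options, and pre-populated key are arbitrary.

```csharp
using BenchmarkDotNet.Attributes;
using Microsoft.Extensions.Cache.Memory;

public class GetOrSetHitBenchmark
{
    private RCache<int, int> _cache = null!;

    [GlobalSetup]
    public void Setup()
    {
        // Metrics are disabled here so that Build() does not require a meter factory.
        _cache = new RCacheLruBuilder<int, int>("bench")
            .WithOptions(new RCacheLruOptions<int> { PublishMetrics = false })
            .Build();

        // Pre-populate the entry so the benchmark measures the hit path.
        _cache.Set(42, 42);
    }

    [Benchmark]
    public int GetOrSet() => _cache.GetOrSet(42, static key => key);
}
```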
Feature comparison table:
Feature | RCache | Microsoft.Extensions.Caching.Memory | System.Runtime.Caching |
---|---|---|---|
Time-based eviction | yes | yes | yes |
Sliding time to evict | yes | yes | yes |
Callbacks on eviction | no | yes | yes |
Metrics | yes | no | no |
Named caches | yes | no | no |
Generics support | yes | no | no |
Priority based eviction | yes** | yes | no |
Runtime entry size calculation | no | yes | no |
Dynamic Time To Evict | yes | yes | yes |
Item update notification | no | yes | no |
** The algorithm we use has a notion of three priorities (hot, warm, cold) and respects them while rotating items. However, we don't allow customers to define priorities or give them direct control over eviction priority.
API Proposal
namespace Microsoft.Extensions.Cache.Memory;
/// <summary>
/// A synchronous in-memory object cache.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
public abstract class RCache<TKey, TValue> : IEnumerable<KeyValuePair<TKey, TValue>>
where TKey : notnull
{
/// <summary>
/// Gets the name of the cache instance.
/// </summary>
/// <remarks>
/// This name is used to identify this cache when publishing telemetry.
/// </remarks>
public abstract string Name { get; }
/// <summary>
/// Gets the capacity of the cache, which represents the maximum number of items maintained by the cache at any one time.
/// </summary>
public abstract int Capacity { get; }
/// <summary>
/// Tries to get a value from the cache.
/// </summary>
/// <param name="key">Key identifying the requested value.</param>
/// <param name="value">Value associated with the key or <see langword="null"/> when not found.</param>
/// <returns>
/// <see langword="true"/> when the value was found, <see langword="false"/> otherwise.
/// </returns>
public abstract bool TryGet(TKey key, [MaybeNullWhen(false)] out TValue value);
/// <summary>
/// Sets a value in the cache.
/// </summary>
/// <remarks>
/// The value's time to expire is set to the global default value defined for this cache instance.
/// </remarks>
/// <param name="key">Key identifying the value.</param>
/// <param name="value">Value to associate with the key.</param>
public abstract void Set(TKey key, TValue value);
/// <summary>
/// Sets a value in the cache.
/// </summary>
/// <remarks>
/// After the time to expire has passed, the value is no longer retrievable from the cache.
/// At the same time, the cache might keep a reference to it for some time.
/// </remarks>
/// <param name="key">Key identifying the value.</param>
/// <param name="value">Value to cache on the heap.</param>
/// <param name="timeToExpire">Amount of time the value is valid, after which it should be removed from the cache.</param>
/// <exception cref="ArgumentOutOfRangeException">If <paramref name="timeToExpire"/> is less than 1 millisecond.</exception>
public abstract void Set(TKey key, TValue value, TimeSpan timeToExpire);
/// <summary>
/// Gets a value from the cache, or sets it if it doesn't exist.
/// </summary>
/// <remarks>
/// The value's time to expire is set to the global default value defined for this cache instance.
/// </remarks>
/// <typeparam name="TState">Type of the state passed to the function.</typeparam>
/// <param name="key">Key identifying the value.</param>
/// <param name="state">State passed to the factory function.</param>
/// <param name="factory">A function used to create a new value if the key is not found in the cache. It returns the value to be cached.</param>
/// <returns>
/// Data retrieved from the cache or created by the passed factory function.
/// </returns>
public abstract TValue GetOrSet<TState>(TKey key, TState state, Func<TKey, TState, TValue> factory);
/// <summary>
/// Gets a value from the cache, or sets it if it doesn't exist.
/// </summary>
/// <remarks>
/// After the time to expire has passed, the value is no longer retrievable from the cache.
/// At the same time, the cache might keep a reference to it for some time.
/// </remarks>
/// <typeparam name="TState">The type of the state object passed to the factory function.</typeparam>
/// <param name="key">The key identifying the cached value.</param>
/// <param name="state">An additional state object passed to the factory function.</param>
/// <param name="factory">
/// A function used to create a new value if the key is not found in the cache. The function returns a tuple where the
/// first item is the value to cache, and the second item is a <see cref="TimeSpan"/> representing the duration for which
/// the value remains valid in the cache before expiring.
/// </param>
/// <returns>
/// The value associated with the key, either retrieved from the cache or created by the factory function.
/// </returns>
public abstract TValue GetOrSet<TState>(TKey key, TState state,
Func<TKey, TState, (TValue value, TimeSpan timeToExpire)> factory);
/// <summary>
/// Gets a value from the cache, or sets it if it doesn't exist.
/// </summary>
/// <remarks>
/// The value's time to expire is set to the global default value defined for this cache instance.
/// </remarks>
/// <param name="key">Key identifying the value.</param>
/// <param name="factory">A function used to create a new value if the key is not found in the cache. It returns the value to be cached.</param>
/// <returns>
/// Data retrieved from the cache or created by the passed factory function.
/// </returns>
public abstract TValue GetOrSet(TKey key, Func<TKey, TValue> factory);
/// <summary>
/// Gets a value from the cache, or sets it if it doesn't exist.
/// </summary>
/// <remarks>
/// After the time to expire has passed, the value is no longer retrievable from the cache.
/// At the same time, the cache might keep a reference to it for some time.
/// </remarks>
/// <param name="key">Key identifying the value.</param>
/// <param name="factory">
/// A function used to create a new value if the key is not found in the cache. The function returns a tuple where the
/// first item is the value to cache, and the second item is a <see cref="TimeSpan"/> representing the duration for which
/// the value remains valid in the cache before expiring.
/// </param>
/// <returns>
/// Data retrieved from the cache or created by the passed factory function.
/// </returns>
public abstract TValue GetOrSet(TKey key, Func<TKey, (TValue value, TimeSpan timeToExpire)> factory);
/// <summary>
/// Gets the current number of items in the cache.
/// </summary>
/// <returns>
/// The current number of items in the cache.
/// </returns>
/// <remarks>
/// This method is inherently imprecise as threads may be asynchronously adding
/// or removing items. In addition, this method may be relatively costly, so avoid
/// calling it in hot paths.
/// </remarks>
public abstract int GetCount();
/// <summary>
/// Attempts to remove the value that has the specified key.
/// </summary>
/// <param name="key">Key identifying the value to remove.</param>
/// <returns>
/// <see langword="true"/> if the value was removed, and <see langword="false"/> if the key was not found.
/// </returns>
public abstract bool Remove(TKey key);
/// <summary>
/// Attempts to remove the specified key-value pair from the cache.
/// </summary>
/// <param name="key">Key identifying the value to remove.</param>
/// <param name="value">The value associated with the key to remove.</param>
/// <returns>
/// <see langword="true"/> if the specified key-value pair was found and removed;
/// otherwise, <see langword="false"/>.
/// </returns>
/// <remarks>
/// This method checks both the key and the associated value for a match before removal.
/// If the specified key exists but is associated with a different value, the cache remains unchanged.
/// </remarks>
public abstract bool Remove(TKey key, TValue value);
/// <summary>
/// Removes all expired entries from the cache.
/// </summary>
/// <remarks>
/// Some implementations perform lazy cleanup of cache resources. This call is a hint to
/// ask the cache to try and synchronously do some cleanup.
/// </remarks>
public abstract void RemoveExpiredEntries();
/// <summary>
/// Returns an enumerator that iterates through the items in the cache.
/// </summary>
/// <returns>
/// An enumerator for the cache.
/// </returns>
/// <remarks>
/// This can be a slow API and is intended for use in debugging and diagnostics; avoid using it in production scenarios.
///
/// The enumerator returned from the cache is safe to use concurrently with
/// reads and writes; however, it does not represent a moment-in-time snapshot.
/// The contents exposed through the enumerator may contain modifications
/// made after <see cref="GetEnumerator"/> was called.
/// </remarks>
public abstract IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator();
/// <summary>
/// Returns an enumerator that iterates through the items in the cache.
/// </summary>
/// <returns>
/// An enumerator for the cache.
/// </returns>
/// <remarks>
/// This can be a slow API and is intended for use in debugging and diagnostics; avoid using it in production scenarios.
///
/// The enumerator returned from the cache is safe to use concurrently with
/// reads and writes; however, it does not represent a moment-in-time snapshot.
/// The contents exposed through the enumerator may contain modifications
/// made after <see cref="GetEnumerator"/> was called.
/// </remarks>
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
/// <summary>
/// Options for LRU (Least Recently Used) implementations of <see cref="RCache{TKey, TValue}"/>.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
public class RCacheLruOptions<TKey>
where TKey : notnull
{
/// <summary>
/// Gets or sets the maximum number of items that can be stored in the cache.
/// </summary>
/// <value>
/// Defaults to 1024.
/// </value>
[Range(3, int.MaxValue - 1)]
public int Capacity { get; set; } = 1024;
/// <summary>
/// Gets or sets the default time to evict individual items from the cache.
/// </summary>
/// <remarks>
/// This value is used by methods which do not accept an explicit time to evict parameter.
/// If you never want items to be evicted based on time, set this value to <see cref="TimeSpan.MaxValue"/>.
/// </remarks>
/// <value>
/// Defaults to 5 minutes.
/// </value>
[TimeSpan(minMs: 1)]
public TimeSpan DefaultTimeToEvict { get; set; } = TimeSpan.FromMinutes(5);
/// <summary>
/// Gets or sets the amount of time by which an item's eviction time is extended upon a cache hit.
/// </summary>
/// <remarks>
/// This value is ignored when <see cref="ExtendTimeToEvictAfterHit"/> is <see langword="false"/>.
/// </remarks>
/// <value>
/// Defaults to 5 minutes.
/// </value>
[TimeSpan(minMs: 1)]
public TimeSpan ExtendedTimeToEvictAfterHit { get; set; } = TimeSpan.FromMinutes(5);
/// <summary>
/// Gets or sets a value indicating whether an item's time to evict should be extended upon a cache hit.
/// </summary>
/// <value>
/// Defaults to <see langword="false"/>.
/// </value>
public bool ExtendTimeToEvictAfterHit { get; set; }
/// <summary>
/// Gets or sets the cache's level of concurrency.
/// </summary>
/// <remarks>
/// Increase this value if you observe lock contention.
/// </remarks>
/// <value>
/// Defaults to <see cref="Environment.ProcessorCount"/>.
/// </value>
[Range(1, int.MaxValue)]
public int ConcurrencyLevel { get; set; } = Environment.ProcessorCount;
/// <summary>
/// Gets or sets the custom time provider used for timestamp generation in the cache.
/// </summary>
/// <remarks>
/// If this value is <see langword="null"/>, the cache will default to using <see cref="Environment.TickCount64"/>
/// for timestamp generation to optimize performance. If a <see cref="TimeProvider"/> is set,
/// the cache will call the <see cref="TimeProvider.GetTimestamp"/> method.
/// The <see cref="TimeProvider"/> should primarily be used for testing purposes, where custom time manipulation is required.
/// </remarks>
/// <value>
/// Defaults to <see langword="null"/>.
/// </value>
public TimeProvider? TimeProvider { get; set; }
/// <summary>
/// Gets or sets the comparer used to evaluate keys.
/// </summary>
/// <value>
/// Defaults to <see cref="EqualityComparer{T}.Default"/>,
/// except for string keys where the default is <see cref="StringComparer.Ordinal"/>.
/// </value>
public IEqualityComparer<TKey> KeyComparer { get; set; }
= typeof(TKey) == typeof(string) ? (IEqualityComparer<TKey>)StringComparer.Ordinal : EqualityComparer<TKey>.Default;
/// <summary>
/// Gets or sets a value indicating how often cache metrics are refreshed.
/// </summary>
/// <remarks>
/// Setting this value too low can lead to poor performance due to the overhead involved in
/// collecting and publishing metrics.
/// </remarks>
/// <value>
/// Defaults to 30 seconds.
/// </value>
[TimeSpan(min: "00:00:05")]
public TimeSpan MetricPublicationInterval { get; set; } = TimeSpan.FromSeconds(30);
/// <summary>
/// Gets or sets a value indicating whether metrics are published or not.
/// </summary>
/// <value>
/// Defaults to <see langword="true"/>.
/// </value>
public bool PublishMetrics { get; set; } = true;
}
/// <summary>
/// Builder for creating <see cref="RCache{TKey, TValue}"/> instances.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
public class RCacheLruBuilder<TKey, TValue>
where TKey : notnull
{
/// <summary>
/// Initializes a new instance of the <see cref="RCacheLruBuilder{TKey,TValue}"/> class.
/// </summary>
/// <param name="name">Name of the cache, used in telemetry.</param>
/// <exception cref="ArgumentNullException">Thrown when the <paramref name="name"/> is null.</exception>
public RCacheLruBuilder(string name);
/// <summary>
/// Sets the options for the cache.
/// </summary>
/// <param name="options">Cache options.</param>
/// <returns>The current instance of the <see cref="RCacheLruBuilder{TKey,TValue}"/>.</returns>
/// <exception cref="ArgumentNullException">Thrown when the <paramref name="options"/> is null.</exception>
public RCacheLruBuilder<TKey, TValue> WithOptions(RCacheLruOptions<TKey> options);
/// <summary>
/// Sets the meter factory for the cache.
/// </summary>
/// <param name="meterFactory">Meter factory for telemetry.</param>
/// <returns>The current instance of the <see cref="RCacheLruBuilder{TKey,TValue}"/>.</returns>
public RCacheLruBuilder<TKey, TValue> WithMeterFactory(IMeterFactory? meterFactory);
/// <summary>
/// Builds the <see cref="RCache{TKey, TValue}"/> instance with the specified configurations.
/// </summary>
/// <returns>A ready-to-use <see cref="RCache{TKey, TValue}"/> instance.</returns>
/// <exception cref="OptionsValidationException">Thrown when the validation of options fails.</exception>
/// <exception cref="InvalidOperationException">Thrown when the meter factory is null but metrics publishing is enabled.</exception>
public RCache<TKey, TValue> Build();
}
/// <summary>
/// Extension methods for caching.
/// </summary>
public static class RCacheExtensions
{
/// <summary>
/// Adds an LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> is <see langword="null"/>.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services)
where TKey : notnull;
/// <summary>
/// Adds a named LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <param name="name">Name of the cache, used in telemetry.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> is <see langword="null"/>.</exception>
/// <exception cref="ArgumentException">When <paramref name="name"/> is <see langword="null"/> or empty.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services, string name)
where TKey : notnull;
/// <summary>
/// Adds an LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <param name="configure">A function used to configure cache options.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> or <paramref name="configure"/> is <see langword="null"/>.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services, Action<RCacheLruOptions<TKey>> configure)
where TKey : notnull;
/// <summary>
/// Adds an LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <param name="section">Configuration part that defines cache options.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> or <paramref name="section"/> is <see langword="null"/>.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services, IConfigurationSection section)
where TKey : notnull;
/// <summary>
/// Adds a named LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <param name="name">Name of the cache, used in telemetry.</param>
/// <param name="section">Configuration part that defines cache options.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> or <paramref name="section"/> is <see langword="null"/>.</exception>
/// <exception cref="ArgumentException">When <paramref name="name"/> is <see langword="null"/> or empty.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services, string name,
IConfigurationSection section)
where TKey : notnull;
/// <summary>
/// Adds a named LRU (Least Recently Used) cache to the dependency injection container.
/// </summary>
/// <typeparam name="TKey">Type of keys stored in the cache.</typeparam>
/// <typeparam name="TValue">Type of values stored in the cache.</typeparam>
/// <param name="services">Dependency injection container to add the cache to.</param>
/// <param name="name">Name of the cache, used in telemetry.</param>
/// <param name="configure">A function used to configure cache options.</param>
/// <returns>The value of <paramref name="services"/>.</returns>
/// <exception cref="ArgumentNullException">When <paramref name="services"/> or <paramref name="configure"/> is <see langword="null"/>.</exception>
/// <exception cref="ArgumentException">When <paramref name="name"/> is <see langword="null"/> or empty.</exception>
public static IServiceCollection AddRCacheLru<TKey, TValue>(this IServiceCollection services, string name,
Action<RCacheLruOptions<TKey>> configure)
where TKey : notnull;
}
/// <summary>
/// Metrics published by <see cref="RCache{TKey, TValue}"/>.
/// </summary>
public static class RCacheMetrics
{
/// <summary>
/// The name of the <see cref="Meter"/> to listen to for cache metrics.
/// </summary>
/// <example>
/// RCache with name "test" will publish metrics with tag "cache-name" equal to "test".
/// </example>
public const string LruCacheMeterName = "Microsoft.Extensions.Cache.Memory";
/// <summary>
/// Gets the total number of cache queries that were successful.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Hits = "rcache.hits";
/// <summary>
/// Gets the total number of unsuccessful cache queries.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Misses = "rcache.misses";
/// <summary>
/// Gets the total number of expired values.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// Expired values are ones removed from the cache because they were too old.
/// </remarks>
public const string Expirations = "rcache.expirations";
/// <summary>
/// Gets the total number of values added to cache.
/// </summary>
/// <remarks>
/// This value refers to the total number of calls to the set methods and does not include updates.
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Adds = "rcache.adds";
/// <summary>
/// Gets the total number of cache removals.
/// </summary>
/// <remarks>
/// This value refers to the total number of calls to the remove methods and does not include evictions.
/// If you are interested in the total number of values removed from the cache, add <see cref="Evictions"/> and <see cref="Expirations"/> to this value.
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Removals = "rcache.removals";
/// <summary>
/// Gets the total number of evicted values.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// Evicted values are those removed based on the implementation's eviction policy and not time expiration or intentional removal.
/// </remarks>
public const string Evictions = "rcache.evictions";
/// <summary>
/// Gets the total number of updated values.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableCounter{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Updates = "rcache.updates";
/// <summary>
/// Gets the total number of values in the cache over time.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="ObservableGauge{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Count = "rcache.entries";
/// <summary>
/// Gets the number of elements compacted in the cache.
/// </summary>
/// <remarks>
/// Metric is exposed as <see cref="Histogram{T}"/> with value being <see cref="long"/>.
/// </remarks>
public const string Compacted = "rcache.compacted_entries";
}
API Usage
Add a default cache and a named cache to DI.
IServiceCollection services;
services.AddRCacheLru<Guid, User>();
services.AddRCacheLru<Guid, User>("different", options => options.PublishMetrics = false);
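Options can also be bound from configuration through the IConfigurationSection overload. A sketch, assuming the usual options-binding convention where JSON property names match the RCacheLruOptions properties; the section name and values are made up:

```csharp
IConfiguration configuration;

// Hypothetical appsettings.json section:
// "UserCache": { "Capacity": 4096, "DefaultTimeToEvict": "00:10:00" }
services.AddRCacheLru<Guid, User>(configuration.GetSection("UserCache"));
```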
Get the default cache from DI.
namespace Example;
using Microsoft.Extensions.Cache.Memory;
public class UserService
{
private readonly RCache<Guid, User> _users;
public UserService(RCache<Guid, User> users)
{
_users = users;
}
}
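Use the stateful GetOrSet overloads. This is only a sketch; IUserStore and LoadUser are hypothetical. Passing the store as state lets the factory lambda stay static, so no closure is captured per call:

```csharp
public User GetUser(Guid id, IUserStore store)
{
    // The cached value is returned when present; otherwise the factory runs and its
    // result is cached with the cache's default time to expire.
    return _users.GetOrSet(id, store, static (key, s) => s.LoadUser(key));
}

public User GetUserWithCustomTtl(Guid id, IUserStore store)
{
    // The tuple-returning overload lets the factory choose the time to expire per value.
    return _users.GetOrSet(id, store, static (key, s) => (s.LoadUser(key), TimeSpan.FromMinutes(1)));
}
```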
Create cache using builder.
namespace Example;
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Cache.Memory;
public class UserService
{
private readonly RCache<Guid, User> _users;
public UserService(IMeterFactory meterFactory)
{
_users = new RCacheLruBuilder<Guid, User>("my-cache-name")
.WithMeterFactory(meterFactory)
.Build();
}
}
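Listen to cache metrics. A sketch using System.Diagnostics.Metrics.MeterListener, assuming the metrics are published under RCacheMetrics.LruCacheMeterName with a cache-name tag as described above:

```csharp
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Cache.Memory;

var listener = new MeterListener();
listener.InstrumentPublished = (instrument, meterListener) =>
{
    // Only subscribe to instruments coming from RCache.
    if (instrument.Meter.Name == RCacheMetrics.LruCacheMeterName)
    {
        meterListener.EnableMeasurementEvents(instrument);
    }
};
listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
    Console.WriteLine($"{instrument.Name} = {measurement}"));
listener.Start();
```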
Alternative Designs
Callbacks and notifications
We could introduce a feature from Microsoft.Extensions.Caching.Memory that allows tracking whether a value was changed, or implement eviction callbacks. Instead of storing a callback or change tokens with each entry, we could use the System.Threading.Channels library. Through an exposed ChannelReader, the consumer could react to events like eviction, mutation, and so on. I expect this design to be more memory/CPU efficient than the existing one.
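A rough sketch of what such a channel-based surface could look like; the CacheEvent type, event kinds, and Events property are hypothetical and not part of this proposal:

```csharp
using System.Threading.Channels;

public enum CacheEventKind { Evicted, Expired, Updated, Removed }

public readonly record struct CacheEvent<TKey>(CacheEventKind Kind, TKey Key);

public abstract class NotifyingRCache<TKey, TValue> : RCache<TKey, TValue>
    where TKey : notnull
{
    // Consumers read eviction/update events from a single channel instead of
    // registering a callback or change token per entry.
    public abstract ChannelReader<CacheEvent<TKey>> Events { get; }
}

// Consumption sketch:
// await foreach (var evt in cache.Events.ReadAllAsync())
// {
//     if (evt.Kind == CacheEventKind.Evicted) { /* react to eviction */ }
// }
```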
Size check
We could implement an item size limit through an interface implemented on the stored type by the library client.
public interface ISizer
{
int GetSizeInBytes();
}
This design would allow us to implement the size check without boxing TValue or introducing CPU branches on the hot path.
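For example, a stored type could report its own approximate size; the CachedBlob type and its size estimate below are purely illustrative:

```csharp
public sealed class CachedBlob : ISizer
{
    public byte[] Payload { get; init; } = Array.Empty<byte>();

    // Rough estimate: payload length plus a small fixed overhead for the object header and array.
    public int GetSizeInBytes() => Payload.Length + 32;
}
```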
Asynchronous interface
We wanted to be explicit that everything in RCache should be synchronous. This approach keeps us out of distributed-systems problems, since async caches require much richer semantics. We are going to propose another interface in the future that covers asynchronous cache scenarios and more.
Risks
Another cache
This is yet another memory cache implementation, which may confuse the .NET community. We should be clear in framing the purpose of this interface and make sure the design is extensible enough that we don't need to repeat the same process in the future.
Layering
Is dotnet/extensions the right place for this code? To me it looks like a general-purpose data structure that should be available to the whole stack. Should we introduce it to the runtime repository?
Deferred removal
Cache item removal is deferred, which can keep objects alive longer than necessary.