Is there a memory leak in bigcache? #369

Open
@Kaushal28

Description

I initially set 10k keys, and then repeatedly cleared half of them and set them again. The total number of keys is constant (10k) the whole time, and all the key names and values are exactly the same. I just Delete() and Set() the same keys multiple times on the same bigcache instance.

When I did this 5 times, the memory profile (pprof) showed a total of ~396MB allocated at the end. But when I did the same thing 25 times, it showed ~2.26GB still allocated. Since I am clearing and then re-setting the key-value pairs, shouldn't the memory utilization be the same regardless of how many times I do it, as long as the key-value content and the number of pairs are exactly the same?

Why does memory utilization increase as I do more Delete() and Set() calls?

Here is the script that I used, for better understanding and reproducibility:

package main

import (
	"fmt"
	"strings"
	"time"

	"net/http"
	_ "net/http/pprof"

	"github.com/allegro/bigcache/v2"
)

const (
	count = 10000
)

// Just a helper to get a large value string/chunk.
func gen_chunk() []byte {
	var arr []string
	for i := 0; i < 1000; i++ {
		arr = append(arr, fmt.Sprintf("Test Value %d", i))
	}
	return []byte(strings.Join(arr, ", "))
}

func main() {
	cache, err := bigcache.NewBigCache(bigcache.DefaultConfig(7 * time.Hour))
	if err != nil {
		fmt.Printf("Error: %v\n", err.Error())
		return
	}
	chunk := gen_chunk()
	go func() {
		// Set 10k keys with large values
		for i := 0; i < count; i++ {
			cache.Set(fmt.Sprintf("my-unique-key-%d", i), chunk)
		}

		for j := 0; j < 25; j++ {
			// Remove half of the keys
			for i := 0; i < count/2; i++ {
				cache.Delete(fmt.Sprintf("my-unique-key-%d", i))
			}

			// Again add those keys
			for i := 0; i < count/2; i++ {
				cache.Set(fmt.Sprintf("my-unique-key-%d", i), chunk)
			}
			// Empty the whole cache.
			// cache.Reset() // FYI: when I do this after every iteration, the memory utilization stays constant.
			fmt.Printf("###: Itr: %d\n", j)
		}
		fmt.Println("### Done!")
	}()
	http.ListenAndServe("localhost:6060", nil)
}
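
To make the growth visible without pprof, the loop could also log the cache's own counters after every iteration. A minimal sketch of that addition (assuming the Len() and Capacity() methods from the bigcache API, where Capacity() reports the bytes held by the underlying byte queues):

		for j := 0; j < 25; j++ {
			// ... same Delete/Set loop as above ...
			// Len() is the number of entries; Capacity() is the allocated queue size in bytes.
			fmt.Printf("###: Itr: %d, entries: %d, capacity: %d bytes\n", j, cache.Len(), cache.Capacity())
		}

With a constant working set, Len() should stay at 10k; if Capacity() keeps climbing, the growth is happening inside the BytesQueue rather than in the number of entries.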

Note that I am using github.com/allegro/bigcache/v2 v2.2.5, and the profiles below were taken from http://localhost:6060/debug/pprof/heap.
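
The tables below are the "top" view of the heap profile; they can be reproduced with something like (exact invocation may differ):

go tool pprof -top http://localhost:6060/debug/pprof/heap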

Here is the mem profile result when I did it 25 times:

Showing nodes accounting for 2.30GB, 99.89% of 2.30GB total
Dropped 19 nodes (cum <= 0.01GB)
Showing top 10 nodes out of 11
      flat  flat%   sum%        cum   cum%
    2.26GB 98.08% 98.08%     2.26GB 98.08%  github.com/allegro/bigcache/v2/queue.(*BytesQueue).allocateAdditionalMemory
    0.03GB  1.12% 99.20%     0.03GB  1.14%  github.com/allegro/bigcache/v2.initNewShard
    0.02GB  0.69% 99.89%     0.02GB  0.69%  github.com/allegro/bigcache/v2.wrapEntry (inline)
         0     0% 99.89%     2.27GB 98.77%  github.com/allegro/bigcache/v2.(*BigCache).Set
         0     0% 99.89%     2.27GB 98.77%  github.com/allegro/bigcache/v2.(*cacheShard).set
         0     0% 99.89%     0.03GB  1.14%  github.com/allegro/bigcache/v2.NewBigCache (inline)
         0     0% 99.89%     0.03GB  1.14%  github.com/allegro/bigcache/v2.newBigCache
         0     0% 99.89%     2.26GB 98.08%  github.com/allegro/bigcache/v2/queue.(*BytesQueue).Push
         0     0% 99.89%     0.03GB  1.14%  main.main
         0     0% 99.89%     2.27GB 98.77%  main.main.func1

Here is the mem profile result when I did it 5 times:

  393.97MB 72.72% 72.72%   393.97MB 72.72%  github.com/allegro/bigcache/v2/queue.(*BytesQueue).allocateAdditionalMemory
   93.98MB 17.35% 90.07%    93.98MB 17.35%  github.com/allegro/bigcache/v2/queue.NewBytesQueue (inline)
   31.99MB  5.90% 95.98%   125.97MB 23.25%  github.com/allegro/bigcache/v2.initNewShard
   19.30MB  3.56% 99.54%    19.30MB  3.56%  github.com/allegro/bigcache/v2.wrapEntry (inline)
         0     0% 99.54%   413.27MB 76.28%  github.com/allegro/bigcache/v2.(*BigCache).Set
         0     0% 99.54%   413.27MB 76.28%  github.com/allegro/bigcache/v2.(*cacheShard).set
         0     0% 99.54%   125.97MB 23.25%  github.com/allegro/bigcache/v2.NewBigCache (inline)
         0     0% 99.54%   125.97MB 23.25%  github.com/allegro/bigcache/v2.newBigCache
         0     0% 99.54%   393.97MB 72.72%  github.com/allegro/bigcache/v2/queue.(*BytesQueue).Push
         0     0% 99.54%   125.97MB 23.25%  main.main

Notice the difference in memory utilization above. Is that a sign of a memory leak? Please help.
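
If it is relevant: the Config struct also has a HardMaxCacheSize field (documented as a limit for the BytesQueue size, in MB), which should at least bound this growth. A minimal sketch, with 512 as a purely illustrative value:

	config := bigcache.DefaultConfig(7 * time.Hour)
	// HardMaxCacheSize is in MB; 0 (the default) means no limit.
	config.HardMaxCacheSize = 512 // illustrative cap, not a recommendation
	cache, err := bigcache.NewBigCache(config) // handle err as in the script above

Capping the queue would only hide the symptom, though; it would not explain why the queue needs to grow at all when the number of keys and the value sizes never change.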
