With a simple test, I have found that the memory usage of the model is actually much lower than expected. Results are as follows:

- full model: ~175MB
- lite model: ~49MB
Quote:

> Possible memory usage
>
> This library requires at least 200MB memory in low-memory mode.
Since the actual memory consumption is roughly four times lower than stated, revising the expected memory usage section is worth considering. 49MB is small enough for some embedded systems, and 200MB is actually enough to load the full model.
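For reference, a minimal sketch of how the two modes are exercised, using the same `detect()` call with the `low_memory` flag that the test script below relies on (the import assumes the installed `fast_langdetect` package; the peak figures in the comments are the ones observed in this test):

```python
import fast_langdetect

# Lite model: observed to peak at roughly 49MB in this test.
print(fast_langdetect.detect("Hello world", low_memory=True))

# Full model: observed to peak at roughly 175MB in this test.
print(fast_langdetect.detect("Hello world", low_memory=False))
```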
Test script:

```python
from src import fast_langdetect

import logging
logging.basicConfig(level=logging.DEBUG, format='%(levelname)s:%(name)s:%(message)s')

import resource

# Cap the process address space so that loading a model that needs more
# memory than the limit fails instead of silently succeeding.
limit = <Memory Limit in MB> * 1024 * 1024  # placeholder: fill in the limit to test
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

print("Testing Memory Consumption:")
result = fast_langdetect.detect('Hello world', low_memory=False)
print('Result:', result)
```
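As a complement to probing with a hard address-space cap, one can also read the peak resident set size after the call; a minimal sketch (again assuming the installed `fast_langdetect` package; note that `ru_maxrss` is reported in kilobytes on Linux and in bytes on macOS):

```python
import resource

import fast_langdetect

result = fast_langdetect.detect('Hello world', low_memory=False)

# ru_maxrss is the peak resident set size of the process so far.
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print('Result:', result)
print('Peak RSS: %.1f MB' % (peak_kb / 1024))  # assumes the Linux kilobyte unit
```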