A fast and memory-optimized string library for heavy-text manipulation in Python
Updated Apr 22, 2020 - Python
This repository contains the code used for my "Optimizing Memory Usage for Training LLMs and Vision Transformers in PyTorch" blog post.
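The blog post's specific techniques aren't listed here, but two widely used PyTorch memory savers are mixed-precision training and gradient checkpointing. The sketch below is a minimal, generic illustration of both (model shape, sizes, and names are assumptions, not the repository's code):

```python
# Minimal sketch: gradient checkpointing + mixed-precision training in PyTorch.
# Illustrative only; not the blog post's actual code.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

class Block(nn.Module):
    def __init__(self, dim=512):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x):
        return x + self.ff(x)

class Net(nn.Module):
    def __init__(self, depth=8, dim=512):
        super().__init__()
        self.blocks = nn.ModuleList(Block(dim) for _ in range(depth))
        self.head = nn.Linear(dim, 10)

    def forward(self, x):
        for blk in self.blocks:
            # Recompute activations during backward instead of storing them.
            x = checkpoint(blk, x, use_reentrant=False)
        return self.head(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = Net().to(device)
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(32, 512, device=device)
y = torch.randint(0, 10, (32,), device=device)

# Run the forward pass in float16 where supported to halve activation memory.
with torch.autocast(device_type=device, dtype=torch.float16, enabled=(device == "cuda")):
    loss = nn.functional.cross_entropy(model(x), y)
scaler.scale(loss).backward()
scaler.step(opt)
scaler.update()
opt.zero_grad(set_to_none=True)
```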
MemoRizz: A Python library serving as a memory layer for AI applications. Leverages popular databases and storage solutions to optimize memory usage. Provides utility classes and methods for efficient data management, including MongoDB integration and OpenAI embeddings for semantic search capabilities.
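As a generic illustration of the embeddings-plus-MongoDB pattern (this is not MemoRizz's API; the collection, model, and function names below are made up), semantic recall over stored memories might look like this:

```python
# Generic sketch of embedding-based semantic search over MongoDB documents.
# NOT the MemoRizz API; names are illustrative only.
import numpy as np
from openai import OpenAI
from pymongo import MongoClient

client = OpenAI()                                   # assumes OPENAI_API_KEY is set
coll = MongoClient("mongodb://localhost:27017")["demo_db"]["memories"]

def embed(text: str) -> list[float]:
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return resp.data[0].embedding

def remember(text: str) -> None:
    # Store the raw text alongside its embedding vector.
    coll.insert_one({"text": text, "embedding": embed(text)})

def recall(query: str, k: int = 3) -> list[str]:
    # Rank stored documents by cosine similarity to the query embedding.
    q = np.array(embed(query))
    docs = list(coll.find({}, {"text": 1, "embedding": 1}))
    scored = sorted(
        docs,
        key=lambda d: float(np.dot(q, d["embedding"])
                            / (np.linalg.norm(q) * np.linalg.norm(d["embedding"]))),
        reverse=True,
    )
    return [d["text"] for d in scored[:k]]
```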
Automatically reduce the memory footprint of any pandas DataFrame by efficiently downcasting numeric column dtypes.
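A minimal sketch of the general downcasting idea (the linked repository may use a different strategy): shrink each numeric column to the smallest dtype that still holds its values via pd.to_numeric.

```python
# Sketch of dtype downcasting to shrink a DataFrame's memory footprint.
import numpy as np
import pandas as pd

def downcast(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    for col in out.select_dtypes(include=["integer"]).columns:
        out[col] = pd.to_numeric(out[col], downcast="integer")   # e.g. int64 -> int8/int32
    for col in out.select_dtypes(include=["float"]).columns:
        out[col] = pd.to_numeric(out[col], downcast="float")     # e.g. float64 -> float32
    return out

df = pd.DataFrame({
    "a": np.arange(1_000_000, dtype="int64"),
    "b": np.random.rand(1_000_000),
})
before = df.memory_usage(deep=True).sum()
after = downcast(df).memory_usage(deep=True).sum()
print(f"{before / 1e6:.1f} MB -> {after / 1e6:.1f} MB")
```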
Training a lightweight GPT model (12.3M parameters) optimized for consumer GPUs with 8GB VRAM like the RTX 3050.
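The repository's exact hyperparameters aren't given in this listing, but a back-of-the-envelope parameter count for a small GPT-style configuration (all numbers below are assumptions) shows how a model lands in the ~12M range:

```python
# Rough parameter-count estimate for a hypothetical small GPT-style config.
# These hyperparameters are assumptions, not the repository's actual values.
n_layer, n_head, n_embd = 6, 6, 384          # assumed depth, heads, embedding size
vocab_size, block_size = 8_000, 256          # assumed vocabulary and context length

embedding = vocab_size * n_embd + block_size * n_embd
# Each transformer block: attention (qkv + output proj) ~ 4*d^2 weights,
# MLP with 4x expansion ~ 8*d^2, so roughly 12*d^2 per layer (ignoring biases/norms).
per_block = 12 * n_embd ** 2
total = embedding + n_layer * per_block
print(f"~{total / 1e6:.1f}M parameters")      # ~13.8M with these assumed numbers
```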
Hexel is an AI-native OS enabling autonomous decision-making and multi-agent collaboration across various deployments.
Smaller array implementations, built entirely in Python 3.8.
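The repository's approach isn't described here, but the general payoff of compact arrays in Python can be seen by comparing a list of integers with a packed array.array (a rough measurement, unrelated to this project's implementation):

```python
# Rough memory comparison for one million small integers.
import array
import sys

n = 1_000_000
as_list = list(range(n))
as_array = array.array("i", range(n))        # C ints packed contiguously

# Approximate list cost: the list object plus each boxed int object.
list_bytes = sys.getsizeof(as_list) + sum(sys.getsizeof(i) for i in as_list)
array_bytes = sys.getsizeof(as_array)
print(f"list : {list_bytes / 1e6:.1f} MB")
print(f"array: {array_bytes / 1e6:.1f} MB")
```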
Enhanced Convolutional Neural Network Accelerators with Memory Optimization for Routing Applications
Reverse Overflow Multiple Stack: Space-efficient dynamic stack array structure
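The repository's exact design isn't described here, but the classic space-saving trick behind multi-stack arrays is to let stacks share one buffer and grow toward each other, overflowing only when they meet. A minimal two-stack sketch of that idea (not this project's implementation):

```python
# Minimal sketch: two stacks sharing one fixed buffer, growing toward each other.
class TwoStacks:
    def __init__(self, capacity: int = 16):
        self.buf = [None] * capacity
        self.top1 = -1                 # stack 1 grows from the left end
        self.top2 = capacity           # stack 2 grows from the right end

    def push1(self, value):
        if self.top1 + 1 == self.top2:
            raise OverflowError("shared buffer is full")
        self.top1 += 1
        self.buf[self.top1] = value

    def push2(self, value):
        if self.top2 - 1 == self.top1:
            raise OverflowError("shared buffer is full")
        self.top2 -= 1
        self.buf[self.top2] = value

    def pop1(self):
        if self.top1 == -1:
            raise IndexError("stack 1 is empty")
        value, self.buf[self.top1] = self.buf[self.top1], None
        self.top1 -= 1
        return value

    def pop2(self):
        if self.top2 == len(self.buf):
            raise IndexError("stack 2 is empty")
        value, self.buf[self.top2] = self.buf[self.top2], None
        self.top2 += 1
        return value

s = TwoStacks(8)
s.push1("a"); s.push2("z")
print(s.pop1(), s.pop2())   # a z
```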