Store Python objects locally or in the cloud with the same simple dictionary interface. CShelve lets you store and retrieve any Python object—lists, DataFrames, JSON, binary files—on local files, AWS S3, Azure Blob Storage, or in memory, all through one dictionary-like API.
- Familiar & Fast – If you know Python dictionaries, you already know CShelve.
- Cloud-Ready – Switch between local files and cloud storage (AWS S3, Azure Blob, SFTP) with zero code changes.
- Lightweight – No database servers, no migrations, no schema headaches, and minimal dependencies.
- Flexible Formats – Store pickled Python objects by default, or any format as bytes (JSON, CSV, Parquet, images, etc.).
- Cost-Effective Scaling – Tap into cheap and durable cloud storage without maintaining infrastructure.
Python’s built-in shelve module stores Python objects in a file with a dictionary-like API.
CShelve supercharges it with:
- Cloud backends
- Multiple authentication methods
- Format flexibility
- Provider-agnostic switching
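For context, here is the standard-library pattern CShelve builds on; `shelve` already gives you a persistent dict backed by a local file:

```python
import shelve

# Standard library: a persistent dict stored in a local file
with shelve.open('data') as db:
    db['key'] = {'any': 'picklable object'}
    print(db['key'])
```

CShelve keeps this exact interface and adds the backends and options above.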
If you can do:

```python
mydict['key'] = value
```

You can use CShelve—locally, in the cloud, or on-premises.
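The usual mapping operations should carry over too; a minimal sketch, assuming CShelve mirrors the full `shelve`-style mapping surface (membership, deletion, iteration):

```python
import cshelve

db = cshelve.open('local.db')
db['a'] = 1
print('a' in db)    # membership test
for key in db:      # iteration over keys
    print(key, db[key])
del db['a']         # deletion
db.close()
```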
```bash
# Local storage only
pip install cshelve

# With AWS S3 support
pip install cshelve[aws-s3]

# With Azure Blob support
pip install cshelve[azure-blob]
```

```python
import cshelve

db = cshelve.open('local.db')
db['user'] = {'name': 'Alice', 'age': 30}
print(db['user'])  # {'name': 'Alice', 'age': 30}
db.close()
```
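The handle returned by `cshelve.open` also works as a context manager, as the DataFrame and JSON examples below show, so the store is closed for you:

```python
import cshelve

# Context-manager form: no explicit close() needed
with cshelve.open('local.db') as db:
    db['user'] = {'name': 'Alice', 'age': 30}
```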
```bash
# Install provider
pip install cshelve[aws-s3]
```

`aws-s3.ini`:

```ini
[default]
provider = aws-s3
bucket_name = mybucket
auth_type = access_key
key_id = $AWS_KEY_ID
key_secret = $AWS_KEY_SECRET
```

Python:
```python
import cshelve

db = cshelve.open('aws-s3.ini')
db['session'] = 'cloud storage is easy'
print(db['session'])
db.close()
```
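The `$AWS_KEY_ID` and `$AWS_KEY_SECRET` values in the INI file look like environment-variable references, so the credentials presumably come from the process environment rather than the file itself; a sketch, assuming that interpretation:

```python
import os
import cshelve

# Assumption: $-prefixed config values resolve to environment variables.
# Placeholder values for illustration only.
os.environ['AWS_KEY_ID'] = 'AKIA...'
os.environ['AWS_KEY_SECRET'] = 'secret...'

db = cshelve.open('aws-s3.ini')
db['ping'] = 'pong'
db.close()
```

In practice you would set these in your shell or secret manager, not in code.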
```bash
# Install provider
pip install cshelve[azure-blob]
```

`azure-blob.ini`:

```ini
[default]
provider = azure-blob
account_url = https://myaccount.blob.core.windows.net
auth_type = passwordless
container_name = mycontainer
```

Python:
```python
import cshelve

db = cshelve.open('azure-blob.ini')
db['analytics'] = [1, 2, 3, 4]
print(db['analytics'])
db.close()
```
```python
import cshelve
import pandas as pd

df = pd.DataFrame({'name': ['Alice', 'Bob'], 'age': [25, 30]})

# Write the DataFrame to cloud storage
with cshelve.open('azure-blob.ini') as db:
    db['users'] = df

# Read it back in a separate session
with cshelve.open('azure-blob.ini') as db:
    print(db['users'])
```
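Because values are pickled by default (see Flexible Formats above), the object round-trips with its type intact; reading `users` back yields a real DataFrame, not a string dump:

```python
# Continues the example above: verify the round-trip preserved the DataFrame
with cshelve.open('azure-blob.ini') as db:
    restored = db['users']

assert restored.equals(df)
```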
```python
import json
import cshelve

data = {"msg": "Hello, Cloud!"}

# Store raw bytes under a key; decode on the way out
with cshelve.open('azure-blob.ini') as db:
    db['config.json'] = json.dumps(data).encode()

with cshelve.open('azure-blob.ini') as db:
    print(json.loads(db['config.json'].decode()))
```
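The same bytes interface covers binary formats too; a sketch of storing a DataFrame as Parquet bytes, assuming a Parquet engine such as pyarrow is installed:

```python
import io
import cshelve
import pandas as pd

df = pd.DataFrame({'name': ['Alice', 'Bob'], 'age': [25, 30]})

# Serialize to Parquet in memory, then store the raw bytes
buf = io.BytesIO()
df.to_parquet(buf)

with cshelve.open('azure-blob.ini') as db:
    db['users.parquet'] = buf.getvalue()

with cshelve.open('azure-blob.ini') as db:
    restored = pd.read_parquet(io.BytesIO(db['users.parquet']))
    print(restored)
```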
| Provider | Install Extra | Notes |
|---|---|---|
| Local | none | Stores data in a local `.db` file |
| AWS S3 | `cshelve[aws-s3]` | Supports `access_key` auth |
| Azure Blob | `cshelve[azure-blob]` | Supports `access_key`, `passwordless`, `connection_string`, and `anonymous` auth |
| In-Memory | none | Perfect for tests and temporary storage |
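The in-memory provider follows the same config pattern; a hypothetical example, assuming the provider key is `in-memory` (check the documentation for the exact name):

```ini
# in-memory.ini (hypothetical; verify the provider name in the docs)
[default]
provider = in-memory
```

```python
import cshelve

# Handy for unit tests: nothing touches disk or the network
with cshelve.open('in-memory.ini') as db:
    db['tmp'] = 42
```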
Detailed configuration options are covered in the documentation.
We welcome pull requests, feature suggestions, and bug reports. Check the issues to get started.
MIT – see LICENSE
Switching from local to cloud to on-premises is as easy as:
```python
db = cshelve.open('local.db')
# to
db = cshelve.open('aws-s3.ini')
# to
db = cshelve.open('sftp.ini')
```

No code rewrite. Just change the config.