28 changes: 13 additions & 15 deletions AUTHORS.md
@@ -2,27 +2,25 @@



CEBRA was initially developed by **Mackenzie Mathis** and **Steffen Schneider** (2021+), who are co-inventors on the patent application [WO2023143843](https://infoscience.epfl.ch/entities/patent/0d9debed-4d22-47b7-bad1-f211e7010323).
**Jin Hwa Lee** contributed significantly to our first paper:

> **Schneider, S., Lee, J.H., & Mathis, M.W.**
> [*Learnable latent embeddings for joint behavioural and neural analysis.*](https://doi.org/10.1038/s41586-023-06031-6)
> Nature 617, 360–368 (2023)

CEBRA is actively developed by [**Mackenzie Mathis**](https://www.mackenziemathislab.org/) and [**Steffen Schneider**](https://dynamical-inference.ai/) and their labs.

It is a publicly available tool that has benefited from contributions and suggestions from many individuals: [CEBRA/graphs/contributors](https://github.com/AdaptiveMotorControlLab/CEBRA/graphs/contributors).

## CEBRA Extensions

### 2023
- **Steffen Schneider, Rodrigo González Laiz, Markus Frey, Mackenzie W. Mathis**
[*Identifiable attribution maps using regularized contrastive learning.*](https://sslneurips23.github.io/paper_pdfs/paper_80.pdf)
NeurIPS 4th Workshop on Self-Supervised Learning: Theory and Practice (2023)

### 2025
- **Steffen Schneider, Rodrigo González Laiz, Anastasiia Filippova, Markus Frey, Mackenzie W. Mathis**
[*Time-series attribution maps with regularized contrastive learning.*](https://openreview.net/forum?id=aGrCXoTB4P)
AISTATS (2025)


2 changes: 1 addition & 1 deletion Dockerfile
@@ -40,7 +40,7 @@ RUN make dist
FROM cebra-base

# install the cebra wheel
ENV WHEEL=cebra-0.5.0rc1-py3-none-any.whl
ENV WHEEL=cebra-0.5.0-py3-none-any.whl
WORKDIR /build
COPY --from=wheel /build/dist/${WHEEL} .
RUN pip install --no-cache-dir ${WHEEL}'[dev,integrations,datasets]'
2 changes: 1 addition & 1 deletion Makefile
@@ -1,4 +1,4 @@
CEBRA_VERSION := 0.5.0rc1
CEBRA_VERSION := 0.5.0

dist:
python3 -m pip install virtualenv
2 changes: 1 addition & 1 deletion PKGBUILD
@@ -1,7 +1,7 @@
# Maintainer: Steffen Schneider <stes@hey.com>
pkgname=python-cebra
_pkgname=cebra
pkgver=0.5.0rc1
pkgver=0.5.0
pkgrel=1
pkgdesc="Consistent Embeddings of high-dimensional Recordings using Auxiliary variables"
url="https://cebra.ai"
2 changes: 1 addition & 1 deletion cebra/__init__.py
@@ -66,7 +66,7 @@

import cebra.integrations.sklearn as sklearn

__version__ = "0.5.0rc1"
__version__ = "0.5.0"
__all__ = ["CEBRA"]
__allow_lazy_imports = False
__lazy_imports = {}
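The `cebra/__init__.py` change above is where the package version string is defined. A quick sanity check after installing the release (a minimal sketch; plain Python, nothing beyond the version string set in this diff):

```python
# Verify that the installed package reports the final 0.5.0 version
# rather than the 0.5.0rc1 release candidate.
import cebra

assert cebra.__version__ == "0.5.0", f"unexpected version: {cebra.__version__}"
print("CEBRA version:", cebra.__version__)
```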
25 changes: 13 additions & 12 deletions cebra/integrations/plotly.py
@@ -27,8 +27,8 @@
import numpy as np
import numpy.typing as npt
import plotly.graph_objects
import torch
import plotly.graph_objects as go
import torch

from cebra.integrations.matplotlib import _EmbeddingPlot

@@ -153,17 +153,18 @@ def _plot_3d(self, **kwargs) -> plotly.graph_objects.Figure:


def plot_embedding_interactive(
embedding: Union[npt.NDArray, torch.Tensor],
embedding_labels: Optional[Union[npt.NDArray, torch.Tensor, str]] = "grey",
axis: Optional["go.Figure"] = None,
markersize: float = 1,
idx_order: Optional[Tuple[int]] = None,
alpha: float = 0.4,
cmap: str = "cool",
title: str = "Embedding",
figsize: Tuple[int] = (5, 5),
dpi: int = 100,
**kwargs,
embedding: Union[npt.NDArray, torch.Tensor],
embedding_labels: Optional[Union[npt.NDArray, torch.Tensor,
str]] = "grey",
axis: Optional["go.Figure"] = None,
markersize: float = 1,
idx_order: Optional[Tuple[int]] = None,
alpha: float = 0.4,
cmap: str = "cool",
title: str = "Embedding",
figsize: Tuple[int] = (5, 5),
dpi: int = 100,
**kwargs,
) -> "go.Figure":
"""Plot embedding in a 3D dimensional space.

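The hunk above only reflows the signature of `plot_embedding_interactive`; its behaviour is unchanged. For orientation, a minimal usage sketch of that function (the data, model settings, and variable names below are illustrative placeholders, not part of this PR):

```python
# Sketch: fit a small CEBRA model and visualize the embedding interactively.
# The random arrays stand in for real neural data and a continuous label.
import numpy as np
import cebra
from cebra.integrations.plotly import plot_embedding_interactive

neural = np.random.randn(1000, 50)   # placeholder neural recording (time x neurons)
position = np.random.rand(1000)      # placeholder continuous behavioural label

model = cebra.CEBRA(max_iterations=100, output_dimension=3)
model.fit(neural, position)
embedding = model.transform(neural)

fig = plot_embedding_interactive(
    embedding,
    embedding_labels=position,  # colors each point by its label value
    markersize=2,
    alpha=0.6,
    title="CEBRA embedding",
)
fig.show()
```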
30 changes: 15 additions & 15 deletions docs/root/index.html
@@ -145,16 +145,16 @@ <h4>
</div>

<div class="row mb-5 mt-4">
<p>CEBRA is a machine-learning method that can be used to
compress time series in a way that reveals otherwise hidden
structures in the variability of the data. It excels on
behavioural and neural data recorded simultaneously.
We have shown it can be used to decode the activity from the
visual cortex of the mouse brain to reconstruct a viewed video,
to decode trajectories from the sensorimotor cortex of primates,
and for decoding position during navigation. For these use cases
and other demos see our <a href="https://cebra.ai/docs/" style="color: #6235E0;">Documentation</a>.</p>

</div>

<div class="row">
@@ -171,12 +171,12 @@ <h3><i class="fas fa-play-circle"></i> Demo Applications</h3>
<div class="col-md-6 mb-2">
<!-- Embedding the Plotly figure using iframe -->
<div style="position: relative; height: 315px; overflow: hidden; margin-bottom: 1rem;">
<iframe src="static/img/hippocampus_posdir3_full.html"
<iframe src="static/img/hippocampus_posdir3_full.html"
style="position: absolute; top: -140px; left: -5%; width: 110%; height: 150%; border: none; transform: scale(0.85); transform-origin: top center;"
scrolling="no">
</iframe>
</div>

<p style="margin-top: -70px;">Interactive visualization of the CEBRA embedding for the rat hippocampus data. This 3D plot shows how neural activity is mapped to a lower-dimensional space that correlates with the animal's position and movement direction. <a href="https://colab.research.google.com/github/AdaptiveMotorControlLab/CEBRA-demos/blob/main/Demo_hippocampus.ipynb" target="_blank" style="color: #6235E0;"><i class="fas fa-external-link-alt"></i> Open In Colaboratory</a></p>
</div>
</div>
@@ -191,7 +191,7 @@ <h3><i class="fas fa-play-circle"></i> Demo Applications</h3>
<p>CEBRA applied to mouse primary visual cortex, collected at the Allen Institute (de Vries et al. 2020, Siegle et al. 2021). 2-photon and Neuropixels recordings are embedded with CEBRA using DINO frame features as labels.
The embedding is used to decode the video frames using a kNN decoder on the CEBRA-Behavior embedding from the test set.</p>
</div>

<div class="col-md-6 mb-2">
<!-- YouTube embed for CEBRA on M1 and S1 neural data with cleaner styling -->
<video width="100%" autoplay loop muted preload="auto">
@@ -205,7 +205,7 @@ <h3><i class="fas fa-play-circle"></i> Demo Applications</h3>

<div class="row mt-4">
<h3><i class="fas fa-newspaper"></i> Publications</h3>

<div class="col-12">
<div class="paper-card">
<div class="paper-title">Learnable latent embeddings for joint behavioural and neural analysis</div>
@@ -215,7 +215,7 @@ <h3><i class="fas fa-newspaper"></i> Publications</h3>
<a href="https://arxiv.org/abs/2204.00673" target="_blank" class="btn btn-link" style="color: #6235E0;"><i class="fas fa-file-alt"></i> Preprint</a>
</div>
</div>

<div class="col-12">
<div class="paper-card">
<div class="paper-title">Time-series attribution maps with regularized contrastive learning</div>
@@ -230,7 +230,7 @@ <h3><i class="fas fa-newspaper"></i> Publications</h3>

<div class="row mt-4">
<h3><i class="fas fa-certificate"></i> Patent Information</h3>

<div class="col-12">
<div class="paper-card">
<div class="paper-title">Patent Pending</div>
@@ -299,7 +299,7 @@ <h3>
</small>
</div>
</div>

<div class="row justify-content-md-center">
<div class="col-sm-10 rounded p-3 m-2" style="background-color: rgb(20,20,20);">
<small class="code">
@@ -325,13 +325,13 @@ <h3>
<p>
CEBRA has been cited in numerous high-impact publications across neuroscience, machine learning, and related fields. Our work has influenced research in neural decoding, brain-computer interfaces, computational neuroscience, and machine learning methods for time-series analysis.
</p>

<div class="col-12 text-center mb-4">
<a href="https://scholar.google.com/scholar?oi=bibs&hl=en&cites=5385393104765622341&as_sdt=5" target="_blank" class="btn btn-outline-light btn-lg">
<i class="fas fa-graduation-cap"></i> View All Citations on Google Scholar
</a>
</div>

<div class="col-12">
<div class="paper-card">
<p class="mb-0">Our research has been cited in proceedings and journals including <span class="badge bg-light text-dark">Nature</span> <span class="badge bg-light text-dark">Science</span> <span class="badge bg-light text-dark">ICML</span> <span class="badge bg-light text-dark">Nature Neuroscience</span> <span class="badge bg-light text-dark">ICML</span> <span class="badge bg-light text-dark">Neuron</span> <span class="badge bg-light text-dark">NeurIPS</span> <span class="badge bg-light text-dark">ICLR</span> and others.</p>
@@ -342,8 +342,8 @@ <h3>
<div class="row justify-content-center mt-5 mb-3">
<div class="col-md-12 text-center">
<a href="https://www.epfl.ch/" target="_blank">
<img src="https://images.squarespace-cdn.com/content/v1/57f6d51c9f74566f55ecf271/00b1fa45-9246-4914-86ee-4a01bb3fb60b/logo.png?format=2500w"
alt="MLAI Logo"
<img src="https://images.squarespace-cdn.com/content/v1/57f6d51c9f74566f55ecf271/00b1fa45-9246-4914-86ee-4a01bb3fb60b/logo.png?format=2500w"
alt="MLAI Logo"
style="max-width: 600px;">
</a>
<div class="mt-3">
9 changes: 5 additions & 4 deletions docs/source/conf.py
@@ -81,7 +81,6 @@ def get_years(start_year=2021):
"sphinx_gallery.load_style",
]


coverage_show_missing_items = True
panels_add_bootstrap_css = False

@@ -147,7 +146,8 @@ def get_years(start_year=2021):
html_context = {
"default_mode": "light",
"switcher": {
"version_match": "latest", # Adjust this dynamically per version
"version_match":
"latest", # Adjust this dynamically per version
"versions": [
("latest", "/latest/"),
("v0.2.0", "/v0.2.0/"),
@@ -156,7 +156,8 @@
("v0.5.0rc1", "/v0.5.0rc1/"),
],
},
"navbar_start": ["version-switcher", "navbar-logo"], # Place the dropdown above the logo
"navbar_start": ["version-switcher",
"navbar-logo"], # Place the dropdown above the logo
}

# More info on theme options:
@@ -220,7 +221,7 @@ def get_years(start_year=2021):

nbsphinx_thumbnails = {
"demo_notebooks/CEBRA_best_practices":
"_static/thumbnails/cebra-best.png",
"_static/thumbnails/cebra-best.png",
"demo_notebooks/Demo_primate_reaching":
"_static/thumbnails/ForelimbS1.png",
"demo_notebooks/Demo_hippocampus":
32 changes: 16 additions & 16 deletions docs/source/usage.rst
@@ -1218,36 +1218,36 @@ Putting all previous snippet examples together, we obtain the following pipeline
output_dimension = 8,
verbose = False
)

# 2. Load example data
neural_data = cebra.load_data(file="neural_data.npz", key="neural")
new_neural_data = cebra.load_data(file="neural_data.npz", key="new_neural")
continuous_label = cebra.load_data(file="auxiliary_behavior_data.h5", key="auxiliary_variables", columns=["continuous1", "continuous2", "continuous3"])
discrete_label = cebra.load_data(file="auxiliary_behavior_data.h5", key="auxiliary_variables", columns=["discrete"]).flatten()


assert neural_data.shape == (100, 3)
assert new_neural_data.shape == (100, 4)
assert discrete_label.shape == (100, )
assert continuous_label.shape == (100, 3)

# 3. Split data and labels into train/validation
from sklearn.model_selection import train_test_split

split_idx = int(0.8 * len(neural_data))
# suggestion: 5%-20% depending on your dataset size; note that this splits the data
# into an early and late part, which might not be ideal for your data/experiment!
# As a more involved alternative, consider e.g. a nested time-series split.

train_data = neural_data[:split_idx]
valid_data = neural_data[split_idx:]

train_continuous_label = continuous_label[:split_idx]
valid_continuous_label = continuous_label[split_idx:]

train_discrete_label = discrete_label[:split_idx]
valid_discrete_label = discrete_label[split_idx:]

# 4. Fit the model
# time contrastive learning
cebra_model.fit(train_data)
Expand All @@ -1257,29 +1257,29 @@ Putting all previous snippet examples together, we obtain the following pipeline
cebra_model.fit(train_data, train_continuous_label)
# mixed behavior contrastive learning
cebra_model.fit(train_data, train_discrete_label, train_continuous_label)


# 5. Save the model
tmp_file = Path(tempfile.gettempdir(), 'cebra.pt')
cebra_model.save(tmp_file)

# 6. Load the model and compute an embedding
cebra_model = cebra.CEBRA.load(tmp_file)
train_embedding = cebra_model.transform(train_data)
valid_embedding = cebra_model.transform(valid_data)

assert train_embedding.shape == (80, 8) # TODO(user): change to split ratio & output dim
assert valid_embedding.shape == (20, 8) # TODO(user): change to split ratio & output dim

# 7. Evaluate the model performance (you can also check the train_data)
goodness_of_fit = cebra.sklearn.metrics.goodness_of_fit_score(cebra_model,
valid_data,
valid_discrete_label,
valid_continuous_label)

# 8. Adapt the model to a new session
cebra_model.fit(new_neural_data, adapt = True)

# 9. Decode discrete labels behavior from the embedding
decoder = cebra.KNNDecoder()
decoder.fit(train_embedding, train_discrete_label)
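The snippet in this hunk stops after fitting the decoder. A hedged sketch of how step 9 could continue on the validation split (the accuracy metric is an illustrative choice, not prescribed by the original example):

```python
# Sketch: predict the held-out discrete labels from the validation embedding
# and report a simple classification accuracy.
from sklearn.metrics import accuracy_score

valid_prediction = decoder.predict(valid_embedding)
accuracy = accuracy_score(valid_discrete_label, valid_prediction)
print(f"kNN decoding accuracy on the validation split: {accuracy:.2f}")
```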
2 changes: 1 addition & 1 deletion reinstall.sh
@@ -15,7 +15,7 @@ pip uninstall -y cebra
# Get version info after uninstalling --- this will automatically get the
# most recent version based on the source code in the current directory.
# $(tools/get_cebra_version.sh)
VERSION=0.5.0rc1
VERSION=0.5.0
echo "Upgrading to CEBRA v${VERSION}"

# Upgrade the build system (PEP517/518 compatible)
4 changes: 3 additions & 1 deletion setup.cfg
@@ -39,7 +39,9 @@ install_requires =
scipy
torch>=2.4.0
tqdm
matplotlib
# NOTE(stes): Remove pin once https://github.com/AdaptiveMotorControlLab/CEBRA/issues/240
# is resolved.
matplotlib<3.11
requests

[options.extras_require]