manopt-rs

A high-performance Rust library for manifold optimization built on top of the Burn deep learning framework. This library provides Riemannian optimization algorithms and manifold structures for constrained optimization problems.

Features

  • Riemannian Optimization Algorithms: Modern optimizers adapted for manifold constraints
    • Riemannian Adam (RiemannianAdam)
    • Riemannian Gradient Descent (ManifoldRGD)
  • Multiple Manifolds: Built-in support for common manifold structures
    • Euclidean spaces
    • More manifolds in progress (see Supported Manifolds below)
  • Backend Flexibility: Works with any Burn backend (NDArray, Torch, WGPU, etc.)
  • Type Safety: Leverages Rust's type system for safe tensor operations
  • High Performance: Built on Burn's efficient tensor operations

Installation

Add this to your Cargo.toml:

[dependencies]
manopt-rs = "0.1"

# Example with Burn backend
burn = { version = "0.17", features = ["tch", "autodiff", "ndarray"] }

Quick Start

use manopt_rs::prelude::*;
use burn::optim::SimpleOptimizer;

fn main() {
    // Configure Riemannian Adam optimizer
    let config = RiemannianAdamConfig::<Euclidean, burn::backend::NdArray>::new()
        .with_lr(0.01)
        .with_beta1(0.9)
        .with_beta2(0.999);

    let optimizer = RiemannianAdam::new(config);

    // Create optimization problem: minimize ||x - target||²
    let target = Tensor::<burn::backend::NdArray, 1>::from_floats([2.0, -1.0, 3.0], &Default::default());
    let mut x = Tensor::<burn::backend::NdArray, 1>::zeros([3], &Default::default());
    let mut state = None;

    // Optimization loop
    for _step in 0..100 {
        let grad = (x.clone() - target.clone()) * 2.0;
        let (new_x, new_state) = optimizer.step(1.0, x.clone(), grad, state);
        x = new_x;
        state = new_state;
    }

    println!("Optimized result: {}", x);
}
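
For intuition about what the optimizer computes: on a Euclidean manifold, projection and retraction are identities, so Riemannian Adam reduces to the classic Adam update. A dependency-free sketch of that update on the same quadratic objective (function names here are illustrative, not part of the manopt-rs API):

```rust
// Classic Adam on a Euclidean manifold, minimizing ||x - target||^2.
// Illustrative sketch only; not the crate's implementation.
fn adam_minimize(target: &[f64], lr: f64, beta1: f64, beta2: f64, steps: usize) -> Vec<f64> {
    let eps = 1e-8;
    let n = target.len();
    let mut x = vec![0.0; n];
    let mut m = vec![0.0; n]; // first-moment (mean) estimate
    let mut v = vec![0.0; n]; // second-moment (uncentered variance) estimate
    for t in 1..=steps {
        for i in 0..n {
            let g = 2.0 * (x[i] - target[i]); // gradient of ||x - target||^2
            m[i] = beta1 * m[i] + (1.0 - beta1) * g;
            v[i] = beta2 * v[i] + (1.0 - beta2) * g * g;
            let m_hat = m[i] / (1.0 - beta1.powi(t as i32)); // bias correction
            let v_hat = v[i] / (1.0 - beta2.powi(t as i32));
            x[i] -= lr * m_hat / (v_hat.sqrt() + eps);
        }
    }
    x
}

fn main() {
    let x = adam_minimize(&[2.0, -1.0, 3.0], 0.01, 0.9, 0.999, 5000);
    println!("{:?}", x); // close to [2.0, -1.0, 3.0]
}
```

On a non-trivial manifold, the same update would additionally project the gradient onto the tangent space and retract the step back onto the manifold.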

Examples

Basic Optimization

Run a simple quadratic optimization example:

cargo run --example optimization_demo

This demonstrates minimizing a quadratic function using Riemannian Adam.

Riemannian Adam Demo

Test the Riemannian Adam optimizer:

cargo run --example riemannian_adam_demo

Architecture

Manifolds

The library is built around the Manifold trait, which defines the geometric structure:

pub trait Manifold<B: Backend>: Clone + Send + Sync {
    fn project<const D: usize>(point: Tensor<B, D>, vector: Tensor<B, D>) -> Tensor<B, D>;
    fn retract<const D: usize>(point: Tensor<B, D>, direction: Tensor<B, D>) -> Tensor<B, D>;
    fn inner<const D: usize>(point: Tensor<B, D>, u: Tensor<B, D>, v: Tensor<B, D>) -> Tensor<B, D>;
    // ... more methods
}
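
To make the three operations concrete, here is what they look like for the unit sphere, written without any dependencies. These helpers are purely illustrative and are not the crate's implementations; the sphere manifold itself is only planned in manopt-rs:

```rust
// Sphere S^{n-1}: points are unit-norm vectors; the tangent space at `point`
// is the set of vectors orthogonal to it. Illustrative sketch only.
fn dot(u: &[f64], v: &[f64]) -> f64 {
    u.iter().zip(v).map(|(a, b)| a * b).sum()
}

/// Project an ambient vector onto the tangent space at `point`:
/// v - <point, v> * point.
fn project(point: &[f64], v: &[f64]) -> Vec<f64> {
    let c = dot(point, v);
    point.iter().zip(v).map(|(p, x)| x - c * p).collect()
}

/// Retract: step in the ambient space, then renormalize back onto the sphere.
fn retract(point: &[f64], direction: &[f64]) -> Vec<f64> {
    let moved: Vec<f64> = point.iter().zip(direction).map(|(p, d)| p + d).collect();
    let norm = dot(&moved, &moved).sqrt();
    moved.iter().map(|x| x / norm).collect()
}

fn main() {
    let p = vec![1.0, 0.0, 0.0];
    let t = project(&p, &[0.5, 1.0, 0.0]); // tangent: <p, t> == 0
    let q = retract(&p, &t);               // back on the sphere: ||q|| == 1
    println!("{:?} {:?}", t, q);
}
```

The `inner` method corresponds to the Riemannian metric; for the sphere with the round metric it is just the ambient dot product restricted to tangent vectors.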

Optimizers

Riemannian optimizers that respect manifold constraints:

  • RiemannianAdam: Adam optimizer adapted for Riemannian manifolds
  • ManifoldRGD: Riemannian gradient descent
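
The general recipe behind Riemannian gradient descent is: project the Euclidean gradient onto the tangent space, step against it, and retract back onto the manifold. A dependency-free sketch on the unit sphere (still a planned manifold in the crate), minimizing f(x) = ⟨a, x⟩ subject to ||x|| = 1, whose minimizer is -a / ||a||; helper names are illustrative, not crate API:

```rust
// Riemannian gradient descent on the unit sphere. Illustrative sketch only.
fn dot(u: &[f64], v: &[f64]) -> f64 {
    u.iter().zip(v).map(|(a, b)| a * b).sum()
}

fn sphere_rgd(a: &[f64], lr: f64, steps: usize) -> Vec<f64> {
    let mut x = vec![0.0; a.len()];
    x[1] = 1.0; // arbitrary unit-norm starting point
    for _ in 0..steps {
        // Euclidean gradient of f(x) = <a, x> is a; project it onto the
        // tangent space at x: g - <x, g> * x.
        let c = dot(&x, a);
        let rgrad: Vec<f64> = a.iter().zip(&x).map(|(g, p)| g - c * p).collect();
        // Step against the Riemannian gradient, then retract (renormalize).
        let moved: Vec<f64> = x.iter().zip(&rgrad).map(|(p, g)| p - lr * g).collect();
        let norm = dot(&moved, &moved).sqrt();
        x = moved.iter().map(|v| v / norm).collect();
    }
    x
}

fn main() {
    let x = sphere_rgd(&[3.0, 0.0, 4.0], 0.1, 200);
    println!("{:?}", x); // converges to -a / ||a|| = [-0.6, 0.0, -0.8]
}
```

Swapping in a different manifold changes only the projection and retraction; the loop itself stays the same, which is what the `Manifold` trait abstracts over.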

Supported Manifolds

  • Euclidean: Standard unconstrained optimization
  • 🚧 Stiefel: Matrices with orthonormal columns (in development)
  • 📋 Planned: Grassmann, Symmetric Positive Definite, Sphere
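
As a taste of what the in-development Stiefel manifold involves: for a matrix X with orthonormal columns, the tangent space at X is { Z : XᵀZ + ZᵀX = 0 }, and an ambient matrix Z is projected onto it via Z - X·sym(XᵀZ), where sym(A) = (A + Aᵀ)/2. A dependency-free sketch (these helpers are hypothetical, not the crate's implementation):

```rust
// Stiefel tangent-space projection with row-major Vec<Vec<f64>> matrices.
// Illustrative sketch only; not manopt-rs code.
fn matmul(a: &[Vec<f64>], b: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let (n, k, m) = (a.len(), b.len(), b[0].len());
    let mut out = vec![vec![0.0; m]; n];
    for i in 0..n {
        for j in 0..m {
            for l in 0..k {
                out[i][j] += a[i][l] * b[l][j];
            }
        }
    }
    out
}

fn transpose(a: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let (n, m) = (a.len(), a[0].len());
    (0..m).map(|j| (0..n).map(|i| a[i][j]).collect()).collect()
}

fn stiefel_project(x: &[Vec<f64>], z: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let xtz = matmul(&transpose(x), z);
    let k = xtz.len();
    // Symmetric part of X^T Z.
    let mut sym = vec![vec![0.0; k]; k];
    for i in 0..k {
        for j in 0..k {
            sym[i][j] = 0.5 * (xtz[i][j] + xtz[j][i]);
        }
    }
    // Z - X * sym(X^T Z), elementwise.
    let xs = matmul(x, &sym);
    z.iter()
        .zip(&xs)
        .map(|(zr, cr)| zr.iter().zip(cr).map(|(a, b)| a - b).collect())
        .collect()
}

fn main() {
    // X: 3x2 with orthonormal columns (e1, e2).
    let x = vec![vec![1.0, 0.0], vec![0.0, 1.0], vec![0.0, 0.0]];
    let z = vec![vec![1.0, 2.0], vec![3.0, 4.0], vec![5.0, 6.0]];
    println!("{:?}", stiefel_project(&x, &z));
}
```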

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

Development Setup

  1. Clone the repository:

    git clone https://github.com/DimitriTimoz/manopt-rs.git
    cd manopt-rs
  2. Build the project (Cargo fetches dependencies automatically):

    cargo build
  3. Run tests:

    cargo test

🔗 Related Projects

  • Manopt: MATLAB toolbox for optimization on manifolds
  • Pymanopt: Python toolbox for optimization on manifolds
  • Burn: Deep learning framework in Rust

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Inspired by the Manopt toolbox
  • Built on the excellent Burn framework
  • Thanks to the Rust community for their amazing ecosystem
