A self-contained, libre website designed to demonstrate and explain modern machine learning techniques, technologies, and methods.


ML Visualizer 🤖


Interactive demonstrations of AI and machine learning architectures

🚀 Launch Demo • 📖 Documentation • 🎵 Music Transformer


Usage

  1. 🌐 Launch the visualizer by opening index.html, or visit mlvisualizer.org
  2. 🎯 Select a demo from the homepage
  3. 🎛️ Interact with the controls to adjust parameters
  4. 👀 Watch the neural network learn and adapt in real time
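Note that browsers typically block ES module and WebAssembly loading over file://, so opening index.html directly from disk may fail. Serving the folder over HTTP avoids this; one minimal option, assuming Python 3 is installed:

```shell
# From the repository root: serve the site at http://localhost:8000
python3 -m http.server 8000
```

Any static file server works equally well; the demo has no server-side code.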

Learn More

Each demo includes:

  • 📊 Real-time visualizations - Watch neural networks in action
  • 🎛️ Interactive controls - Adjust parameters and see immediate results
  • 📈 Performance metrics - Track learning progress and accuracy
  • 💡 Educational descriptions - Understand the theory behind each model

Tech Stack

| Technology | Purpose |
| --- | --- |
| 🌐 HTML5 | Structure & Markup |
| 🎨 CSS3 | Styling & Animations |
| ⚡ Vanilla JavaScript | Interactive Visualizations |
| 🦀 Rust + WebAssembly | High-Performance AI Audio Generation |

File Structure

Repository tree (top-level). Use this as a quick reference:

MLvisualizer/
├── .gitignore               # Files and patterns ignored by Git
├── .nojekyll                # Disable Jekyll processing on GitHub Pages
├── .github/                 # GitHub Actions, issue templates, and CI configs
├── Cargo.toml               # Rust crate manifest (dependencies and metadata)
├── Cargo.lock               # Locked dependency versions for reproducible builds
├── LICENSE.md               # Project license (GPL-3.0)
├── sitemap.xml              # Project sitemap for search engines
├── sitemap.xsl              # Stylesheet for sitemap.xml
├── README.md                # Project overview and documentation (this file)
├── index.html               # Main landing page for the demo
├── CNAME                    # Custom domain name for GitHub Pages
├── robots.txt               # Search engine crawling instructions
├── css/                     # Stylesheets
│   ├── styles.css           # Global site styles
│   └── neural-music.css     # Styles for the music demo
├── html/                    # Standalone HTML demo pages and fragments
│   ├── neural-music.html    # Music Transformer demo page (audio + visualization)
│   └── jslicense.html       # License fragment included in HTML pages
├── js/                      # Front-end JavaScript (visualizers and site scripts)
│   ├── core.js              # Core visualization helpers and utilities
│   ├── script.js            # Site initialization and glue code
│   ├── neural-music.js      # Music demo UI and integration with the wasm pkg
│   ├── visualizers-basic.js # Basic visualization implementations
│   └── visualizers-advanced.js # Advanced visualization implementations
├── pkg/                     # Generated WebAssembly artifacts and JS wrappers
│   ├── music_transformer.js # Browser import wrapper for the wasm module
│   ├── music_transformer.d.ts # TypeScript definitions for the wrapper
│   ├── music_transformer_bg.wasm # Compiled wasm binary
│   ├── music_transformer_bg.wasm.d.ts # Wasm type declarations
│   └── package.json         # pkg metadata for the generated package
├── src/                     # Rust source code for the Music Transformer
│   └── lib.rs               # Core Rust implementation and wasm bindings
├── target/                  # Cargo build artifacts and compiled outputs
└── tools/                   # Build and maintenance utilities
    ├── build.sh             # Local build/packaging helper script
    └── CODE_OF_CONDUCT.md   # Contribution guidelines and conduct policy

Notes:

  • pkg/ and target/ are generated build outputs. Do not edit files in these folders directly; change the source files under src/, js/, css/, and html/ and regenerate artifacts via the build scripts.
  • pkg/ contains the WebAssembly wrapper and related artifacts used by the browser; it includes generated binaries, JS wrappers, and TypeScript definitions. The bark.wav file is an easter-egg and is not needed for normal development or packaging.

Neural Music Transformer Demo

This project includes a real, working transformer-like sequence model implemented in Rust and compiled to WebAssembly for the browser. It uses an attention mechanism over previously generated notes to produce short melodic phrases, and then synthesizes audio samples on the fly.

Important context:

  • Educational Model: A minimal, hand-crafted model for education and fun. There's no training, no large parameter matrices, and no text tokens.
  • Real Transformer: It does use attention over a sequence, so it's "a real transformer" in spirit, but it's tiny and domain-specific (8-note pentatonic scale + rests), nothing like the multi-billion-parameter models powering systems like ChatGPT.
  • Mostly Deterministic Output: Generation follows fixed rules, with a small amount of randomness drawn from the browser's Math.random(), so each run differs slightly.

What it does

  • Generates a musical note sequence with simple attention: recent positions get higher weight, and consonant intervals are biased
  • Optionally inserts rests to create phrases and supports different envelope "instruments"
  • Converts the sequence to audio samples (Float32) you can play with the Web Audio API
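As a rough illustration of the first point (this is a hand-written sketch, not the project's actual code), recency-weighted attention over a note history can look like the following: each past position gets a score that decays with distance from the present, the scores are normalized to sum to 1, and candidate next notes are biased toward small intervals from the attention-weighted context.

```javascript
// Hypothetical sketch of recency-weighted attention over a note history.
// Function names and constants are illustrative, not taken from the repo.

function attentionWeights(historyLength, decay = 0.5) {
  // Score each position; more recent positions (higher index) score higher.
  const scores = [];
  for (let i = 0; i < historyLength; i++) {
    scores.push(Math.exp(-decay * (historyLength - 1 - i)));
  }
  // Softmax-style normalization so the weights sum to 1.
  const total = scores.reduce((a, b) => a + b, 0);
  return scores.map((s) => s / total);
}

function nextNoteScores(history, numNotes = 8) {
  const weights = attentionWeights(history.length);
  // Attention-weighted "context" pitch: a weighted average of past notes.
  const context = history.reduce((sum, note, i) => sum + weights[i] * note, 0);
  // Bias candidates toward small (more consonant) intervals from the context.
  const scores = [];
  for (let note = 0; note < numNotes; note++) {
    const interval = Math.abs(note - context);
    scores.push(1 / (1 + interval)); // smaller interval => higher score
  }
  return scores;
}
```

Normalizing nextNoteScores into a probability distribution and sampling from it with Math.random() would reproduce the "small randomness" described above.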

Quick start (browser, ESM)

<script type="module">
	import init, { MusicTransformer, InstrumentType } from './pkg/music_transformer.js';

	async function main() {
		// Load the WASM module
		await init();

		// Create transformer and configure
		const mt = new MusicTransformer();
		mt.set_melodic(true);                 // smoother stepwise motion
		mt.set_random_spacing(true);          // insert rests between phrases
		mt.set_instrument(InstrumentType.Piano);
		mt.set_tempo(110);                    // BPM, clamped to [60, 240]
		mt.set_target_duration(12);           // seconds, clamped to [5, 30]

		// Use your AudioContext's sample rate for correct playback
		const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
		mt.set_sample_rate(audioCtx.sampleRate);

		// Generate music
		const sequence = mt.generate_sequence();
		// sequence is a Uint32Array of note indices (0..7) and rest markers (999)

		const samples = mt.generate_audio(); // Float32Array mono samples

		// Play via Web Audio API
		const buffer = audioCtx.createBuffer(1, samples.length, audioCtx.sampleRate);
		buffer.getChannelData(0).set(samples);
		const src = audioCtx.createBufferSource();
		src.buffer = buffer;
		src.connect(audioCtx.destination);
		src.start();
	}

	main();
	// Note: Browsers may require a user gesture before starting AudioContext
</script>

Rust core (excerpt)

The Rust core exposes a MusicTransformer with a tiny attention-based generator and a simple synthesizer with ADSR envelopes per instrument:

#[wasm_bindgen]
pub struct MusicTransformer { /* ... */ }

#[wasm_bindgen]
impl MusicTransformer {
    #[wasm_bindgen(constructor)]
    pub fn new() -> MusicTransformer { /* init scale, defaults */ }

    // Configuration setters (bodies elided)
    pub fn set_melodic(&mut self, melodic: bool);
    pub fn set_random_spacing(&mut self, on: bool);
    pub fn set_instrument(&mut self, instrument: InstrumentType);
    pub fn set_tempo(&mut self, bpm: f32);               // Clamped to [60, 240]
    pub fn set_sample_rate(&mut self, rate: f32);        // Clamped to [22050, 48000]
    pub fn set_target_duration(&mut self, seconds: f32); // Clamped to [5, 30]

    // Getters
    pub fn get_sample_rate(&self) -> f32;
    pub fn get_target_duration(&self) -> f32;
    pub fn get_sequence(&self) -> Vec<usize>;            // Current sequence
    pub fn get_duration(&self) -> f32;                   // Actual duration in seconds

    // Generation
    pub fn generate_sequence(&mut self) -> Vec<usize>;   // Note indices (0..7) and rest marker 999
    pub fn generate_audio(&self) -> Vec<f32>;            // Mono audio samples for the current sequence
}
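The ADSR (attack, decay, sustain, release) envelopes mentioned above shape each note's amplitude over time. As a generic illustration (the parameter values here are assumptions, not the repo's actual settings), a piecewise-linear ADSR gain function looks like this:

```javascript
// Generic piecewise-linear ADSR envelope (illustrative defaults, not the repo's).
// t: seconds since note onset; noteDur: total note length in seconds.
function adsrGain(t, noteDur, attack = 0.02, decay = 0.05, sustain = 0.7, release = 0.1) {
  if (t < 0 || t > noteDur) return 0;
  if (t < attack) return t / attack;          // attack: ramp from 0 up to 1
  if (t < attack + decay) {
    const d = (t - attack) / decay;
    return 1 - d * (1 - sustain);             // decay: fall to the sustain level
  }
  const releaseStart = noteDur - release;
  if (t < releaseStart) return sustain;       // sustain: hold steady
  return sustain * (noteDur - t) / release;   // release: fade back to 0
}
```

Multiplying each synthesized sample by adsrGain(t, noteDur) gives the characteristic pluck-and-fade shape; different "instruments" amount to different attack/decay/sustain/release values and harmonic mixes.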

Notes:

  • Sequence values of 999 represent rests (silence)
  • Instruments are simple envelopes/harmonics: Robo (synth), Piano, Guitar
  • Attention weights emphasize nearby positions; "melodic" mode increases preference for small intervals
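Since generate_sequence returns raw indices, a small helper makes its output human-readable. The note names below assume a C-major pentatonic mapping, which is a guess on my part; check js/neural-music.js for the scale the demo actually uses.

```javascript
// Map raw sequence values to readable labels.
// REST_MARKER matches the documented rest value (999); the note names are an
// assumed pentatonic mapping, not confirmed against the repo's source.
const REST_MARKER = 999;
const NOTE_NAMES = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"];

function decodeSequence(sequence) {
  return Array.from(sequence, (v) =>
    v === REST_MARKER ? "rest" : NOTE_NAMES[v] ?? `note#${v}`
  );
}
```

For example, decodeSequence([0, 2, 999, 4]) yields ["C4", "E4", "rest", "A4"].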

🔬 How It Differs from Large LLMs

| Aspect | This Demo | Large LLMs (e.g., ChatGPT) |
| --- | --- | --- |
| 📏 Scale | A few functions and tiny arrays running in your browser | Billions of parameters on GPU/TPU clusters |
| 🎓 Training | Not trained; rule-guided | Trained on massive datasets |
| 🎭 Modality | Outputs notes and synthesized waveforms | Operates on text tokens (and sometimes images/audio) with very large vocabularies |

This demo is designed to let you peek inside the mechanics of attention, sequencing, and synthesis without the complexity of production-grade models.


📄 License

GNU General Public License v3.0 - see LICENSE.md for details.


Built with ❤️ for CEE 4803 at Georgia Tech

🌟 Star this repo if you find it helpful!
