Experimental validation of five neuroscience-inspired learning mechanisms (STDP, homeostatic scaling, BCM metaplasticity, engram drift, neural recruitment) implemented in a Hebbian capability graph. 18 controlled experiments, 4 falsified hypotheses, 1 production graph analysis.
Paper draft: PAPER_DRAFT.md
A production capability graph with 73 nodes and 357K activations reached 99.8% edge-weight saturation: nearly every edge pinned at maximum, with no discrimination between frequent and rare connections. The graph had remembered everything and could distinguish nothing.
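The failure mode above can be reproduced in a few lines. This is an illustrative sketch, not the repo's implementation: the edge frequencies, learning rate, and decay constant are all assumed. It shows why bounded Hebbian potentiation without any decay drives frequent and rare edges alike to the ceiling, while even a mild multiplicative decay keeps weights ordered by co-activation frequency.

```python
import numpy as np

rng = np.random.default_rng(42)
n_edges = 100
# Hypothetical per-edge co-activation probabilities: a few frequent, many rare.
p_coactive = rng.choice([0.9, 0.05], size=n_edges, p=[0.2, 0.8])

def run(decay, lr=0.1, w_max=1.0, steps=5000):
    w = np.zeros(n_edges)
    for _ in range(steps):
        coactive = rng.random(n_edges) < p_coactive
        w += lr * coactive * (w_max - w)  # bounded Hebbian potentiation
        w *= decay                        # homeostatic-style multiplicative decay
    return w

w_no_decay = run(decay=1.0)    # pure accumulation: everything saturates
w_decay = run(decay=0.995)     # decay forces an equilibrium per edge
print(f"no decay:   {np.mean(w_no_decay > 0.99):.0%} of edges saturated")
print(f"with decay: weights still track frequency "
      f"(corr = {np.corrcoef(w_decay, p_coactive)[0, 1]:.2f})")
```

Without decay, even an edge that co-activates 5% of the time accumulates hundreds of potentiation events over 5000 steps and converges to the ceiling, erasing the frequency signal.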
We implemented six forgetting mechanisms drawn from neuroscience (temporal decay, homeostatic scaling, Hebbian unlearning, consolidation by similarity, stale edge pruning, contextual inhibition) and validated each through controlled experiments.
Strongest result: Removing STDP drops directional sequence discrimination from 100% to 48% (chance). Cohen's d = 3.163, p < 0.000001. Hebbian learning alone detects co-occurrence but cannot distinguish A->B from B->A.
Validated mechanisms:
- STDP provides 100% of directional awareness (ablation: 100% -> 48%)
- Homeostatic scaling prevents saturation (ablation: 14.45% ceiling without decay)
- BCM strengthens encoding by 58% (ablation: -58% encoding strength)
- Drift tracking confirmed as pure observer (zero learning impact, as designed)
- Neural recruitment improves bridge discovery by 121% (Cohen's d = 19.8)
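The STDP result above comes down to the sign asymmetry of the kernel: pre-before-post potentiates, post-before-pre depresses. A minimal sketch, using the repo's reported numbers (tau ~999.5 s, exact 2:1 potentiation bias) but with assumed amplitudes:

```python
import numpy as np

# Assumed amplitudes; the 2:1 ratio and tau come from the reported results.
A_PLUS, A_MINUS, TAU = 0.10, 0.05, 999.5

def stdp_kernel(dt):
    """Weight change for a pre->post lag dt (seconds); dt >= 0 means pre fired first."""
    if dt >= 0:
        return A_PLUS * np.exp(-dt / TAU)   # causal order: potentiate
    return -A_MINUS * np.exp(dt / TAU)      # anti-causal order: depress

# Observing A before B strengthens the A->B edge and weakens B->A.
# A symmetric Hebbian co-occurrence count cannot express this, which is
# why ablating STDP collapses order discrimination to chance.
print(stdp_kernel(10.0), stdp_kernel(-10.0))
```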
Falsified hypotheses:
- Classical STDP timescales do not apply (tau = 999.5s vs. biological 20ms)
- Weight distribution is unimodal, not bimodal (no w_min clamp in source)
- Timescale ordering inverted from prediction
- Network topology is hub-and-spoke, not small-world (C_ratio = 1.19)
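The first falsification (tau = 999.5 s, not the biological ~20 ms) follows from fitting an exponential to observed weight changes versus event lag. A sketch of that fit on synthetic data, assuming the data-generation details (lag range, noise level) since they are not specified here:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)

def expdecay(dt, amplitude, tau):
    return amplitude * np.exp(-dt / tau)

# Hypothetical measurements: weight changes at various pre->post lags,
# generated from a decay constant near the reported 999.5 s, plus noise.
lags = np.linspace(1, 4000, 80)
dw = expdecay(lags, 0.10, 999.5) + rng.normal(0, 0.001, lags.size)

(amplitude, tau), _ = curve_fit(expdecay, lags, dw, p0=(0.1, 100.0))
print(f"fitted tau = {tau:.1f} s")  # ~50,000x the classical ~20 ms STDP window
```

A timescale this long is consistent with the events being discrete capability activations rather than spikes, which is why the classical window does not transfer.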
```
capability-graph-neuroscience/
├── PAPER_DRAFT.md     Paper draft with full methodology and references
├── experiments/       18 experiments, each with:
│   ├── experiment_NN_*.py         Python script (reproducible, seed=42)
│   ├── experiment_NN_data.json    Raw measurements
│   ├── experiment_NN_plot.png     6-panel visualization
│   └── experiment_NN_report.md    Full analysis report
├── docs/              Additional documentation
├── src/               Source modules under validation
├── analysis/          Analysis scripts
└── results/           Consolidated results
```
| # | Test | Result | Key metric |
|---|---|---|---|
| 1 | STDP kernel shape | PASS | Exact 2:1 potentiation bias (R^2 = 1.0) |
| 2 | STDP temporal precision | FAIL | tau = 999.5 s, not the classical ~20 ms |
| 3 | Weight distribution | FAIL | Unimodal, not bimodal |
| 4 | Convergence rate | PASS | 0.9 decay rate (R^2 = 1.0) |
| 5 | Per-edge theta dynamics | PASS | 67% per-edge variance (live data) |
| 6 | Consolidation effect | PASS | 15.5x learning-rate difference |
| 7 | Pattern persistence | PASS | All constellations 100-160x below threshold |
| 8 | Drift velocity | PASS | Formula verified exactly (R^2 = 1.0) |
| 9 | Perturbation recovery | PASS | 100% recovery in mean 79 events |
| 10 | Multi-timescale | FAIL | Timescale ordering inverted |
| 11 | Small-world topology | FAIL | Core-periphery, not small-world (C_ratio = 1.19) |
| 12 | Full system baseline | PASS | Convergence at event 1050 |
| 13 | Remove STDP | PASS | Order discrimination: 100% -> 48% |
| 14 | Remove homeostatic scaling | PASS | 14.45% ceiling saturation |
| 15 | Remove BCM | PASS | -58% encoding strength |
| 16 | Remove drift | PASS | Zero learning impact (pure observer) |
| 17 | Production graph analysis | PASS | 8 critical discoveries, including edge saturation |
| 18 | Neural recruitment ablation | PASS | 121% bridge-discovery improvement (d = 19.8) |
Overall: 13/18 PASS, 4/18 FAIL, 1 production analysis
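The BCM results in the table (tests 5, 6, and 15) rest on a sliding modification threshold theta. A minimal sketch with illustrative constants (learning rates and the theta update rule are assumptions, not the repo's values): the same input potentiates a quiet edge but depresses an over-active one, and each edge carries its own theta, matching the per-edge variance found in test 5.

```python
import numpy as np

def bcm_step(w, pre, post, theta, lr=0.01, theta_rate=0.1):
    """One BCM update: plasticity sign depends on post relative to theta."""
    w += lr * pre * post * (post - theta)   # potentiate if post > theta, else depress
    theta += theta_rate * (post**2 - theta) # threshold slides toward recent activity
    return w, theta

w, theta = 0.5, 0.25
for _ in range(50):
    w, theta = bcm_step(w, pre=1.0, post=1.0, theta=theta)
print(f"w={w:.3f}, theta={theta:.3f}")
```

As theta rises toward the sustained activity level, further potentiation self-limits, which is one route to the stronger-but-bounded encoding the ablation in test 15 removes.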
Requires Python 3.12, NumPy, SciPy, Matplotlib.
```
cd experiments/
python experiment_01_stdp_kernel.py
```
Each script writes its own `_data.json`, `_plot.png`, and `_report.md`. Seeds are fixed at 42.
- Python 3.12
- NumPy / SciPy / Matplotlib
- Soul Matrix: Rust (512x512 Hebbian conductance matrix, ~112us inference)
- Capability Graph: Python (Hebbian + STDP + homeostatic scaling + BCM + recruitment)
CIPS Corp LLC -- February 2026
Contact: glass@cipscorps.io
MIT