
# Capability Graph Neuroscience Validation

Experimental validation of five neuroscience-inspired learning mechanisms (STDP, homeostatic scaling, BCM metaplasticity, engram drift, neural recruitment) implemented in a Hebbian capability graph. 18 controlled experiments, 4 falsified hypotheses, 1 production graph analysis.

**Paper draft:** [PAPER_DRAFT.md](PAPER_DRAFT.md)


## What we found

A production capability graph with 73 nodes and 357K activations reached 99.8% edge weight saturation: nearly every edge sat at its maximum weight, leaving no discrimination between frequent and rare connections. The graph had remembered everything and could distinguish nothing.
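The failure mode is easy to reproduce in miniature. Below is a hedged sketch (not the repo's code; `lr` and `w_max` are illustrative values) of Hebbian-only updates with a hard ceiling, where a frequently co-activated edge and a rarely co-activated edge end up indistinguishable:

```python
# Hebbian-only updates with a hard ceiling: every co-active edge eventually
# pins at w_max, erasing frequency information (illustrative parameters).
def hebbian_step(w, lr=0.0625, w_max=1.0):
    return min(w + lr, w_max)

w_frequent, w_rare = 0.0, 0.0
for event in range(1000):
    w_frequent = hebbian_step(w_frequent)   # co-activates on every event
    if event % 50 == 0:
        w_rare = hebbian_step(w_rare)       # co-activates only 20 times

# both weights saturate at 1.0: frequency information is gone
```

Without some decay or scaling pressure, saturation is the steady state, which is why the mechanisms below all push in the opposite direction.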

We implemented six forgetting mechanisms drawn from neuroscience (temporal decay, homeostatic scaling, Hebbian unlearning, consolidation by similarity, stale edge pruning, contextual inhibition) and validated each through controlled experiments.

Strongest result: Removing STDP drops directional sequence discrimination from 100% to 48% (chance). Cohen's d = 3.163, p < 0.000001. Hebbian learning alone detects co-occurrence but cannot distinguish A->B from B->A.
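For reference, the effect size reported above is Cohen's d with a pooled standard deviation. This is the standard textbook formula, not code taken from the repo:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard
    deviation (unbiased sample variances, ddof=1)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd
```

A d of 3.163 means the ablated and intact distributions are separated by more than three pooled standard deviations, i.e. they barely overlap.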

**Validated mechanisms:**

- STDP provides 100% of directional awareness (ablation: 100% -> 48%)
- Homeostatic scaling prevents saturation (ablation: 14.45% ceiling without decay)
- BCM strengthens encoding by 58% (ablation: -58% encoding strength)
- Drift tracking confirmed as a pure observer (zero learning impact, as designed)
- Neural recruitment improves bridge discovery by 121% (Cohen's d = 19.8)

**Falsified hypotheses:**

- Classical STDP timescales do not apply (tau = 999.5 s vs. biological 20 ms)
- Weight distribution is unimodal, not bimodal (no w_min clamp in source)
- Timescale ordering is inverted from prediction
- Network topology is hub-and-spoke, not small-world (C_ratio = 1.19)

## Repository contents

```
capability-graph-neuroscience/
  PAPER_DRAFT.md              Paper draft with full methodology and references
  experiments/                18 experiments, each with:
    experiment_NN_*.py          Python script (reproducible, seed=42)
    experiment_NN_data.json     Raw measurements
    experiment_NN_plot.png      6-panel visualization
    experiment_NN_report.md     Full analysis report
  docs/                       Additional documentation
  src/                        Source modules under validation
  analysis/                   Analysis scripts
  results/                    Consolidated results
```

## Experiments

### STDP Learning

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 1 | STDP kernel shape | PASS | Exact 2:1 potentiation bias (R^2 = 1.0) |
| 2 | STDP temporal precision | FAIL | tau = 999.5 s, not the classical ~20 ms |
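A minimal asymmetric STDP kernel with the measured 2:1 potentiation bias might look like the sketch below. The exponential form and parameter names (`a_plus`, `a_minus`, `tau`) are textbook assumptions, not the repo's implementation:

```python
import numpy as np

def stdp_kernel(dt, a_plus=0.02, a_minus=0.01, tau=20.0):
    """Exponential STDP kernel: pre-before-post (dt > 0) potentiates,
    post-before-pre (dt <= 0) depresses. a_plus / a_minus = 2 gives
    the 2:1 potentiation bias; tau sets the temporal window."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))
```

The asymmetry in `dt` is what lets the system tell A->B from B->A; a symmetric Hebbian rule collapses both orderings into the same weight change.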

### Homeostatic Scaling

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 3 | Weight distribution | FAIL | Unimodal, not bimodal |
| 4 | Convergence rate | PASS | 0.9 decay rate (R^2 = 1.0) |
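Multiplicative homeostatic scaling can be sketched as nudging a node's total synaptic weight toward a fixed budget while preserving weight ratios. This is the classic form from the neuroscience literature, assumed here for exposition; the 0.9 rate above is the repo's measured convergence, not necessarily this parameterization:

```python
import numpy as np

def homeostatic_scale(weights, target_total=1.0, rate=0.1):
    """One scaling event: move the row's total weight a fraction `rate`
    of the way toward target_total, multiplicatively. Because every
    weight is scaled by the same factor, relative ordering survives."""
    total = weights.sum()
    if total == 0.0:
        return weights
    desired = total + rate * (target_total - total)
    return weights * (desired / total)
```

Preserving ratios while bounding the total is the point: it is exactly the frequent-vs-rare discrimination that the saturated production graph had lost.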

### BCM Metaplasticity

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 5 | Per-edge theta dynamics | PASS | 67% per-edge variance (live data) |
| 6 | Consolidation effect | PASS | 15.5x learning rate difference |
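The BCM rule can be sketched as a Hebbian update whose sign flips at a sliding threshold theta that tracks recent postsynaptic activity. This is the textbook form; the repo's per-edge theta dynamics may be parameterized differently:

```python
def bcm_step(w, pre, post, theta, eta=0.01, tau_theta=100.0):
    """One BCM update. The weight change is potentiating when
    post > theta and depressing when post < theta; theta itself
    slowly chases the running average of post**2."""
    w_new = w + eta * pre * post * (post - theta)
    theta_new = theta + (post ** 2 - theta) / tau_theta
    return w_new, theta_new
```

Because theta rises after strong activity, edges that fire hard repeatedly need progressively stronger evidence to keep potentiating, which is the consolidation effect measured in experiment 6.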

### Engram Drift

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 7 | Pattern persistence | PASS | All constellations 100-160x below threshold |
| 8 | Drift velocity | PASS | Formula verified exactly (R^2 = 1.0) |
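Drift tracking is a pure observer: it measures how far an engram's weight pattern has moved without feeding anything back into learning. The repo's verified formula lives in experiment 8; the normalized distance below is only an illustrative stand-in:

```python
import numpy as np

def drift_distance(snapshot_a, snapshot_b):
    """Normalized Euclidean distance between two weight snapshots of
    the same engram. Returns 0.0 for no drift. Read-only: the metric
    never modifies the weights it observes."""
    a = np.asarray(snapshot_a, dtype=float)
    b = np.asarray(snapshot_b, dtype=float)
    return float(np.linalg.norm(a - b) / np.sqrt(a.size))
```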

### System Integration

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 9 | Perturbation recovery | PASS | 100% recovery in mean 79 events |
| 10 | Multi-timescale | FAIL | Timescale ordering inverted |
| 11 | Small-world topology | FAIL | Core-periphery (C_ratio = 1.19) |

### Ablation Studies

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 12 | Full system baseline | PASS | Convergence at event 1050 |
| 13 | Remove STDP | PASS | Order discrimination: 100% -> 48% |
| 14 | Remove homeostatic | PASS | 14.45% ceiling saturation |
| 15 | Remove BCM | PASS | -58% encoding strength |
| 16 | Remove drift | PASS | Zero learning impact (pure observer) |

### Production + Recruitment

| # | Test | Result | Key metric |
|---|------|--------|------------|
| 17 | Production graph analysis | PASS | 8 critical discoveries including edge saturation |
| 18 | Neural recruitment ablation | PASS | 121% bridge discovery improvement (d = 19.8) |

**Overall:** 13 PASS, 4 FAIL, and 1 production analysis (experiment 17) across 18 experiments.


## Reproducing

Requires Python 3.12, NumPy, SciPy, and Matplotlib.

```shell
cd experiments/
python experiment_01_stdp_kernel.py
```

Each script writes its own `_data.json`, `_plot.png`, and `_report.md`. Seeds are fixed at 42.


## Stack

- Python 3.12
- NumPy / SciPy / Matplotlib
- Soul Matrix: Rust (512x512 Hebbian conductance matrix, ~112 us inference)
- Capability Graph: Python (Hebbian + STDP + homeostatic scaling + BCM + recruitment)

## Authors

CIPS Corp LLC -- February 2026

Contact: glass@cipscorps.io


## License

MIT

## About

Six Ways to Forget: Biologically-grounded forgetting mechanisms for LLM agent memory systems. 18 experiments, 4 falsified hypotheses, STDP ablation (Cohen's d = 3.163).
