eightnight2049/VPC
VPC: Let OOD Features Explore Vast Predefined Classifiers

🔰 Overview

A unified, extensible OOD detection playground built around auxiliary Outlier Exposure (OE). In one repo, you can mix and match multiple methods, backbones, score functions, and training paradigms (two-stage & one-stage), with ready-to-run scripts for pretraining ID models (various base losses) and joint ID+OE training. This makes it fast to reproduce, compare, and understand modern OOD techniques end-to-end.

📌 Abstract

Real-world OOD data are broad and non-stationary, making ID-only training brittle. VPC equips a model with a pre-specified Orthogonal Equiangular Feature Space (OEFS) that allocates two complementary subspaces: EBVs for ID classes and a large bank of VEBVs for OOD. With evidential priors, ENC aligns ID features to their class EBVs, preserving accuracy and calibrated evidence. A VEBV loss simultaneously drives OE features to explore the VEBV subspace, yielding rich OOD structure and near-orthogonal separation. We introduce the VPC Score—the ℓ₂ activation magnitude over predefined classifiers—as a simple, class-agnostic OOD signal. On CIFAR-10/100 across standard backbones, VPC delivers leading FPR@95/AUROC/AUPR, demonstrating the effectiveness of Vast Predefined Classifiers for OOD detection.
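As a concrete reading of the score definition above, here is a minimal sketch of an ℓ₂-magnitude score over a bank of fixed classifier vectors (the function name, shapes, and toy bank sizes are ours, not the repo's API):

```python
import torch

def vpc_score(features: torch.Tensor, classifiers: torch.Tensor) -> torch.Tensor:
    """ℓ2 magnitude of activations over a bank of fixed classifier vectors.

    features:    (batch, dim) penultimate-layer features
    classifiers: (num_vectors, dim) predefined EBV/VEBV directions
    """
    activations = features @ classifiers.t()  # (batch, num_vectors)
    return activations.norm(p=2, dim=1)       # one class-agnostic score per sample

# toy shapes: 10 class EBVs plus a 1000-vector VEBV bank in a 128-d space
feats = torch.randn(4, 128)
bank = torch.randn(1010, 128)
scores = vpc_score(feats, bank)
```

Because the score depends only on activation magnitude over the predefined bank, no class label or softmax is involved.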


📊 Visualization

We visualize EBV/VEBV activations in the OEFS under multiple scoring functions, and include representative failure cases to contrast behaviors on hard samples. See the paper for more visualizations and analysis. Swipe left to right; click an image to view the full-size original in a new tab.

[Figures] Fig 1 · CIFAR-10 — EBV/VEBV · Fig 2 · CIFAR-100 — EBV/VEBV · Figs 3–7 · Near-OOD cases

📑 Results

            CIFAR-10             CIFAR-100
Method      FPR95 ↓   AUROC ↑    FPR95 ↓   AUROC ↑
OE           3.44      99.05      36.14     92.76
Energy-OE    3.75      98.66      40.34     91.69
DAL          3.17      98.84      32.89     93.21
PFS          2.68      98.66      34.35     93.33
Ours         2.27      99.18      32.04     93.65

⚙️ Installation & Environment

  • Python 3.7.13 (3.8/3.9 also fine if your setup supports them)
  • PyTorch 1.13.1, torchvision 0.14.1
  • CUDA, NumPy
# (recommended) create a fresh env
conda create -n vpc python=3.9 -y
conda activate vpc

# install torch/vision matching your CUDA
pip install torch==1.13.1 torchvision==0.14.1  # add +cuXXX wheels if needed
pip install numpy tqdm matplotlib scipy

Note: For strict reproducibility, keep Python/PyTorch versions and random seeds consistent with your logs/paper.


📦 Datasets

(i) ID datasets

  • CIFAR-10 / CIFAR-100 (auto-download if missing)

(ii) Auxiliary OE data (outliers used during training)

(iii) OOD test data

Note: For all OOD and OE datasets, consolidate the images into a single directory (i.e., flatten the folder structure).
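A small helper (ours, not part of the repo) that flattens a nested image folder into a single directory, prefixing filenames with the parent folder name to avoid collisions:

```python
import shutil
from pathlib import Path

def flatten_images(src_root: str, dst_dir: str) -> int:
    """Copy every image under src_root (recursively) into one flat directory.

    Filenames are prefixed with their parent folder name so that files with
    the same name in different subfolders do not overwrite each other.
    Returns the number of files copied.
    """
    exts = {".png", ".jpg", ".jpeg", ".bmp"}
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    count = 0
    for p in Path(src_root).rglob("*"):
        if p.is_file() and p.suffix.lower() in exts:
            shutil.copy2(p, dst / f"{p.parent.name}_{p.name}")
            count += 1
    return count
```

Point it at each downloaded OE/OOD dataset root and use the flat output directory in your configs.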


🔧 EBVs / VEBVs Preparation

Open oefs/EBV.ipynb and generate the EBVs / VEBVs (ETF-structured). Configure the paths correctly in the _ENC model and in main.py. Optional parameters include the number of EBVs and their dimensionality; we recommend following the settings in the paper.

Note: NC/ENC methods require --num_VEBVs. Larger values offer a “wider” OOD subspace but increase memory/compute cost.
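For reference, one standard recipe for ETF-structured vectors is the simplex ETF; the NumPy sketch below (names ours) builds K unit vectors whose pairwise cosine similarity is exactly −1/(K−1). It requires dim ≥ K, so a vast VEBV bank in a smaller feature dimension would need a different near-equiangular construction; follow the notebook for the exact procedure.

```python
import numpy as np

def simplex_etf(num_vectors: int, dim: int, seed: int = 0) -> np.ndarray:
    """(num_vectors, dim) simplex ETF: unit-norm rows with pairwise cosine
    similarity exactly -1/(K-1). Requires dim >= num_vectors."""
    k = num_vectors
    assert dim >= k, "simplex ETF construction needs dim >= num_vectors"
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((dim, k)))  # d x K orthonormal columns
    center = np.eye(k) - np.ones((k, k)) / k            # remove the common mean
    return (np.sqrt(k / (k - 1)) * (q @ center)).T      # (K, dim)

M = simplex_etf(10, 128)
gram = M @ M.T  # diagonal 1, off-diagonal -1/9
```

The Gram matrix makes the equiangular structure easy to verify: ones on the diagonal and a constant −1/(K−1) everywhere else.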


🚀 Training

We support Two-Stage and One-Stage paradigms.

(i) Two-Stage Training

① Pre-train (ID only)

This step builds the ID-only pretrained model for the staged pipeline and supports multiple backbones and base losses.

# CE
python main.py cifar10 --learning_rate 0.1 --epochs 200 \
  --train_mode pre_train --model wrnet --method ce 

# NC (requires VEBVs)
python main.py cifar10 --learning_rate 0.1 --epochs 200 \
  --train_mode pre_train --model wrnet --method nc --num_VEBVs 1000 

# ENC (requires VEBVs)
python main.py cifar10 --learning_rate 0.1 --epochs 200 \
  --train_mode pre_train --model wrnet --method enc --num_VEBVs 1000 

Note: Both cifar10/cifar100 and wrnet/resnet/densenet are supported. NC/ENC require --num_VEBVs (recommended 500/1000/2000).
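For intuition, training against fixed (predefined) classifiers is commonly implemented as cross-entropy over scaled cosine similarities to the frozen EBV bank. The sketch below illustrates that idea; it is an assumption on our part, not the repo's exact NC/ENC loss.

```python
import torch
import torch.nn.functional as F

def fixed_classifier_loss(features, ebvs, labels, scale=10.0):
    """Cross-entropy over scaled cosine similarities between features and a
    frozen bank of class EBVs. features: (B, d), ebvs: (C, d), labels: (B,).

    The EBVs receive no gradient; only the backbone learns, pulling each ID
    feature toward its class's predefined direction.
    """
    logits = scale * (F.normalize(features, dim=1) @ F.normalize(ebvs, dim=1).t())
    return F.cross_entropy(logits, labels)

# toy usage: 8 samples, 16-d features, 5 fixed class vectors
loss = fixed_classifier_loss(torch.randn(8, 16), torch.randn(5, 16),
                             torch.randint(0, 5, (8,)))
```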

② Post-train (Joint ID + OE)

This step performs joint ID+OE fine-tuning on top of the pretrained model and supports multiple backbones and base losses.

# OE
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method oe --score_type msp

# Energy-OE
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method energy-oe --score_type energy

# DAL
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method dal --score_type msp

# PFS
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method pfs --score_type pfs

# NC (requires VEBVs)
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method nc --num_VEBVs 1000 --score_type nc_score

# VPC / ENC (requires VEBVs)
python main.py cifar10 --learning_rate 0.07 --epochs 50 \
  --train_mode post_train --load_pretrain true --model wrnet \
  --method enc --num_VEBVs 1000 --score_type enc_score

Important: Configure the pretrained ID model path in main.py; otherwise, training will start from scratch.

(ii) One-Stage Training (Joint ID + OE)

This step trains on ID+OE jointly in a single stage and supports multiple backbones and base losses.

# OE
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method oe --score_type msp

# Energy-OE
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method energy-oe --score_type energy

# DAL
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method dal --score_type msp

# PFS
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method pfs --score_type pfs

# NC (requires VEBVs)
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method nc --num_VEBVs 1000 --score_type nc_score

# VPC / ENC (requires VEBVs)
python main.py cifar10 --learning_rate 0.1 --epochs 150 \
  --train_mode post_train --model wrnet --method enc --num_VEBVs 1000 --score_type enc_score

🧪 Evaluation

# VPC / ENC score
python test.py cifar10 --model wrnet --num_VEBVs 1000 \
  --score_type enc_score --model_path path/to/ckpt.pt

# Other methods (example: ℓ2 logits)
python test.py cifar10 --model wrnet \
  --score_type l2_logits --model_path path/to/ckpt.pt

Common flags:

  • --visual true to enable visualization
  • --near_ood to enable near-OOD testing
  • --num_VEBVs for NC/ENC (recommended 500/1000/2000)
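For reference, the reported FPR@95 and AUROC can be computed from ID/OOD score arrays as follows (helper names are ours; scores follow the higher-is-more-ID convention):

```python
import numpy as np

def auroc(id_scores, ood_scores):
    """AUROC via the Mann-Whitney statistic: the probability that a random
    ID sample scores above a random OOD sample (ties count half)."""
    diff = np.asarray(id_scores)[:, None] - np.asarray(ood_scores)[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

def fpr_at_95_tpr(id_scores, ood_scores):
    """Fraction of OOD samples scoring above the threshold that retains
    95% of ID samples."""
    thresh = np.percentile(id_scores, 5)  # 95% of ID scores lie above this
    return float(np.mean(np.asarray(ood_scores) >= thresh))

id_s = np.linspace(1.0, 2.0, 100)    # well-separated toy scores
ood_s = np.linspace(-1.0, 0.0, 100)
print(auroc(id_s, ood_s), fpr_at_95_tpr(id_s, ood_s))  # → 1.0 0.0
```

The pairwise formulation of AUROC is O(n·m) in memory; for large score sets a rank-based implementation (e.g., sklearn's roc_auc_score) is preferable.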

🔢 Score functions

Key           Description
enc_score     VPC Score
msp           Maximum Softmax Probability
energy        Energy score
uncertainty   Evidential uncertainty
edl_prob      Evidential probability
pfs           PFS score
l2_logits     ℓ2 norm of logits/activations
...

Note: Keys must match the implementations in test.py. If your code uses different names, adjust accordingly.
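A few of these scores have standard definitions that can be sketched directly from logits (normalization or temperature in test.py may differ):

```python
import torch
import torch.nn.functional as F

def msp_score(logits):
    """Maximum Softmax Probability; higher = more ID."""
    return F.softmax(logits, dim=1).max(dim=1).values

def energy_score(logits, temperature=1.0):
    """Negative free energy, T * logsumexp(logits / T); higher = more ID."""
    return temperature * torch.logsumexp(logits / temperature, dim=1)

def l2_logits_score(logits):
    """ℓ2 norm of the logit vector."""
    return logits.norm(p=2, dim=1)

logits = torch.tensor([[10.0, 0.0],   # confident, ID-like
                       [0.0, 0.0]])   # uniform, OOD-like
```

All three return one scalar per sample and rank the confident row above the uniform one.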


🙏 Acknowledgements

VPC builds on prior excellent codebases (e.g., Outlier Exposure (OE), Energy-based OOD, DAL: Learning to Augment Distributions for OOD Detection, PFS: Pursuing Feature Separation). If you find VPC helpful, please also consider citing those works.

