Minimal reproduction of OneRec
Updated Nov 5, 2025 · Python
A toolkit for scaling law research ⚖
Official code for the ICLR 2025 paper, "Scaling Offline Model-Based RL via Jointly-Optimized World-Action Model Pretraining"
RL on GSM8K for the qwen3-base family of models using verl: is there an RL power law on downstream tasks?
[NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power laws that achieves state-of-the-art results across diverse benchmarks.
[ICML 2023] "Data Efficient Neural Scaling Law via Model Reusing" by Peihao Wang, Rameswar Panda, Zhangyang Wang
A method for calculating scaling laws for LLMs from publicly available models
code for Scaling Laws for Language Transfer Learning
[ACL2025 Oral] Cuckoo: A Series of IE Free Riders Using LLM's Resources to Scale up Themselves.
AI-based scaling law discovery
Official implementation of "Optimal Sparsity of Mixture-of-Experts Language Models for Reasoning Tasks"
Optimization and Scaling of Medium-Frequency Transformers
🌹[ICML 2024] Selecting Large Language Model to Fine-tune via Rectified Scaling Law
Code for the ICML 2025 paper "How Do Large Language Monkeys Get Their Power (Laws)?"
Do dense LMs develop MoE-like specialization as they scale? Measure it, visualize it, and turn it into speed.
RSRC Calculator is a practical tool for evaluating AI model efficiency in the post-scaling era. Based on Recursive Self-Referential Compression (RSRC), it computes training-efficiency metrics from factors such as training FLOPs, energy consumption, and model architecture details.
We study the scaling law of large language models, observing that there is an optimal model size for a given computational budget.