Advanced quantization toolkit for LLMs and VLMs. Native support for WOQ, MXFP4, NVFP4, GGUF, and Adaptive Bits, plus seamless integration with Transformers, vLLM, SGLang, and TorchAO (a minimal NVFP4 sketch follows the listing).
ARCQuant: Boosting Fine-Grained Quantization with Augmented Residual Channels for LLMs
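The NVFP4 format named in the listing above is a block-scaled 4-bit floating-point layout. Since the listing does not explain it, here is a minimal NumPy sketch of the idea under the usual description (FP4 E2M1 element values, one scale per 16-element block). The function names are illustrative, not any library's API, and the sketch omits the FP8 (E4M3) storage of the block scale and the per-tensor FP32 scale used in practice.

```python
# Minimal sketch of NVFP4-style block quantization (illustrative only).
# Assumes FP4 E2M1 values and one scale per 16-element block; the FP8
# encoding of the scale and the per-tensor FP32 scale are omitted.
import numpy as np

# Non-negative magnitudes representable in FP4 E2M1 (sign kept separately).
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])
BLOCK_SIZE = 16


def quantize_block(x: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize one 16-element block to FP4 values and a per-block scale."""
    assert x.size == BLOCK_SIZE
    amax = float(np.abs(x).max())
    scale = amax / FP4_GRID[-1] if amax > 0 else 1.0  # map block max onto 6.0
    scaled = x / scale
    # Snap each magnitude to the nearest representable FP4 value.
    idx = np.abs(np.abs(scaled)[:, None] - FP4_GRID[None, :]).argmin(axis=1)
    return np.sign(scaled) * FP4_GRID[idx], scale


def dequantize_block(q: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct approximate full-precision values from FP4 codes."""
    return q * scale


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.normal(size=BLOCK_SIZE).astype(np.float32)
    q, s = quantize_block(block)
    print("max abs error:", np.abs(block - dequantize_block(q, s)).max())
```

The per-block scale is what keeps the 8-value FP4 grid usable: each 16-element block is rescaled so its largest magnitude lands on the top of the grid before rounding.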