The ultimate collection of Wasm AI & LLM examples. Run Llama 3, Mistral, and Gemma locally or on the edge with WebAssembly. Official repo for WasmInference.com.
Updated Jan 3, 2026
Make HTTP requests from inside Wasm with wasmedge-quickjs. Includes a devcontainer setup.
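A minimal sketch of what such a request can look like. This assumes the wasmedge-quickjs runtime, whose `http` module exposes an async `fetch()`; the URL is a placeholder, and the script is run with WasmEdge rather than Node.js:

```javascript
// Sketch: fetch a URL from inside a Wasm sandbox via wasmedge-quickjs.
// Assumes the runtime's 'http' module; run with e.g.:
//   wasmedge --dir .:. wasmedge_quickjs.wasm fetch_example.js
import { fetch } from 'http'

async function main() {
  try {
    // Placeholder endpoint -- substitute any HTTP service reachable
    // from the sandbox (network access must be permitted by the host).
    let resp = await fetch('http://example.com/')
    print('status:', resp.status)
    print(await resp.text())
  } catch (e) {
    print('request failed:', e)
  }
}

main()
```

Note that outbound networking only works when the WasmEdge host enables its socket support; inside a plain browser Wasm sandbox this module is not available.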