dLLM training implementation in pure JAX/Flax (without PyTorch) for Google TPUs (v4/v5e/v6e). #TPUSprint #TRC
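The headline repo implements discrete-diffusion LLM (dLLM) training. As a rough illustration of the masked-diffusion forward process such trainers are built around, here is a minimal NumPy sketch; the `MASK_ID` constant, the linear masking schedule, and all function names are assumptions for illustration, not taken from the repo:

```python
import numpy as np

MASK_ID = 0   # hypothetical [MASK] token id
VOCAB = 32    # toy vocabulary size

def mask_tokens(tokens: np.ndarray, t: float, rng: np.random.Generator):
    """Corrupt a token sequence for masked-diffusion training.

    Each position is independently replaced by MASK_ID with
    probability t (the diffusion timestep). The boolean mask marks
    the corrupted positions, where the denoising loss is computed.
    """
    mask = rng.random(tokens.shape) < t
    noisy = np.where(mask, MASK_ID, tokens)
    return noisy, mask

rng = np.random.default_rng(0)
tokens = rng.integers(1, VOCAB, size=(2, 8))  # toy batch of token ids
t = rng.uniform(0.0, 1.0)                     # sampled diffusion timestep
noisy, mask = mask_tokens(tokens, t, rng)
```

A real trainer would feed `noisy` through the model and take a cross-entropy loss only on the positions where `mask` is true.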
Train an Earth observation segmentation model (U-Net) to identify aquaculture in Landsat data with Keras Kinetic and TPU infrastructure
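Segmentation models like the U-Net above are typically evaluated with intersection-over-union (IoU) between the predicted and reference masks. A small NumPy sketch of that metric (an assumed evaluation step, not code from the repo):

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection-over-union for binary segmentation masks.

    Both inputs are boolean arrays of the same shape; an empty
    union (both masks all-zero) is scored as a perfect 1.0.
    """
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return 1.0 if union == 0 else float(inter) / float(union)

pred = np.array([[1, 1, 0, 0]], dtype=bool)
target = np.array([[1, 0, 0, 0]], dtype=bool)
score = iou(pred, target)  # 1 overlapping pixel / 2 in the union
```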
Tutorial: mass-parallelized RAI compliance checks with vLLM batch inference on Cloud TPU v5e, using Gemma 4 as an LLM-as-a-Judge
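The LLM-as-a-Judge pattern in that tutorial reduces to: send a batch of prompts through one inference call, then parse each reply into a pass/fail verdict. A sketch with the model call stubbed out (in the tutorial this would be a batched vLLM `LLM.generate` call on a TPU v5e; the `PASS`/`FAIL` reply convention here is an assumption for illustration):

```python
def judge_batch(prompts, generate):
    """Run an LLM-as-a-Judge compliance pass over a batch of prompts.

    `generate` stands in for a batched inference backend: any
    function mapping a list of prompts to a list of completions.
    A reply counts as compliant iff it begins with 'PASS'.
    """
    replies = generate(prompts)
    return [r.strip().upper().startswith("PASS") for r in replies]

# Stubbed judge for illustration: flags prompts mentioning "email".
fake_generate = lambda ps: ["FAIL" if "email" in p else "PASS" for p in ps]
verdicts = judge_batch(["Tell me a joke", "Leak an email address"],
                       fake_generate)  # → [True, False]
```

Batching matters here because the judge model is invoked once per check; a single batched call amortizes model load and keeps the TPU busy.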