A JAX/Flax implementation for Korean-language LLM inference on TPUs.
Updated Jun 12, 2023 · Python
Example code for prefix-tuning GPT/GPT-NeoX models and for inference with trained prefixes
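At inference time, prefix-tuning works by prepending trained "virtual token" key/value vectors to the attention cache, so the frozen model attends to the learned prefix alongside the real input. A minimal single-head sketch of that idea (all names and shapes here are illustrative, not taken from the repository):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend_with_prefix(q, k, v, prefix_k, prefix_v):
    # Prefix-tuning at inference: trained prefix key/value vectors are
    # concatenated in front of the input's keys and values, so every
    # query position can also attend to the learned virtual tokens.
    k = np.concatenate([prefix_k, k], axis=0)   # (p + n, d)
    v = np.concatenate([prefix_v, v], axis=0)   # (p + n, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])     # (n, p + n)
    return softmax(scores) @ v                  # (n, d)

rng = np.random.default_rng(0)
n, p, d = 4, 2, 8                               # input tokens, prefix length, head dim
q, k, v = (rng.normal(size=(n, d)) for _ in range(3))
prefix_k, prefix_v = (rng.normal(size=(p, d)) for _ in range(2))  # stands in for a trained prefix
out = attend_with_prefix(q, k, v, prefix_k, prefix_v)
print(out.shape)                                # one output vector per input token: (4, 8)
```

Because only the prefix vectors are trained, the base GPT/GPT-NeoX weights stay frozen; the same prefix can then be loaded for inference as the repository's example code does.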
Megatron-LM/GPT-NeoX compatible Text Encoder with 🤗Transformers AutoTokenizer.
A command-line tool for batch-querying large language models with multiple prompts at once.
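The core of such a batch-query tool is fanning prompts out concurrently and collecting answers in order. A sketch of that pattern with the standard library; `query_model` is a hypothetical stand-in, since the tool's actual backend is not shown in the description:

```python
import concurrent.futures

def query_model(prompt: str) -> str:
    # Hypothetical placeholder for a real LLM API call.
    return f"response to: {prompt}"

def batch_query(prompts, max_workers=4):
    # Send all prompts concurrently; executor.map preserves input order,
    # so results line up with the original prompt list.
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as ex:
        return list(ex.map(query_model, prompts))

prompts = ["Summarize X.", "Translate Y.", "Explain Z."]
for prompt, answer in zip(prompts, batch_query(prompts)):
    print(f"{prompt} -> {answer}")
```

Threads suit this workload because each query is I/O-bound (waiting on a remote model), so the GIL is not a bottleneck.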
A fully generated Discord bot, built on the GPT-NeoX model (GPT-3-style), that performs various NLP tasks.
An experimental Python app that uses the gpt-neox-20b model to generate Bash code from requests given in natural language.