🔥 Latest Features
- 🤗 Hugging Face ecosystem: we use the `datasets` library as the default dataset loader, providing access to a wealth of useful datasets.
- 📝 MindNLP supports NLP tasks such as language modeling, machine translation, question answering, sentiment analysis, sequence labeling, and summarization. You can access them through the examples.
- 🚀 MindNLP currently supports industry-leading Large Language Models (LLMs), including Llama, GLM, RWKV, etc. Support for large language models, including pre-training, fine-tuning, and inference demo examples, can be found in the "llm" directory.
- 🤗 Pretrained models support Hugging Face Transformers-like APIs, covering 28+ models such as BERT, RoBERTa, GPT-2, T5, etc. You can use them easily with the following code snippet:

```python
from mindnlp.models import BertModel

model = BertModel.from_pretrained('bert-base-cased')
```
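Because the APIs mirror Hugging Face Transformers, a typical encode-and-forward round trip looks like the sketch below. The `mindnlp.transforms.BertTokenizer` import path and the exact call conventions are assumptions for illustration, not guaranteed by this README:

```python
import numpy as np
import mindspore
from mindnlp.models import BertModel
from mindnlp.transforms import BertTokenizer  # assumed import path

tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = BertModel.from_pretrained('bert-base-cased')

# Tokenize one sentence to ids, add a batch dimension, and run a forward pass.
token_ids = tokenizer('MindNLP brings Transformers-style APIs to MindSpore.')
outputs = model(mindspore.Tensor(np.expand_dims(token_ids, 0)))
```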
Version Compatibility:
MindNLP version | MindSpore version | Supported Python version |
---|---|---|
master | daily build | >=3.7.5, <=3.9 |
0.1.1 | >=1.8.1, <=2.0.0 | >=3.7.5, <=3.9 |
0.2.0 | >=2.1.0 | >=3.7.5, <=3.9 |
You can download the MindNLP daily wheel from here.
To install MindNLP from source, please run:
```bash
pip install git+https://github.com/mindspore-lab/mindnlp.git
# or
git clone https://github.com/mindspore-lab/mindnlp.git
cd mindnlp
bash scripts/build_and_reinstall.sh
```
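After either install path, a quick import check confirms the package is available (assuming it exposes the conventional `__version__` attribute):

```python
# Sanity check: import MindNLP and print its version.
import mindnlp
print(mindnlp.__version__)  # __version__ is assumed, as in most Python packages
```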
MindNLP is an open source NLP library based on MindSpore. It provides a platform for solving natural language processing tasks, containing many common approaches in NLP. It can help researchers and developers construct and train models more conveniently and rapidly.
The master branch works with MindSpore master.
- Comprehensive data processing: several classical NLP datasets are packaged into friendly modules for easy use, such as Multi30k, SQuAD, CoNLL, etc.
- Friendly NLP model toolset: MindNLP provides various configurable components, making it easy to customize models.
- Easy-to-use engine: MindNLP simplifies the complicated training process in MindSpore. It provides Trainer and Evaluator interfaces to train and evaluate models easily (a sketch follows this list).
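As a rough sketch of the engine workflow, fine-tuning with `Trainer` might look like this; the import path and keyword arguments are illustrative assumptions based on the interfaces named above, and `train_dataset`/`eval_dataset` are placeholders for datasets built with MindNLP's loaders:

```python
from mindspore import nn
from mindnlp.engine import Trainer  # assumed import path
from mindnlp.models import BertModel

model = BertModel.from_pretrained('bert-base-cased')
loss_fn = nn.CrossEntropyLoss()
optimizer = nn.Adam(model.trainable_params(), learning_rate=2e-5)

trainer = Trainer(network=model,
                  train_dataset=train_dataset,  # placeholder: a MindSpore dataset
                  eval_dataset=eval_dataset,    # placeholder: a MindSpore dataset
                  epochs=3,
                  loss_fn=loss_fn,
                  optimizer=optimizer)
trainer.run(tgt_columns='label')  # assumed: names the target column
```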
The table below shows the library's current support for each of these models, i.e., whether they run in PyNative mode or Graph mode (a mode-switching snippet follows the table).
Model | PyNative support | Graph support |
---|---|---|
ALBERT | ✅ | ✅ |
Autoformer | ✅ (Inference only) | ❌ |
BaiChuan | ✅ | ❌ |
Bark | ✅ | ❌ |
BART | ✅ | ❌ |
BERT | ✅ | ✅ |
BLIP | TODO | ✅ |
BLIP2 | TODO | ✅ |
BLOOM | ✅ | ❌ |
ChatGLM | ✅ | ❌ |
ChatGLM2 | ✅ | ❌ |
ChatGLM3 | ✅ | ❌ |
CLIP | ✅ | ❌ |
CodeGen | ✅ | ❌ |
ConvBERT | TODO | ❌ |
CPM | ✅ | ❌ |
CPM-Ant | ✅ | ❌ |
CPM-Bee | ✅ | ❌ |
EnCodec | ✅ | ❌ |
ERNIE | ✅ | ✅ |
ERNIEM | ✅ | ✅ |
Falcon | ✅ | ❌ |
GLM | ✅ | ❌ |
OpenAI GPT | ✅ | ❌ |
OpenAI GPT-2 | ✅ | ✅ |
GPT Neo | ✅ | ❌ |
GPT NeoX | TODO | ❌ |
GPTBigCode | ✅ | ❌ |
Graphormer | ✅ | ❌ |
Llama | ✅ | ❌ |
Llama2 | ✅ | ❌ |
CodeLlama | ✅ | ❌ |
Longformer | ✅ | ❌ |
LongT5 | ✅ | ❌ |
LUKE | ✅ | ❌ |
MaskFormer | ✅ | ❌ |
mBART-50 | ✅ | ❌ |
Megatron-BERT | ✅ | ❌ |
Megatron-GPT2 | ✅ | ❌ |
MobileBERT | ✅ | ❌ |
Moss | ✅ | ❌ |
Nezha | ✅ | ❌ |
OPT | ✅ | ❌ |
Pangu | ✅ | ❌ |
Pop2piano | TODO | ❌ |
RoBERTa | ✅ | ✅ |
RWKV | ✅ | ❌ |
SeamlessM4T | ✅ | ❌ |
SeamlessM4Tv2 | ✅ | ❌ |
T5 | ✅ | ❌ |
TimeSformer | TODO | ❌ |
Tinybert | ✅ | ❌ |
Whisper | ✅ | ❌ |
XLM | ✅ | ❌ |
XLM-RoBERTa | ✅ | ❌ |
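Switching between the two execution modes goes through MindSpore's context API rather than MindNLP itself; for example:

```python
import mindspore

# PyNative (eager) execution, supported by most models above:
mindspore.set_context(mode=mindspore.PYNATIVE_MODE)

# Static-graph compilation, for models marked with Graph support:
mindspore.set_context(mode=mindspore.GRAPH_MODE)
```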
This project is released under the Apache 2.0 license.
The dynamic version is still under development. If you find any issues or have ideas for new features, please don't hesitate to contact us via GitHub Issues.
MindNLP is an open source project that welcomes any contribution and feedback.
We hope that the toolbox and benchmark can serve the growing research
community by providing a flexible as well as standardized toolkit to reimplement existing methods
and develop new NLP methods.
If you find this project useful in your research, please consider citing:
```bibtex
@misc{mindnlp2022,
    title={{MindNLP}: a MindSpore NLP library},
    author={MindNLP Contributors},
    howpublished={\url{https://github.com/mindlab-ai/mindnlp}},
    year={2022}
}
```