Stars
📰 Must-read papers and blogs on LLM-based Long Context Modeling 🔥
Chinese-LLaMA 1&2 and Chinese-Falcon base models; ChatFlow Chinese dialogue model; Chinese OpenLLaMA model; NLP pre-training / instruction fine-tuning datasets
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821
[NeurIPS 2021] WRENCH: Weak supeRvision bENCHmark
TorchKGE: Knowledge Graph embedding in Python and PyTorch.
A PyTorch library with state-of-the-art architectures, pretrained models, and continuously updated results
Multi-Task Deep Neural Networks for Natural Language Understanding
ACL 2020: A Re-evaluation of Knowledge Graph Completion Methods
Must-read papers on graph neural networks (GNN)
The new Windows Terminal and the original Windows console host, all in the same place!
An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Ongoing research training transformer models at scale
Resource scheduling and cluster management for AI
PyTorch implementation of "Attention Is All You Need" (see the scaled dot-product attention sketch after this list)
Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017)
Embedded and mobile deep learning research resources
Elastic Deep Learning (EDL) for deep learning frameworks on Kubernetes
Macro Continuous Evaluation Platform for Paddle.
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Deep Learning Visualization Toolkit (PaddlePaddle deep learning visualization tool)
A TensorFlow Implementation of the Transformer: Attention Is All You Need
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
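Several of the starred repositories above implement the Transformer from "Attention Is All You Need". Below is a minimal, self-contained sketch of the scaled dot-product attention at its core, written in plain PyTorch; the function name, shapes, and toy tensors are illustrative assumptions and are not taken from any of the listed projects.

```python
# Minimal sketch of scaled dot-product attention (Vaswani et al., 2017).
# Hypothetical helper for illustration only; not any listed repo's API.
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, heads, seq_len, d_k); mask broadcastable to the score matrix."""
    d_k = q.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)      # (batch, heads, seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                     # attention distribution
    return weights @ v, weights                             # weighted values, weights

# Toy usage: batch of 2, 4 heads, sequence length 5, head dimension 8.
q = k = v = torch.randn(2, 4, 5, 8)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape)   # torch.Size([2, 4, 5, 8])
```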