- NTTDATA
- JAPAN
- https://plus.google.com/+%E4%BA%8E%E7%B4%85%E7%B4%85
Stars
Source code to djangoproject.com
The Web framework for perfectionists with deadlines.
Your self-hosted, globally interconnected microblogging community
Keras implementation of BERT with pre-trained weights
Use Google's BERT for named entity recognition, with CoNLL-2003 as the dataset.
Multi-label classification with BERT; fine-grained sentiment analysis on the AI Challenger dataset
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-training applied to TextCNN
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Google AI 2018 BERT PyTorch implementation
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (see the sketch after this list).
TensorFlow code and pre-trained models for BERT
Xlib in pure Python (Py2/Py3 compatible)
Public facing notes page
Make images smaller using best-in-class codecs, right in the browser.
freeCodeCamp.org's open-source codebase and curriculum. Learn to code for free.
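Many of the starred repositories above build on pre-trained BERT weights. As a rough illustration (not taken from any of these projects), the sketch below uses the 🤗 Transformers library from the list to load a pre-trained BERT model and extract contextual token embeddings; the model name `bert-base-uncased` and the example sentence are assumptions chosen for the demo.

```python
# Minimal sketch: load pre-trained BERT via 🤗 Transformers and get
# contextual token embeddings. "bert-base-uncased" and the sample
# sentence are illustrative choices, not tied to any repo above.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT produces contextual token embeddings.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per input token (including [CLS]/[SEP]).
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```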