sosk@preferred.jp
ML, NLP, CV
- Preferred Networks, Inc. @pfnet
- Tokyo, Japan
- https://soskek.github.io/
- @sosk_sosk
Pinned
- pfnet-research/distilled-feature-fields (Public)
- bert-chainer (Public) — Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
- attention_is_all_you_need (Public) — Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.
- pfnet-research/contextual_augmentation (Public) — Contextual augmentation, a text data augmentation using a bidirectional language model.