Update README.md
XueFuzhao authored Apr 21, 2022
1 parent e5ab4f2 commit f6a1694
Showing 1 changed file with 3 additions and 0 deletions.
README.md: 3 additions & 0 deletions
@@ -29,6 +29,8 @@ This repo is a collection of AWESOME things about mixture-of-experts, including


**Arxiv**
- Residual Mixture of Experts [[20 Apr 2022]](https://arxiv.org/abs/2204.09636)
- On the Representation Collapse of Sparse Mixture of Experts [[20 Apr 2022]](https://arxiv.org/abs/2204.09179)
- StableMoE: Stable Routing Strategy for Mixture of Experts [[18 Apr 2022]](https://arxiv.org/abs/2204.08396)
- Sparsely Activated Mixture-of-Experts are Robust Multi-Task Learners [[16 Apr 2022]](https://arxiv.org/abs/2204.07689)
- MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation [[15 Apr 2022]](https://arxiv.org/abs/2204.07675)
@@ -76,6 +78,7 @@ This repo is a collection of AWESOME things about mixture-of-experts, including
## MoE Application

**Arxiv**
- Build a Robust QA System with Transformer-based Mixture of Experts [[20 Apr 2022]](https://arxiv.org/abs/2204.09598)
- Mixture of Experts for Biomedical Question Answering [[15 Apr 2022]](https://arxiv.org/abs/2204.07469)

# Library