
Commit 28f83e4

Update README.md
Reorganize
1 parent a11bf4c commit 28f83e4

File tree

1 file changed: +8 −9 lines changed


README.md

Lines changed: 8 additions & 9 deletions
@@ -5,9 +5,14 @@
   <img src="https://camo.githubusercontent.com/64f8905651212a80869afbecbf0a9c52a5d1e70beab750dea40a994fa9a9f3c6/68747470733a2f2f617765736f6d652e72652f62616467652e737667" alt="Awesome" data-canonical-src="https://awesome.re/badge.svg" style="max-width: 100%;">
 </p>
 
-A curated (still actively updated) list of practical guide resources of LLMs. It's based on our survey paper: [Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond](https://arxiv.org/abs/2304.13712). The survey is partially based on the second half of this [Blog](https://jingfengyang.github.io/gpt).
+A curated (still actively updated) list of practical guide resources for LLMs. It's based on our survey paper: [Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond](https://arxiv.org/abs/2304.13712). The survey is partially based on the second half of this [Blog](https://jingfengyang.github.io/gpt). We also build an evolutionary tree of modern Large Language Models (LLMs) to trace the development of language models in recent years and highlight some of the most well-known models.
+
+These sources aim to help practitioners navigate the vast landscape of large language models (LLMs) and their applications in natural language processing (NLP). If you find any resources in our repository helpful, please feel free to use them (don't forget to cite our paper! 😃). We welcome pull requests to refine this figure!
+
+<p align="center">
+<img width="600" src="./imgs/models-colorgrey.jpg"/>
+</p>
 
-These sources aim to help practitioners navigate the vast landscape of large language models (LLMs) and their applications in natural language processing (NLP) applications. If you find any resources in our repository helpful, please feel free to use them (don't forget to cite our paper! 😃)
 
 ```bibtex
 @article{yang2023harnessing,
@@ -25,7 +30,7 @@ These sources aim to help practitioners navigate the vast landscape of large lan
 - We released the source file for the still version [pptx](./source/figure_still.pptx), and replaced the figure in this repo with the still version. [4/29/2023]
 - Add AlexaTM, UniLM, UniLMv2 to the figure, and correct the logo for Tk. [4/29/2023]
 
-We welcome pull requests to refine this figure, and if you find the source helpful, please cite our paper.
+
 
 
 ## Other Practical Guides for LLMs
@@ -62,12 +67,6 @@ We welcome pull requests to refine this figure, and if you find the source helpf
 
 ## Practical Guide for Models
 
-We build an evolutionary tree of modern Large Language Models (LLMs) to trace the development of language models in recent years and highlights some of the most well-known models, in the following figure:
-
-<p align="center">
-<img width="600" src="./imgs/models-colorgrey.jpg"/>
-</p>
-
 ### BERT-style Language Models: Encoder-Decoder or Encoder-only
 
 - BERT **BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding**, 2018, [Paper](https://aclanthology.org/N19-1423.pdf)
