
Commit de75f90

Update library docs: adapter-transformers -> adapters (huggingface#1184)
* Update library docs: `adapter-transformers` -> `adapters`
* Update ToC and overview page.
* Add redirect to "adapters"
* Update adapters.md

1 parent c57c1b7 commit de75f90

File tree

5 files changed (+91, -70 lines)


docs/hub/_redirects.yml

Lines changed: 1 addition & 0 deletions
@@ -15,3 +15,4 @@ how-to-inference: /docs/huggingface_hub/how-to-inference
 searching-the-hub: /docs/huggingface_hub/searching-the-hub
 # end of first_section
 api-webhook: webhooks
+adapter-transformers: adapters

docs/hub/_toctree.yml

Lines changed: 2 additions & 2 deletions
@@ -57,8 +57,8 @@
 - local: models-libraries
   title: Integrated Libraries
   sections:
-    - local: adapter-transformers
-      title: Adapter Transformers
+    - local: adapters
+      title: Adapters
     - local: allennlp
       title: AllenNLP
     - local: bertopic

docs/hub/adapter-transformers.md

Lines changed: 0 additions & 67 deletions
This file was deleted.

docs/hub/adapters.md

Lines changed: 87 additions & 0 deletions
@@ -0,0 +1,87 @@

# Using _Adapters_ at Hugging Face

> Note: _Adapters_ has replaced the `adapter-transformers` library and is fully compatible in terms of model weights. See [here](https://docs.adapterhub.ml/transitioning.html) for more.

[_Adapters_](https://github.com/adapter-hub/adapters) is an add-on library to 🤗 `transformers` for efficiently fine-tuning pre-trained language models using adapters and other parameter-efficient methods. _Adapters_ also provides various methods for composition of adapter modules during training and inference. You can learn more about this in the [_Adapters_ paper](https://arxiv.org/abs/2311.11077).

## Exploring _Adapters_ on the Hub

You can find _Adapters_ models by filtering at the left of the [models page](https://huggingface.co/models?library=adapter-transformers&sort=downloads). Some adapter models can be found in the Adapter Hub [repository](https://github.com/adapter-hub/hub). Models from both sources are aggregated on the [AdapterHub website](https://adapterhub.ml/explore/).

## Installation

To get started, you can refer to the [AdapterHub installation guide](https://docs.adapterhub.ml/installation.html). You can also use the following one-line install through pip:

```bash
pip install adapters
```

## Using existing models

For a full guide on loading pre-trained adapters, we recommend checking out the [official guide](https://docs.adapterhub.ml/loading.html).

As a brief summary, a full setup consists of three steps:

1. Load a base `transformers` model with the `AutoAdapterModel` class provided by _Adapters_.
2. Use the `load_adapter()` method to load and add an adapter.
3. Activate the adapter via `active_adapters` (for inference) or activate and set it as trainable via `train_adapter()` (for training). Make sure to also check out [composition of adapters](https://docs.adapterhub.ml/adapter_composition.html); a minimal composition sketch follows the code example below.

```py
from adapters import AutoAdapterModel

# 1. Load a base model with adapter support
model = AutoAdapterModel.from_pretrained("roberta-base")
# 2. Load and add a pre-trained adapter from the Hub
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-imdb")
# 3. Activate the adapter for inference
model.active_adapters = adapter_name
# or model.train_adapter(adapter_name)
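
Building on step 3, here is a minimal sketch of adapter composition, assuming two adapters with the hypothetical names `"domain"` and `"sentiment"` were already loaded onto the model:

```py
from adapters.composition import Stack

# Assumption: adapters "domain" and "sentiment" were loaded beforehand,
# e.g. via model.load_adapter(...). Stack passes each layer's output of
# the first adapter through the second one.
model.active_adapters = Stack("domain", "sentiment")
```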

You can also use `list_adapters` to find all adapter models programmatically:

```py
from adapters import list_adapters

# source can be "ah" (AdapterHub), "hf" (hf.co) or None (for both, default)
adapter_infos = list_adapters(source="hf", model_name="roberta-base")
```
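
Each returned entry describes a single adapter; assuming the entries are `AdapterInfo` objects exposing `source` and `adapter_id` fields (as in current _Adapters_ releases), a quick way to inspect them:

```py
# Print where each adapter is hosted and its identifier.
for info in adapter_infos:
    print(info.source, info.adapter_id)
```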

If you want to see how to load a specific model, you can click `Use in Adapters` and you will be given a working snippet that you can use to load it!

<div class="flex justify-center">
<img class="block dark:hidden" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/libraries-adapter_transformers_snippet1.png"/>
<img class="hidden dark:block" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/libraries-adapter_transformers-snippet1-dark.png"/>
</div>

<div class="flex justify-center">
<img class="block dark:hidden" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/libraries-adapter_transformers_snippet2.png"/>
<img class="hidden dark:block" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/libraries-adapter_transformers-snippet2-dark.png"/>
</div>

## Sharing your models

For a full guide on sharing models with _Adapters_, we recommend checking out the [official guide](https://docs.adapterhub.ml/huggingface_hub.html#uploading-to-the-hub).

You can share your adapter by using the `push_adapter_to_hub` method from a model that already contains an adapter.
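
If your model does not contain an adapter yet, one can be added and trained first. A minimal sketch, reusing the hypothetical adapter name `awesome_adapter` from the example below:

```py
# Add a fresh adapter and freeze all other weights for training.
# "awesome_adapter" is a hypothetical name matching the push example below.
model.add_adapter("awesome_adapter")
model.train_adapter("awesome_adapter")
# ... run training (e.g. with adapters' AdapterTrainer), then push ...
```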

```py
model.push_adapter_to_hub(
    "my-awesome-adapter",
    "awesome_adapter",
    adapterhub_tag="sentiment/imdb",
    datasets_tag="imdb"
)
```

This command creates a repository with an automatically generated model card and all necessary metadata.
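
Anyone can then load the shared adapter back from the Hub; a sketch, with `my-username` as a hypothetical namespace:

```py
# Load the adapter from the newly created repository (hypothetical repo id)
# and activate it for inference.
adapter_name = model.load_adapter("my-username/my-awesome-adapter")
model.active_adapters = adapter_name
```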

## Additional resources

* _Adapters_ [repository](https://github.com/adapter-hub/adapters)
* _Adapters_ [docs](https://docs.adapterhub.ml)
* _Adapters_ [paper](https://arxiv.org/abs/2311.11077)
* Integration with Hub [docs](https://docs.adapterhub.ml/huggingface_hub.html)

docs/hub/models-libraries.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ The table below summarizes the supported libraries and their level of integration
 
 | Library | Description | Inference API | Widgets | Download from Hub | Push to Hub |
 |-----------------------------------------------------------------------------|--------------------------------------------------------------------------------------|---|---:|---|---|
-| [Adapter Transformers](./adapter-transformers) | Extends 🤗Transformers with Adapters. |||||
+| [Adapters](https://github.com/adapter-hub/adapters) | A unified Transformers add-on for parameter-efficient and modular fine-tuning. |||||
 | [AllenNLP](./allennlp) | An open-source NLP research library, built on PyTorch. |||||
 | [Asteroid](./asteroid) | PyTorch-based audio source separation toolkit |||||
 | [BERTopic](./bertopic) | BERTopic is a topic modeling library for text and images |||||
