Commit fceeb38

draft
1 parent 6a9b384 commit fceeb38

12 files changed: +25 −2386 lines changed

docs/source/ar/notebooks.md

Lines changed: 0 additions & 1 deletion
@@ -39,7 +39,6 @@
  | [How to fine-tune a model on summarization](https://github.com/huggingface/notebooks/blob/main/examples/summarization.ipynb)| Shows how to preprocess the data and fine-tune a pretrained model on XSUM. | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/summarization.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/notebooks/blob/main/examples/summarization.ipynb)|
  | [How to train a language model from scratch](https://github.com/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)| Highlights all the steps to effectively train a Transformer model on custom data | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)|
  | [How to generate text](https://github.com/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)| How to use different decoding methods for language generation with transformers | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)|
- | [How to generate text (with constraints)](https://github.com/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)| How to guide language generation with user-provided constraints | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)|
  | [Reformer](https://github.com/huggingface/blog/blob/main/notebooks/03_reformer.ipynb)| How Reformer pushes the limits of language modeling | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/patrickvonplaten/blog/blob/main/notebooks/03_reformer.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/patrickvonplaten/blog/blob/main/notebooks/03_reformer.ipynb)|

  #### Computer Vision[[pytorch-cv]]

docs/source/en/internal/generation_utils.md

Lines changed: 0 additions & 26 deletions
@@ -201,32 +201,6 @@ A [`StoppingCriteria`] can be used to change when to stop generation (other than
  [[autodoc]] EosTokenCriteria
      - __call__

- ## Constraints
-
- A [`Constraint`] can be used to force the generation to include specific tokens or sequences in the output. Please note that this is exclusively available to our PyTorch implementations.
-
- [[autodoc]] Constraint
-
- [[autodoc]] PhrasalConstraint
-
- [[autodoc]] DisjunctiveConstraint
-
- [[autodoc]] ConstraintListState
-
- ## BeamSearch
-
- [[autodoc]] BeamScorer
-     - process
-     - finalize
-
- [[autodoc]] BeamSearchScorer
-     - process
-     - finalize
-
- [[autodoc]] ConstrainedBeamSearchScorer
-     - process
-     - finalize
-
  ## Streamers

  [[autodoc]] TextStreamer
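The `Constraints` and `BeamSearch` entries removed above documented the objects behind constrained beam search. For reference, here is a minimal sketch of how that API was driven through `generate()` on a release that still exports these classes; the `t5-small` checkpoint and the forced word "Sie" are illustrative choices only.

```python
# Minimal sketch of the constrained-generation API whose docs are removed above.
# Assumes a transformers release that still exports PhrasalConstraint.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, PhrasalConstraint

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: How old are you?", return_tensors="pt")

# Force the token sequence for "Sie" to appear in the output; constrained
# beam search only runs when num_beams > 1.
force_ids = tokenizer("Sie", add_special_tokens=False).input_ids
outputs = model.generate(
    **inputs,
    constraints=[PhrasalConstraint(force_ids)],
    num_beams=5,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```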

docs/source/ja/internal/generation_utils.md

Lines changed: 0 additions & 26 deletions
@@ -303,32 +303,6 @@ generation_output[:2]
  [[autodoc]] MaxTimeCriteria
      - __call__

- ## Constraints
-
- A [`Constraint`] can be used to force the generation to include specific tokens or sequences in the output. Note that this is only available for the PyTorch implementations.
-
- [[autodoc]] Constraint
-
- [[autodoc]] PhrasalConstraint
-
- [[autodoc]] DisjunctiveConstraint
-
- [[autodoc]] ConstraintListState
-
- ## BeamSearch
-
- [[autodoc]] BeamScorer
-     - process
-     - finalize
-
- [[autodoc]] BeamSearchScorer
-     - process
-     - finalize
-
- [[autodoc]] ConstrainedBeamSearchScorer
-     - process
-     - finalize
-
  ## Streamers

  [[autodoc]] TextStreamer

docs/source/ko/internal/generation_utils.md

Lines changed: 0 additions & 26 deletions
@@ -308,32 +308,6 @@ generation_output[:2]
  [[autodoc]] EosTokenCriteria
      - __call__

- ## Constraint [[transformers.Constraint]]
-
- A [`Constraint`] is used to force specific tokens or sequences to be included in the generated output. This feature is only available for the PyTorch implementations.
-
- [[autodoc]] Constraint
-
- [[autodoc]] PhrasalConstraint
-
- [[autodoc]] DisjunctiveConstraint
-
- [[autodoc]] ConstraintListState
-
- ## Beam Search (BeamSearch) [[transformers.BeamScorer]]
-
- [[autodoc]] BeamScorer
-     - process
-     - finalize
-
- [[autodoc]] BeamSearchScorer
-     - process
-     - finalize
-
- [[autodoc]] ConstrainedBeamSearchScorer
-     - process
-     - finalize
-
  ## Streamers [[transformers.TextStreamer]]

  [[autodoc]] TextStreamer

docs/source/zh/internal/generation_utils.md

Lines changed: 0 additions & 26 deletions
@@ -298,32 +298,6 @@ generation_output[:2]
  [[autodoc]] MaxTimeCriteria
      - __call__

- ## Constraints
-
- A [`Constraint`] can be used to force the generated result to include specific tokens or sequences in the output. Note that this only applies to our PyTorch implementation.
-
- [[autodoc]] Constraint
-
- [[autodoc]] PhrasalConstraint
-
- [[autodoc]] DisjunctiveConstraint
-
- [[autodoc]] ConstraintListState
-
- ## BeamSearch
-
- [[autodoc]] BeamScorer
-     - process
-     - finalize
-
- [[autodoc]] BeamSearchScorer
-     - process
-     - finalize
-
- [[autodoc]] ConstrainedBeamSearchScorer
-     - process
-     - finalize
-
  ## Streamers

  [[autodoc]] TextStreamer

notebooks/README.md

Lines changed: 0 additions & 1 deletion
@@ -56,7 +56,6 @@ You can open any page of the documentation as a notebook in Colab (there is a bu
  | [How to fine-tune a model on summarization](https://github.com/huggingface/notebooks/blob/main/examples/summarization.ipynb)| Show how to preprocess the data and fine-tune a pretrained model on XSUM. | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/summarization.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/notebooks/blob/main/examples/summarization.ipynb)|
  | [How to train a language model from scratch](https://github.com/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)| Highlight all the steps to effectively train Transformer model on custom data | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/01_how_to_train.ipynb)|
  | [How to generate text](https://github.com/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)| How to use different decoding methods for language generation with transformers | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/02_how_to_generate.ipynb)|
- | [How to generate text (with constraints)](https://github.com/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)| How to guide language generation with user-provided constraints | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/huggingface/blog/blob/main/notebooks/53_constrained_beam_search.ipynb)|
  | [Reformer](https://github.com/huggingface/blog/blob/main/notebooks/03_reformer.ipynb)| How Reformer pushes the limits of language modeling | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/patrickvonplaten/blog/blob/main/notebooks/03_reformer.ipynb)| [![Open in AWS Studio](https://studiolab.sagemaker.aws/studiolab.svg)](https://studiolab.sagemaker.aws/import/github/patrickvonplaten/blog/blob/main/notebooks/03_reformer.ipynb)|

  #### Computer Vision[[pytorch-cv]]
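The deleted row points to the constrained-beam-search notebook. A hedged sketch of its topic, using the higher-level `force_words_ids` argument instead of constructing constraint objects directly, assuming a transformers release where constrained decoding is still supported (checkpoint and forced word are illustrative):

```python
# Hedged sketch of user-provided constraints via force_words_ids; assumes a
# transformers release that still supports constrained beam search.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: How old are you?", return_tensors="pt")

# Each inner list is a tokenized phrase that must appear in the output.
force_words_ids = tokenizer(["Sie"], add_special_tokens=False).input_ids

outputs = model.generate(
    **inputs,
    force_words_ids=force_words_ids,
    num_beams=5,  # constrained decoding requires beam search
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Both the kwarg and explicit `Constraint` objects feed the same constrained beam search path; the kwarg is the shortcut for forcing whole words or phrases.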

src/transformers/__init__.py

Lines changed: 0 additions & 12 deletions
@@ -409,13 +409,7 @@
  "AlternatingCodebooksLogitsProcessor",
  "BayesianDetectorConfig",
  "BayesianDetectorModel",
- "BeamScorer",
- "BeamSearchScorer",
  "ClassifierFreeGuidanceLogitsProcessor",
- "ConstrainedBeamSearchScorer",
- "Constraint",
- "ConstraintListState",
- "DisjunctiveConstraint",
  "EncoderNoRepeatNGramLogitsProcessor",
  "EncoderRepetitionPenaltyLogitsProcessor",
  "EosTokenCriteria",
@@ -654,14 +648,8 @@
  from .generation import AsyncTextIteratorStreamer as AsyncTextIteratorStreamer
  from .generation import BayesianDetectorConfig as BayesianDetectorConfig
  from .generation import BayesianDetectorModel as BayesianDetectorModel
- from .generation import BeamScorer as BeamScorer
- from .generation import BeamSearchScorer as BeamSearchScorer
  from .generation import ClassifierFreeGuidanceLogitsProcessor as ClassifierFreeGuidanceLogitsProcessor
  from .generation import CompileConfig as CompileConfig
- from .generation import ConstrainedBeamSearchScorer as ConstrainedBeamSearchScorer
- from .generation import Constraint as Constraint
- from .generation import ConstraintListState as ConstraintListState
- from .generation import DisjunctiveConstraint as DisjunctiveConstraint
  from .generation import EncoderNoRepeatNGramLogitsProcessor as EncoderNoRepeatNGramLogitsProcessor
  from .generation import EncoderRepetitionPenaltyLogitsProcessor as EncoderRepetitionPenaltyLogitsProcessor
  from .generation import EosTokenCriteria as EosTokenCriteria

src/transformers/generation/__init__.py

Lines changed: 0 additions & 14 deletions
@@ -35,18 +35,6 @@
  except OptionalDependencyNotAvailable:
      pass
  else:
-     _import_structure["beam_constraints"] = [
-         "Constraint",
-         "ConstraintListState",
-         "DisjunctiveConstraint",
-         "PhrasalConstraint",
-     ]
-     _import_structure["beam_search"] = [
-         "BeamHypotheses",
-         "BeamScorer",
-         "BeamSearchScorer",
-         "ConstrainedBeamSearchScorer",
-     ]
      _import_structure["candidate_generator"] = [
          "AssistedCandidateGenerator",
          "CandidateGenerator",
@@ -208,8 +196,6 @@
  except OptionalDependencyNotAvailable:
      pass
  else:
-     from .beam_constraints import Constraint, ConstraintListState, DisjunctiveConstraint, PhrasalConstraint
-     from .beam_search import BeamHypotheses, BeamScorer, BeamSearchScorer, ConstrainedBeamSearchScorer
      from .candidate_generator import (
          AssistedCandidateGenerator,
          CandidateGenerator,
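Both hunks in this file touch the package's lazy-import registry: public names are listed per submodule in `_import_structure`, and the matching `from .<submodule> import ...` statements only run on the eager path (for example during type checking). Below is a generic sketch of that pattern, as an illustration of the idea rather than the exact transformers implementation; the module names in the comment are examples only.

```python
# Generic illustration of a lazy-import registry like the one edited above;
# a simplified sketch, not the transformers internal implementation.
import importlib
from types import ModuleType


class LazyModule(ModuleType):
    def __init__(self, name: str, import_structure: dict[str, list[str]]):
        super().__init__(name)
        # e.g. {"candidate_generator": ["AssistedCandidateGenerator", ...]}
        self._import_structure = import_structure
        self._symbol_to_submodule = {
            symbol: submodule
            for submodule, symbols in import_structure.items()
            for symbol in symbols
        }

    def __dir__(self):
        return sorted(set(super().__dir__()) | set(self._symbol_to_submodule))

    def __getattr__(self, name):
        # Import the owning submodule only when one of its symbols is first used.
        submodule = self._symbol_to_submodule.get(name)
        if submodule is None:
            raise AttributeError(f"module {self.__name__!r} has no attribute {name!r}")
        module = importlib.import_module(f".{submodule}", self.__name__)
        value = getattr(module, name)
        setattr(self, name, value)  # cache so later lookups skip __getattr__
        return value
```

In an `__init__.py`, an instance of such a class would typically replace `sys.modules[__name__]`, so deleting a symbol from the registry (as this commit does) removes it from the public surface without importing anything eagerly.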

0 commit comments
