Add support for args to ProcessorMixin for backward compatibility #33479
Changes from 7 commits
@@ -38,7 +38,9 @@
from .tokenization_utils_base import (
    PaddingStrategy,
    PreTokenizedInput,
    PreTrainedTokenizerBase,
    TextInput,
    TruncationStrategy,
)
from .utils import (
@@ -114,6 +116,9 @@ class TextKwargs(TypedDict, total=False):
        The side on which padding will be applied.
    """

    text_pair: Optional[Union[TextInput, PreTokenizedInput, List[TextInput], List[PreTokenizedInput]]]
    text_target: Union[TextInput, PreTokenizedInput, List[TextInput], List[PreTokenizedInput]]
    text_pair_target: Optional[Union[TextInput, PreTokenizedInput, List[TextInput], List[PreTokenizedInput]]]
    add_special_tokens: Optional[bool]
    padding: Union[bool, str, PaddingStrategy]
    truncation: Union[bool, str, TruncationStrategy]
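Because `TextKwargs` is declared with `total=False`, every key is optional and callers may pass any subset. A minimal sketch of that pattern, using a hypothetical trimmed-down class (not the full transformers definition), where only keys declared on the TypedDict are kept:

```python
from typing import Optional, TypedDict, Union


# Hypothetical, trimmed-down analogue of TextKwargs: with total=False,
# every key is optional, so callers may supply any subset of them.
class TextKwargsSketch(TypedDict, total=False):
    add_special_tokens: Optional[bool]
    padding: Union[bool, str]
    truncation: Union[bool, str]


def collect_text_kwargs(**kwargs) -> TextKwargsSketch:
    # Keep only the keys declared on the TypedDict; ignore everything else.
    allowed = TextKwargsSketch.__annotations__.keys()
    return {k: v for k, v in kwargs.items() if k in allowed}


opts = collect_text_kwargs(padding=True, color="red")
# opts == {"padding": True}  -- "color" is not a declared key and is dropped
```

At runtime a TypedDict is a plain `dict`; the declared keys only drive static type checking, which is why a filter like the one above is needed if undeclared keys should be rejected or dropped.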
@@ -328,6 +333,7 @@ class ProcessorMixin(PushToHubMixin):

    attributes = ["feature_extractor", "tokenizer"]
    optional_attributes = ["chat_template"]
    optional_call_args: List[str] = []
    # Names need to be attr_class for attr in attributes
    feature_extractor_class = None
    tokenizer_class = None
@@ -973,6 +979,64 @@ def validate_init_kwargs(processor_config, valid_kwargs):
        unused_kwargs = {k: processor_config[k] for k in unused_keys}
        return unused_kwargs

    def prepare_and_validate_optional_call_args(self, *args):
        """
Very nice docstring :)

That's from @leloykun :)
        Matches optional positional arguments to their corresponding names in `optional_call_args`
        in the processor class in the order they are passed to the processor call.

        Note that this should only be used in the `__call__` method of processors with special
        arguments. Special arguments are arguments that aren't `text`, `images`, `audio`, nor `videos`
        but also aren't passed to the tokenizer, image processor, etc. Examples of such processors are:
            - `CLIPSegProcessor`
            - `LayoutLMv2Processor`
            - `OwlViTProcessor`

        Also note that passing by position to the processor call is now deprecated and will be disallowed
        in future versions. We only have this for backward compatibility.

        Example:
            Suppose that the processor class has `optional_call_args = ["arg_name_1", "arg_name_2"]`.
            And we define the call method as:
            ```python
            def __call__(
                self,
                text: str,
                images: Optional[ImageInput] = None,
                *args,
                audio=None,
                videos=None,
            )
            ```

            Then, if we call the processor as:
            ```python
            images = [...]
            processor("What is common in these images?", images, arg_value_1, arg_value_2)
            ```

            Then, this method will return:
            ```python
            {
                "arg_name_1": arg_value_1,
                "arg_name_2": arg_value_2,
            }
            ```
            which we could then pass as kwargs to `self._merge_kwargs`
        """
        if len(args):
            warnings.warn(
                "Passing positional arguments to the processor call is now deprecated and will be disallowed in v4.47. "
                "Please pass all arguments as keyword arguments."
            )
        if len(args) > len(self.optional_call_args):
            raise ValueError(
                f"Expected *at most* {len(self.optional_call_args)} optional positional arguments in processor call, "
                f"which will be matched with {' '.join(self.optional_call_args)} in the order they are passed. "
                f"However, got {len(args)} positional arguments instead. "
                "Please pass all arguments as keyword arguments instead (e.g. `processor(arg_name_1=..., arg_name_2=...)`)."
            )
        return {arg_name: arg_value for arg_value, arg_name in zip(args, self.optional_call_args)}
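The matching logic above can be exercised in isolation. Below is a standalone sketch of the same name-to-value pairing, pulled out of `ProcessorMixin` for illustration (the function name and the example argument names are hypothetical, not part of the PR):

```python
import warnings
from typing import Any, Dict, List


def match_optional_call_args(optional_call_args: List[str], *args: Any) -> Dict[str, Any]:
    """Standalone sketch of prepare_and_validate_optional_call_args' pairing logic."""
    if args:
        # Positional use is deprecated; warn but still handle it for back-compat.
        warnings.warn(
            "Passing positional arguments to the processor call is deprecated. "
            "Please pass all arguments as keyword arguments."
        )
    if len(args) > len(optional_call_args):
        raise ValueError(
            f"Expected at most {len(optional_call_args)} optional positional arguments, "
            f"got {len(args)}."
        )
    # Pair each positional value with the next declared name; extra names are left out.
    return dict(zip(optional_call_args, args))


# Hypothetical processor with optional_call_args = ["boxes", "word_labels"],
# called with only one positional value:
matched = match_optional_call_args(["boxes", "word_labels"], [[0, 0, 1, 1]])
# matched == {"boxes": [[0, 0, 1, 1]]}
```

Because `zip` stops at the shorter sequence, passing fewer positional values than declared names simply leaves the trailing names unmatched, which is exactly what the backward-compatibility path needs.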

    def apply_chat_template(
        self,
        conversation: Union[List[Dict[str, str]]],
thanks for cleaning these up!
@zucchini-nlp It seems that the `Unpack` was in the init instead of the call function of the processor. Do you think it's worth a separate PR, or can we include it here?
oops, yes, it can be here, thanks!

Regarding adding `audio=None` before `videos` when audio won't be used for this model: it seems counterintuitive to me.

@molbap welcome back! We had questions about changing the order of the main input args in your absence, and this is also distantly related to that. Should we be adding unused args in this order?
hey! glad to be back, github notifications are hefty 😁

Keeping `audio=None` is a bit strange, but it's the price of always having the same constant-size signature. I'd be pro keeping the same order as well: in terms of integrating transformers into other tools, it makes things (a bit) easier and would give processors more future impact. But if you can think of something smart along the lines of informing a user which modalities/types the model accepts, I'm open to it!

edit: pressed enter too soon
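The constant-size signature discussed above might look like the following. This is a hypothetical sketch (class and behavior are illustrative, not from the PR): an image-text processor that still declares `audio` and `videos` so every processor call has the same shape, and rejects the modalities it does not support.

```python
from typing import Any, Optional


class ImageTextProcessorSketch:
    """Hypothetical processor with the uniform text/images/audio/videos signature."""

    def __call__(
        self,
        text: Optional[str] = None,
        images: Optional[Any] = None,
        audio: Optional[Any] = None,   # unused by this model, kept for a uniform API
        videos: Optional[Any] = None,  # unused by this model, kept for a uniform API
        **kwargs: Any,
    ) -> dict:
        # Unsupported modalities are declared but explicitly rejected, so the
        # signature stays constant across processors while misuse still fails loudly.
        if audio is not None or videos is not None:
            raise ValueError("This processor does not accept audio or videos.")
        return {"text": text, "images": images, **kwargs}


out = ImageTextProcessorSketch()("a caption", images=[[0]])
# out == {"text": "a caption", "images": [[0]]}
```

The trade-off is exactly the one raised in the thread: callers always see the same four slots in the same order, at the cost of carrying `None`-only parameters on models that will never use them.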