♻️ Export individual functions, precise typing #148
Conversation
vvmnnnkv left a comment
Really nice to have each task function in its own file!
Some suggestions:
- Put the task function files in a folder like `tasks`, so they are not mixed with internal methods
- Perhaps even split them into sub-folders like `nlp`, `cv`, `multimodal`, etc., following the HF models classification (see the sketch after this list)
- Move args/response types to the corresponding task function file
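For illustration only, a layout along those lines could re-export each task from its own file via a barrel module; the folder and file names below are assumptions, not the actual structure of this PR:

```ts
// src/index.ts — hypothetical barrel file re-exporting task functions
// grouped by modality (names are illustrative only)
export { textGeneration } from "./tasks/nlp/textGeneration";
export { translation } from "./tasks/nlp/translation";
export { imageClassification } from "./tasks/cv/imageClassification";
export { visualQuestionAnswering } from "./tasks/multimodal/visualQuestionAnswering";

// Internal helpers would live outside tasks/, so they are not mixed
// with the public task functions.
```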
👍
And we could move the types as well.
The thing is that some types are reused across tasks. Although we could duplicate the types, I guess :)
More future-proof.
Good suggestions ^^
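As a sketch of that trade-off (the type names here are hypothetical, not the package's actual exports): a shared base type could live in a common module, while task-specific args/response types sit next to their task function:

```ts
// Hypothetical shared base type, reused across task arg types
export interface BaseArgs {
  /** Optional access token for authenticated requests. */
  accessToken?: string;
  /** Model to run inference on. */
  model?: string;
}

// Task-specific types would extend the shared base and live next to
// their task function (e.g. in the text-generation file)
export interface TextGenerationArgs extends BaseArgs {
  inputs: string;
}

export interface TextGenerationOutput {
  generated_text: string;
}
```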
Force-pushed from 9640108 to 2168123
Merging to unblock parallel PRs, waiting a bit before release, feel free to comment in the meantime.
cc @Grsmto @vvmnnnkv @TimMikeladze
Inference endpoints do not accept `model` anymore.
Also, individual functions for the various tasks are exported, taking an additional optional `accessToken` parameter.
This is a big refactor even if the API keeps backwards compatibility, so I'd like some input :)
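A minimal usage sketch from the consumer side, assuming `textGeneration` is one of the exported task functions (the model id and inputs are illustrative):

```ts
import { textGeneration } from "@huggingface/inference";

// Call a single exported task function directly; the access token
// is optional and only needed for gated models or higher rate limits.
const output = await textGeneration({
  accessToken: "hf_xxx", // optional
  model: "gpt2",
  inputs: "The answer to the universe is",
});

console.log(output.generated_text);
```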
This moves the Inference API closer to `@huggingface/hub`, and conversely, we can also move the `@huggingface/hub` API closer to the inference module by offering a `new HfHub("hf_...")` class.
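Purely as a sketch of that idea (no such class exists in the package; every name here is hypothetical), an `HfHub` wrapper could hold the credential once instead of passing it on every call:

```ts
// Hypothetical HfHub class holding a credential; hub helpers would
// then be called as methods that inject the stored token.
class HfHub {
  constructor(private readonly accessToken: string) {}

  // Example method: identify the authenticated user via the public
  // whoami endpoint, reusing the stored token.
  async whoAmI(): Promise<unknown> {
    const res = await fetch("https://huggingface.co/api/whoami-v2", {
      headers: { Authorization: `Bearer ${this.accessToken}` },
    });
    return res.json();
  }
}

// Usage sketch
const hub = new HfHub("hf_xxx");
console.log(await hub.whoAmI());
```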