
Can the base model be loaded directly from the local filesystem without connecting to Hugging Face? It cannot launch because there is no connection. #261

Loovelj opened this issue Feb 20, 2024 · 3 comments

Loovelj commented Feb 20, 2024

Model description

model=Qwen/Qwen-7B-Chat
volume=/hub_models/Qwen-7B-Chat/
docker run --gpus=1 --shm-size 1g -p 8080:80 -v $volume:/data \
  ghcr.nju.edu.cn/predibase/lorax:latest --model-id $model

It would be very helpful if the local base model could be loaded directly.

Open source status

  • The model implementation is available
  • The model weights are available

Provide useful links for the implementation

No response

jeffreyftang (Contributor) commented

Hi @Loovelj, can you provide the exact error message(s) you're running into?

It also looks like you're setting $model to an HF-style model name. Can you try setting it to an absolute local path and see if that helps?
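
For reference, a minimal sketch of what that could look like, assuming the weights sit under /hub_models/Qwen-7B-Chat/ on the host as in the original command. The idea is to mount the parent directory and pass the absolute path as seen inside the container (the /data/Qwen-7B-Chat path is an assumption based on that mount):

# Mount the directory that contains the model folder, then point
# --model-id at the absolute path inside the container.
model=/data/Qwen-7B-Chat
volume=/hub_models
docker run --gpus=1 --shm-size 1g -p 8080:80 -v $volume:/data \
  ghcr.nju.edu.cn/predibase/lorax:latest --model-id $model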

@jeffreyftang jeffreyftang added the question Further information is requested label Feb 22, 2024

Loovelj commented Feb 27, 2024

It didn't work; it seems the --model-id must be a repo name:
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '
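
That HFValidationError usually indicates that the string passed as --model-id was not found as a directory inside the container, so huggingface_hub falls back to treating it as a Hub repo id and rejects the format. A quick sanity check (image tag and mount copied from the command above; the exact host path is an assumption) is to list the mounted model directory as the container sees it:

# List the mounted model directory from inside the container.
docker run --rm -v /hub_models:/data --entrypoint ls \
  ghcr.nju.edu.cn/predibase/lorax:latest -la /data/Qwen-7B-Chat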

nirvitarka commented
I am facing the same situation and need to load the model from a local directory. Has any solution been found for this?
