How to use this with a downloaded model? #9
I've met the same problem. The meta.json can be found in './spacy_universal_sentence_encoder/meta/'. But still, since I'm in China, it raises a urlopen error, which means it fails to open the downloaded model. Have you solved it?
Hi @jrruethe, I did not think about this case, where you already have a model downloaded, when first developing this wrapper for spaCy. A workaround that should work for you is the following:
The path is composed of the system temporary directory, the tfhub_modules folder, and the SHA-1 hash of the model URL.
For a less dirty solution, I should modify the part of the code where I set the folder for the TensorFlow models: https://github.com/MartinoMensio/spacy-universal-sentence-encoder/blob/master/spacy_universal_sentence_encoder/language.py#L71

For @tanghaoyu258: if you have already downloaded the model and have the same issue with Docker, I think this workaround should also work for you. If you already have the model downloaded, you can then copy/symlink it to the folder.

Otherwise, if you want a pre-packaged model that does not depend on the network, that is a different issue which I will try to address.

Best,
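As a sketch of the path composition described above: TensorFlow Hub conventionally caches models under a tfhub_modules directory inside the system temp folder (overridable with the TFHUB_CACHE_DIR environment variable), in a subfolder named by the SHA-1 hash of the model URL. The function name here is illustrative, not part of the library:

```python
import hashlib
import os
import tempfile

def tfhub_cache_path(model_url: str) -> str:
    # Base cache directory: TFHUB_CACHE_DIR if set, else <tmp>/tfhub_modules.
    cache_dir = os.environ.get(
        "TFHUB_CACHE_DIR",
        os.path.join(tempfile.gettempdir(), "tfhub_modules"),
    )
    # Each cached model lives in a folder named by the SHA-1 hex digest
    # of its download URL.
    digest = hashlib.sha1(model_url.encode("utf8")).hexdigest()
    return os.path.join(cache_dir, digest)

print(tfhub_cache_path("https://tfhub.dev/google/universal-sentence-encoder-large/5"))
```

Copying or symlinking an already-downloaded model into the directory this returns should let TensorFlow Hub pick it up without re-downloading.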
Thank you for the response! I wasn't aware of where the model is stored or how the sha1 piece of the path was formed, so that was very helpful. I am pretty sure your suggestion of mounting the model into the container will work perfectly for my needs. I'll try it out shortly and let you know how it goes. Thanks again!
I can confirm that this works for my use case. But I wanted to add some notes for you or the next person to come along. I am using
The path was slightly different, but I figured it out. This allows me to load the model just fine. Some other caveats that I noticed: I am using spaCy 2.3.2, and I found that when I install your version 0.2.3 using the following, everything works fine:
However, when I install your version 0.3.1 using this command:
Then when I attempt to load the library, I get the following error:
I wanted to let you know, in case this is unexpected. For now, my original issue is solved and I am unblocked, so feel free to close this. Thanks again! I appreciate the help.
Thanks @jrruethe for finding the path of the library installed in your Docker container. For your issue with the factory for
The check on dependencies is done when installing the standalone models (which you installed first), and then you probably updated

To check the installed versions you can run:

python -c "import spacy_universal_sentence_encoder; print(spacy_universal_sentence_encoder.__version__)"
python -c "import en_use_lg; print(en_use_lg.__version__)"

Using the same version number should solve your problem.

Martino
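Keeping the two packages in lockstep might look like the following sketch, using the 0.3.1 pair mentioned elsewhere in this thread (adjust both version numbers together; the release URL follows the pattern quoted in the original issue):

```shell
# Install matching versions of the wrapper and the standalone model package,
# so the version check at import time passes.
pip install spacy-universal-sentence-encoder==0.3.1
pip install https://github.com/MartinoMensio/spacy-universal-sentence-encoder/releases/download/v0.3.1/en_use_lg-0.3.1.tar.gz#en_use_lg-0.3.1
```

In a Dockerfile, putting both installs in the same RUN layer also avoids the cached-layer mismatch described later in this thread.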
I just released

For the Docker case, you can modify the
The command will map the volume to a custom directory, which is also sent as an environment variable so that TensorFlow Hub can locate it.

Martino
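A minimal sketch of such a command, assuming the host keeps the downloaded model cache in ./models; the mount point, host path, and image name here are hypothetical:

```shell
# Mount the host's model cache into the container and point TensorFlow Hub
# at it via the TFHUB_CACHE_DIR environment variable.
docker run \
  -v "$(pwd)/models:/var/models" \
  -e TFHUB_CACHE_DIR=/var/models \
  my-spacy-image
```

With this in place, the library should find the cached model inside the container instead of trying to download it at runtime.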
Ha, you are exactly right about my mismatched versions! I had intended to update both, but I think my image was docker-cached during the build. Thanks again for the info, this is extremely helpful! |
Hi @MartinoMensio,
Hi @clippered,
I tried reproducing the issue on my machine by removing write access to the folder where I have the models, but without success. I have no idea how to reproduce the issue locally.

Can you try updating to the latest version of TensorFlow (2.4.0), which is supported by this library (version 0.4.0 just released), and see if the issue persists?

I found that there exists something called "file system access for Lambda functions" (https://docs.aws.amazon.com/lambda/latest/dg/configuration-filesystem.html), so maybe this page could be related to your issue.

Martino
Thanks for your response @MartinoMensio. Unfortunately, I still got the same error.

Attaching EFS to the AWS Lambda can be another option, like what you suggested. However, I would prefer to bake everything into the container's image instead, so as not to use any more AWS resources.

It is weird that you cannot reproduce it. Maybe another way to reproduce it is to download the model as one user, make sure the location it is stored in is read-only for all other users, then run the Python script that loads the model as another user.

Anyway, don't worry too much about this, and thanks for your help.
Just got the traceback for reference.
Hello,
I am trying to use this inside of a Docker image. I have downloaded the model separately from here, and I have performed the
pip install https://github.com/MartinoMensio/spacy-universal-sentence-encoder/releases/download/v0.3.1/en_use_lg-0.3.1.tar.gz#en_use_lg-0.3.1
as a RUN command in my Dockerfile.

I want to be able to load and use the model that I downloaded externally, but I don't know how to load it. When I use the following, it re-downloads the model; I don't know how to tell it to load the one I downloaded manually and added to the Docker image:
I tried the following:
Both complain that it cannot find meta.json. Can you help me?
Thanks!