This repository has been archived by the owner on Mar 1, 2024. It is now read-only.
Bug Description
I am trying to work through a script for indexing a GitHub repo. I determined that the script was written for an earlier version of llama_index (pre-0.10), so I thought I would try to bring it up to date with the latest versions of llama_index and llama_hub.
I installed both llama-index and llama-hub from PyPI via pip.
These are the imports at the top of my script (and the place where the script is erroring):
import os
import textwrap
from dotenv import load_dotenv
from llama_index.legacy import download_loader
from llama_hub.github_repo import GithubRepositoryReader, GithubClient
from llama_index.core import VectorStoreIndex
from llama_index.legacy.vector_stores import DeepLakeVectorStore
from llama_index.core.storage.storage_context import StorageContext
import re
This is the error I am getting:
Traceback (most recent call last):
File "C:\Users\aaols\PycharmProjects\experiments\llamaindex_activeloop_vectorize_data_from_github.py", line 12, in <module>
from llama_hub.github_repo import GithubRepositoryReader, GithubClient
File "C:\Users\aaols\PycharmProjects\experiments\venv\lib\site-packages\llama_hub\github_repo\__init__.py", line 2, in <module>
from llama_hub.github_repo.base import (
File "C:\Users\aaols\PycharmProjects\experiments\venv\lib\site-packages\llama_hub\github_repo\base.py", line 18, in <module>
from llama_index.readers.base import BaseReader
ModuleNotFoundError: No module named 'llama_index.readers.base'
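As a quick sanity check, here is a minimal sketch using only the standard library (assuming the llama-index and llama-hub distributions above are installed in the active venv) that confirms the module llama_hub expects no longer ships with llama-index 0.10:

import importlib.util
from importlib.metadata import version

# Report which distributions are actually installed (PyPI distribution names).
for dist in ("llama-index", "llama-hub"):
    print(dist, version(dist))

# llama_hub.github_repo imports llama_index.readers.base, which was removed in
# the llama-index 0.10 reorganization; find_spec() returns None (or the parent
# import fails) when the module is no longer present.
try:
    spec = importlib.util.find_spec("llama_index.readers.base")
except ModuleNotFoundError:
    spec = None
print("llama_index.readers.base available:", spec is not None)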
It looks like the latest version of llama_hub on PyPI is not yet aware of the changes in llama_index. This is a case where llama_hub relies on a specific version (or range of versions) of llama_index, and that should really be called out in its dependencies: https://github.com/run-llama/llama-hub/blob/main/pyproject.toml#L19
The constraint should likely be changed to llama-index = ">=0.9.41, <0.10.0", since llama-index is a dependency of llama-hub and this is a known point of incompatibility that should be captured in the pyproject.toml.
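For illustration, a short check with the packaging library (the same machinery pip uses for version constraints) shows that the proposed range would exclude the 0.10.x releases that break llama_hub while still allowing the pre-0.10 releases it was written against:

from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The constraint proposed above for llama-hub's pyproject.toml.
proposed = SpecifierSet(">=0.9.41,<0.10.0")

# The installed 0.10.6 would be rejected, while 0.9.41 would still be accepted,
# so pip would resolve a compatible pre-0.10 llama-index instead.
print(Version("0.10.6") in proposed)  # False
print(Version("0.9.41") in proposed)  # True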
Version
0.10.6
Steps to Reproduce
Install the same versions of llama-index and llama-hub as noted above.
Relevant Logs/Tracebacks
No response
With the launch of LlamaIndex v0.10, we are deprecating this llama_hub repo: all integrations (data loaders, tools) and packs are now in the core llama-index Python repository. LlamaHub will continue to exist. We are revamping llamahub.ai to point to all integrations/packs/datasets available in the llama-index repo.
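In practical terms, the GitHub reader and the Deep Lake vector store now live in separate integration packages rather than in llama_hub. A rough sketch of what the updated imports could look like; the package names llama-index-readers-github and llama-index-vector-stores-deeplake are my reading of the new layout, not something stated in this thread:

# Assumed installation for llama-index >= 0.10 (not confirmed in this thread):
#   pip install llama-index-core llama-index-readers-github llama-index-vector-stores-deeplake
import os
from dotenv import load_dotenv

from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.readers.github import GithubClient, GithubRepositoryReader
from llama_index.vector_stores.deeplake import DeepLakeVectorStore

load_dotenv()  # same .env handling as the original script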