IPEX-LLM GPU - GraphRAG: No module named "past" && ApiKeyMissingError && No text files found in input #11983

Kpeacef opened this issue Sep 1, 2024 · 2 comments

Kpeacef commented Sep 1, 2024

Hi, while following the IPEX-LLM GraphRAG_quickstart.md, I ran into the following issues.

  1. ModuleNotFoundError: No module named 'past'

Prepare Input Corpus
Some sample documents are used here as the input corpus for GraphRAG indexing, from which the LLM will create a knowledge graph.

Prepare the input corpus, and then initialize the workspace:

For Linux users:

# define input corpus

mkdir -p ./ragtest/input
cp input/* ./ragtest/input

export no_proxy=localhost,127.0.0.1

# initialize ragtest folder

python -m graphrag.index --init --root ./ragtest

This raised ModuleNotFoundError: No module named 'past'.

Searching online suggested running
pip install future
to resolve the issue, which worked. Just for awareness: maybe you would like to update the documentation?
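
For reference, a quick sanity check after the fix; the 'past' compatibility module is shipped by the future package, so the import should succeed once it is installed:

pip install future
# verify the compatibility module resolves now
python -c "import past"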

  2. Raised ApiKeyMissingError: graphrag.config.error.ApiKeyMissingError: API Key is required for Completion API. Please set either the OPENAI_API_KEY, GRAPHRAG_API_KEY or GRAPHRAG_LLM_API_KEY environment variable.

Conduct GraphRAG indexing
Finally, conduct GraphRAG indexing, which may take a while:

python -m graphrag.index --root ragtest

Please let me know whether an API key is required here.

Method to resolve this issue:
export GRAPHRAG_API_KEY=1234, following TheAiSingularity/graphrag-local-ollama#7
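
A minimal sketch of that workaround, assuming a local Ollama backend that never validates the key (so any placeholder value works):

# placeholder key; the local backend does not actually check it
export GRAPHRAG_API_KEY=1234
python -m graphrag.index --root ./ragtest

If I remember correctly, the --init step should also have generated a .env file under ./ragtest where the same placeholder can be set instead of using an environment variable.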

  3. ValueError: No text files found in input
    After resolving the issue above, I encountered another error: No text files found in input (see the check below).
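
A quick check for this one, assuming the default GraphRAG input settings, which only pick up .txt files in the input directory:

# the default input file pattern only matches *.txt;
# confirm the corpus actually landed there with that extension
ls ./ragtest/input/*.txt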
Oscilloscope98 commented

Hi @Kpeacef,

Thank you for the suggestions on our documentation. We are reproducing the issues and will let you know of any updates :)

Oscilloscope98 commented Sep 3, 2024

Hi @Kpeacef ,

We have updated our GraphRAG QuickStart. You could follow it and give it another try :)

Here are a few points you might want to pay attention to:

  1. IPEX-LLM Ollama uses a separate Python environment from graphrag-local-ollama
  2. Make sure ipex-llm[cpp]==2.1.0 is installed for Ollama serve (a sketch follows this list)
  3. A troubleshooting section has been added to the QuickStart for reference
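
For point 2, a minimal sketch of pinning the package in the environment that serves Ollama (the env name llm-cpp is an assumption for illustration):

# hypothetical env name; activate whichever environment runs Ollama serve
conda activate llm-cpp
pip install --pre --upgrade "ipex-llm[cpp]==2.1.0"
# confirm the pinned version is the one installed
pip show ipex-llm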

Please let us know if you run into any further problems :)
