@dudududukim

Problem

1. The current examples throw a 404 Client Error when using HuggingFaceInferenceAPI:

HfHubHTTPError: 404 Client Error: Not Found for url: https://router.huggingface.co/hf-inference/models/Qwen/Qwen2.5-Coder-32B-Instruct/v1/chat/completions

See the error examples in my repository notebooks:

This happens because Hugging Face discontinued free serverless inference for most models in 2024. The model now requires a provider parameter.

2. DuckDuckGoSearchRun was incorrectly imported from the tools module instead of from langchain_community.tools

Changes

1. Added provider="auto" to HuggingFaceInferenceAPI in:

   • units/en/unit2/llama-index/components.mdx
   • units/en/unit3/agentic-rag/agent.mdx

   Using provider="auto" lets users configure their preferred providers at https://huggingface.co/settings/inference-providers (see the first sketch below).

2. Changed the import to from langchain_community.tools import DuckDuckGoSearchRun so the tool initializes correctly (see the second sketch below).
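For reference, here is a minimal sketch of the updated initialization from change 1. It assumes the llama-index-llms-huggingface-api package is installed and that HF_TOKEN is set in the environment; the token handling and prompt are illustrative, not copied from the course files.

```python
import os
from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI

# provider="auto" routes the request through an inference provider
# configured at https://huggingface.co/settings/inference-providers,
# instead of the discontinued free serverless endpoint.
llm = HuggingFaceInferenceAPI(
    model_name="Qwen/Qwen2.5-Coder-32B-Instruct",
    token=os.environ["HF_TOKEN"],  # assumption: HF_TOKEN is exported in the shell
    provider="auto",
)

response = llm.complete("Hello, how are you?")
print(response)
```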
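And a short sketch of the corrected import from change 2, assuming the langchain-community and duckduckgo-search packages are installed (the query string is just an example):

```python
from langchain_community.tools import DuckDuckGoSearchRun

# DuckDuckGoSearchRun lives in langchain_community.tools,
# not in a local tools module.
search_tool = DuckDuckGoSearchRun()

results = search_tool.invoke("current weather in Paris")
print(results)
```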


@dudududukim
Author

Dear maintainer/reviewer, could you please approve the workflow run? Thank you!

