I searched the LangGraph/LangChain documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in LangGraph/LangChain rather than my code.
I am sure this is better as an issue rather than a GitHub discussion, since this is a LangGraph bug and not a design question.
Example Code
from langgraph.graph import MessagesState
from langgraph.graph import StateGraph, START, END
from langgraph.prebuilt import ToolNode, tools_condition
from langchain_huggingface.llms import HuggingFaceEndpoint

# Tool
def multiply(a: int, b: int) -> int:
    """Multiplies a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b

llm = HuggingFaceEndpoint(
    endpoint_url='url',
    server_kwargs={
        "headers": {"Content-Type": "application/json"}
    }
)
llm_with_tools = llm.bind_tools([multiply])
Error Message and Stack Trace (if applicable)
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 693, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/api/langgraph_api/lifespan.py", line 23, in lifespan
File "/api/langgraph_api/shared/graph.py", line 205, in collect_graphs_from_env
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 588, in run_in_executor
return await asyncio.get_running_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 579, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/api/langgraph_api/shared/graph.py", line 233, in _graph_from_spec
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/deps/__outer_module-1-studio-copy/src/router_tgi.py", line 34, in <module>
llm_with_tools = llm.bind_tools([multiply])
^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 856, in __getattr__
raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'HuggingFaceEndpoint' object has no attribute 'bind_tools'
Description
I am trying to recreate the LangGraph router example using an LLM hosted on a local Hugging Face text-generation-inference (TGI) server. Per the TGI documentation, tool calling is officially supported.
However, pointing the llm = assignment at the local TGI server produces the error below: the HuggingFaceEndpoint object has no attribute bind_tools.
File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 856, in __getattr__
raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'HuggingFaceEndpoint' object has no attribute 'bind_tools'
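The traceback bottoms out in pydantic's __getattr__: accessing any attribute that is not a declared field or method on the model raises AttributeError with exactly this message format. A standalone reproduction of that mechanism (plain pydantic v2, no LangChain; the Endpoint class here is a hypothetical stand-in):

```python
from pydantic import BaseModel

class Endpoint(BaseModel):
    """Stand-in for a pydantic-based LLM class with no bind_tools method."""
    endpoint_url: str

ep = Endpoint(endpoint_url="http://localhost:8080")
try:
    ep.bind_tools([])  # undeclared attribute -> pydantic's __getattr__ raises
except AttributeError as e:
    print(e)  # 'Endpoint' object has no attribute 'bind_tools'
```

The error therefore says nothing about the TGI server at all; it is raised at import time, purely because the class does not define the method.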
This is not a bug: bind_tools is a feature that HuggingFaceEndpoint simply does not support.
@efriis this is a good candidate for error codes page
System Info
Pip Freeze:
aiohappyeyeballs==2.4.0
aiohttp==3.10.5
aiosignal==1.3.1
annotated-types==0.7.0
anyio==4.6.0
attrs==24.2.0
certifi==2024.8.30
charset-normalizer==3.3.2
dataclasses-json==0.6.7
distro==1.9.0
filelock==3.16.1
frozenlist==1.4.1
fsspec==2024.9.0
h11==0.14.0
httpcore==1.0.5
httpx==0.27.2
huggingface-hub==0.25.0
idna==3.10
Jinja2==3.1.4
jiter==0.5.0
joblib==1.4.2
jsonpatch==1.33
jsonpointer==3.0.0
langchain==0.3.0
langchain-community==0.3.0
langchain-core==0.3.5
langchain-huggingface==0.1.0
langchain-ollama==0.2.0
langchain-openai==0.2.0
langchain-text-splitters==0.3.0
langgraph==0.2.23
langgraph-checkpoint==1.0.10
langsmith==0.1.125
MarkupSafe==2.1.5
marshmallow==3.22.0
mpmath==1.3.0
msgpack==1.1.0
multidict==6.1.0
mypy-extensions==1.0.0
networkx==3.3
numpy==1.26.4
ollama==0.3.3
openai==1.47.0
orjson==3.10.7
packaging==24.1
pillow==10.4.0
pydantic==2.9.2
pydantic-settings==2.5.2
pydantic_core==2.23.4
python-dotenv==1.0.1
PyYAML==6.0.2
regex==2024.9.11
requests==2.32.3
safetensors==0.4.5
scikit-learn==1.5.2
scipy==1.14.1
sentence-transformers==3.1.1
sniffio==1.3.1
SQLAlchemy==2.0.35
sympy==1.13.3
tenacity==8.5.0
threadpoolctl==3.5.0
tiktoken==0.7.0
tokenizers==0.19.1
torch==2.4.1
tqdm==4.66.5
transformers==4.44.2
typing-inspect==0.9.0
typing_extensions==4.12.2
urllib3==2.2.3
yarl==1.11.1
Platform: macOS (Sequoia 15.0)
Python Version: 3.11.9