
AttributeError: 'str' object has no attribute 'tool' #28939

Open
jason571 opened this issue Dec 27, 2024 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@jason571

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

    self.prompt = config.get_prompt("system_monitor_template_test3")
    self.system = self.prompt
    graph = StateGraph(AgentState)
    graph.add_node("llm", self.call_agents)
    graph.add_node("action", self.take_action)
    graph.add_conditional_edges(
        "llm",
        self.exists_action,
        {True: "action", False: END}
    )
    self.llm = self.interface.get_current_model(0.2,0.2,10)
    self.create_tools()
    graph.add_edge("action", "llm")
    graph.set_entry_point("llm")
    self.memory = MemorySaver()
    self.graph = graph.compile(checkpointer=self.memory)
def create_agent_node(self):
    prompt_template = PromptTemplate(template=self.system)
    system_message = prompt_template.format()
    if self.agent_node is None:
        self.agent_node = create_react_agent(
            self.llm,
            self.tools,
            state_modifier=system_message,
        )
        self.agent_executor = AgentExecutor(
            agent=self.agent_node,
            tools=self.tools,
            verbose=True
        )
        
  ..............
async def print_response(self, initial_message: str):
    messages = [("user", initial_message)]
    thread = {"configurable": {"thread_id": self.user_id}}
    async for event in self.graph.astream_events({"messages": messages}, thread, version="v2"):
        kind = event["event"]
        if kind == "on_chat_model_stream":
            content = event["data"]["chunk"].content
            if content:
                print(content, end="", flush=True)

Example usage:

    if __name__ == "__main__":
        agent = Agent("12345")

        input = "check CPU "
        asyncio.run(agent.print_response(input))
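
For context, a minimal sketch of driving the compiled graph returned by create_react_agent directly, assuming this is langgraph.prebuilt.create_react_agent (the state_modifier= keyword suggests so); the model and tool below are placeholders, not the project's real ones:

    # Minimal sketch (assumption): create_react_agent here appears to be
    # langgraph.prebuilt.create_react_agent, which returns a compiled graph.
    # That graph is a runnable on its own and is invoked with a messages dict,
    # rather than being wrapped in AgentExecutor.
    from langchain_core.tools import tool
    from langchain_openai import ChatOpenAI  # placeholder model
    from langgraph.prebuilt import create_react_agent

    @tool
    def check_cpu() -> str:
        """Placeholder tool that pretends to read CPU usage."""
        return "%Cpu(s):  0.0 us,  0.0 sy, 100.0 id"

    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.2)
    agent_graph = create_react_agent(llm, [check_cpu], state_modifier="You are a system monitor.")

    # The compiled graph replaces AgentExecutor entirely.
    result = agent_graph.invoke({"messages": [("user", "check CPU")]})
    print(result["messages"][-1].content)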

Error Message and Stack Trace (if applicable)

Entering new AgentExecutor chain...
[2024-12-27 15:24:47,480]-INFO-[functionCall.py:28]: executing shell command:
top -b -n 1 | grep "Cpu(s)"
result output:%Cpu(s): 0.0 us, 0.0 sy, 0.0 ni,100.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st

The current CPU usage is 0.0% for user, 0.0% for system, 0.0% for nice, 100.0% for idle, 0.0% for wait, 0.0% for hardware interrupt, 0.0% for software interrupt, and 0.0% for steal.[2024-12-27 15:24:48,379]-WARNING-[manager.py:287]: Error in StdOutCallbackHandler.on_agent_action callback: AttributeError("'str' object has no attribute 'log'")
Traceback (most recent call last):
File "/mnt/m/flyang/pr_train/LLMs/src/langgraphsearch/agentWorkflow.py", line 190, in
asyncio.run(agent.print_response(input))
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/mnt/m/flyang/pr_train/LLMs/src/langgraphsearch/agentWorkflow.py", line 175, in print_response
async for event in self.graph.astream_events({"messages": messages}, thread, version="v2"):
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1388, in astream_events
async for event in event_stream:
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/tracers/event_stream.py", line 1012, in _astream_events_implementation_v2
await task
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/tracers/event_stream.py", line 967, in consume_astream
async for _ in event_streamer.tap_output_aiter(run_id, stream):
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/tracers/event_stream.py", line 180, in tap_output_aiter
first = await py_anext(output, default=sentinel)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/utils/aiter.py", line 76, in anext_impl
return await anext(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langgraph/pregel/init.py", line 1822, in astream
async for _ in runner.atick(
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 221, in atick
await arun_with_retry(
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langgraph/pregel/retry.py", line 115, in arun_with_retry
async for _ in task.proc.astream(task.input, config):
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 576, in astream
async for chunk in aiterator:
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/tracers/event_stream.py", line 180, in tap_output_aiter
first = await py_anext(output, default=sentinel)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/utils/aiter.py", line 76, in anext_impl
return await anext(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1455, in atransform
async for ichunk in input:
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1455, in atransform
async for ichunk in input:
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1455, in atransform
async for ichunk in input:
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1018, in astream
yield await self.ainvoke(input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 236, in ainvoke
ret = await asyncio.create_task(coro, context=context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 588, in run_in_executor
return await asyncio.get_running_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 579, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/mnt/m/flyang/pr_train/LLMs/src/langgraphsearch/agentWorkflow.py", line 110, in call_agents
result = self.agent_executor.invoke({"messages": messages})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/chains/base.py", line 170, in invoke
raise e
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/chains/base.py", line 160, in invoke
self._call(inputs, run_manager=run_manager)
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/agents/agent.py", line 1624, in _call
next_step_output = self._take_next_step(
^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/agents/agent.py", line 1332, in _take_next_step
for a in self._iter_next_step(
^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/agents/agent.py", line 1415, in _iter_next_step
yield self._perform_agent_action(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/flyang/anaconda3/envs/LLMs/lib/python3.12/site-packages/langchain/agents/agent.py", line 1429, in _perform_agent_action
if agent_action.tool in name_to_tool_map:
^^^^^^^^^^^^^^^^^
AttributeError: 'str' object has no attribute 'tool'

Description

When call_agents calls self.agent_executor.invoke({"messages": messages}), the run fails inside AgentExecutor._perform_agent_action with:
AttributeError: 'str' object has no attribute 'tool'
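
A standalone, hypothetical illustration of the failure mode, based on the agent.py frame in the traceback:

    # Hypothetical, standalone illustration: the agent.py frame in the traceback
    # reads agent_action.tool, which only works on an AgentAction object.
    # If the agent step yields a plain string, the same AttributeError is raised.
    agent_action = "The current CPU usage is 0.0% ..."  # a str, not an AgentAction

    try:
        _ = agent_action.tool
    except AttributeError as exc:
        print(exc)  # 'str' object has no attribute 'tool'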

System Info

System Information

OS: Linux
OS Version: #1 SMP Tue Nov 5 00:21:55 UTC 2024
Python Version: 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 13:27:36) [GCC 11.2.0]

Package Information

langchain_core: 0.3.28
langchain: 0.3.12
langchain_community: 0.3.9
langsmith: 0.1.143
langchain_anthropic: 0.3.0
langchain_ark: 0.1.4
langchain_cli: 0.0.35
langchain_experimental: 0.3.3
langchain_google_community: 2.0.2
langchain_google_genai: 2.0.7
langchain_groq: 0.2.1
langchain_huggingface: 0.1.2
langchain_milvus: 0.1.7
langchain_openai: 0.2.11
langchain_text_splitters: 0.3.3
langchain_together: 0.2.0
langchain_xai: 0.1.1
langgraph_sdk: 0.1.36
langserve: 0.3.0

Other Dependencies

aiohttp: 3.11.2
anthropic: 0.40.0
async-timeout: Installed. No version info available.
beautifulsoup4: 4.12.3
dataclasses-json: 0.6.7
db-dtypes: Installed. No version info available.
defusedxml: 0.7.1
fastapi: 0.115.5
filetype: 1.2.0
gapic-google-longrunning: Installed. No version info available.
gitpython: 3.1.43
google-api-core: 2.23.0
google-api-python-client: 2.153.0
google-auth-httplib2: 0.2.0
google-auth-oauthlib: Installed. No version info available.
google-cloud-aiplatform: Installed. No version info available.
google-cloud-bigquery: Installed. No version info available.
google-cloud-bigquery-storage: Installed. No version info available.
google-cloud-contentwarehouse: Installed. No version info available.
google-cloud-core: 2.4.1
google-cloud-discoveryengine: Installed. No version info available.
google-cloud-documentai: Installed. No version info available.
google-cloud-documentai-toolbox: Installed. No version info available.
google-cloud-speech: Installed. No version info available.
google-cloud-storage: Installed. No version info available.
google-cloud-texttospeech: Installed. No version info available.
google-cloud-translate: Installed. No version info available.
google-cloud-vision: Installed. No version info available.
google-generativeai: 0.8.3
googlemaps: Installed. No version info available.
gritql: 0.1.5
groq: 0.13.0
grpcio: 1.67.1
httpx: 0.27.2
httpx-sse: 0.4.0
huggingface-hub: 0.26.2
jsonpatch: 1.33
langserve[all]: Installed. No version info available.
numpy: 2.2.0
openai: 1.54.4
orjson: 3.10.11
packaging: 24.2
pandas: 2.2.3
pyarrow: 18.1.0
pydantic: 2.9.2
pydantic-settings: 2.6.1
pymilvus: 2.5.0
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
sentence-transformers: 3.3.0
SQLAlchemy: 2.0.35
sse-starlette: 1.8.2
tenacity: 9.0.0
tiktoken: 0.8.0
tokenizers: 0.20.3
tomlkit: 0.12.0
transformers: 4.46.2
typer[all]: Installed. No version info available.
typing-extensions: 4.12.2
uvicorn: 0.32.0
volcengine-python-sdk[ark]: Installed. No version info available.

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Dec 27, 2024
@jason571 (Author)

    self.prompt = config.get_prompt("system_monitor_template_test3")
    self.system = self.prompt
    graph = StateGraph(AgentState)
    graph.add_node("llm", self.call_agents)
    graph.add_node("action", self.take_action)
    graph.add_conditional_edges(
        "llm",
        self.exists_action,
        {True: "action", False: END}
    )
    self.llm = self.interface.get_current_model(0.2,0.2,10)
    self.create_tools()
    graph.add_edge("action", "llm")
    graph.set_entry_point("llm")
    #self.memory = SqliteSaver.from_conn_string(":memory:")
    self.memory = MemorySaver()
    self.graph = graph.compile(checkpointer=self.memory)

def create_tools(self):
    """Creates tools for the agent"""
    retriever = self.interface.get_retriever()
    function = FunctionCall(retriever)
    self.tools = function.get_tools()
    self.toolsData = {t.name: t for t in self.tools}

def exists_action(self, state: AgentState):
    result = state['messages'][-1]
    return len(result.tool_calls) > 0

def call_agents(self, state: AgentState):
    if not self.agent_executor:
        self.create_agent_node()
    messages = state.get('messages', [])
    mylogging.info(messages)
    result = self.agent_executor.invoke({"messages": messages})
    return {'messages': [result]}
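
For reference, a check like exists_action relies on the last entry in state['messages'] being a message object such as AIMessage, since tool_calls is an attribute of messages rather than of plain strings; a minimal illustration with hypothetical values:

    # Illustration only: tool_calls exists on AIMessage objects, not on strings,
    # so a check like exists_action needs the last state entry to be a message.
    from langchain_core.messages import AIMessage

    with_tool_call = AIMessage(
        content="",
        tool_calls=[{"name": "check_cpu", "args": {}, "id": "call_1"}],
    )
    without_tool_call = AIMessage(content="CPU is idle.")

    def has_tool_calls(last_message) -> bool:
        # Same shape as the exists_action check in the snippet above.
        return len(last_message.tool_calls) > 0

    assert has_tool_calls(with_tool_call) is True
    assert has_tool_calls(without_tool_call) is False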

@jason571 (Author)

From the output, it looks like the LLM has already returned its final answer by the time this error is raised.

@jason571 (Author)

This may be due to my incorrect usage; the issue can be closed.

@keenborder786 (Contributor)

tools=self.tools should be a list of BaseTool instances.
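
For illustration, a minimal way to build such a list is the tool decorator from langchain_core.tools, which wraps a plain function into a BaseTool instance (the tool body below is a placeholder):

    # Minimal sketch: @tool wraps a plain function into a BaseTool subclass,
    # so a list of decorated functions satisfies the tools=[...] argument.
    from langchain_core.tools import BaseTool, tool

    @tool
    def check_cpu() -> str:
        """Placeholder tool that pretends to report CPU usage."""
        return "%Cpu(s):  0.0 us,  0.0 sy, 100.0 id"

    tools = [check_cpu]
    assert all(isinstance(t, BaseTool) for t in tools)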
