
Resume to a specific subgraph node after interrupt #30518

Open
@PetrosTragoudaras


Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Description
I am working on a chatbot that uses interrupt_before in subgraphs. My subgraph has two interrupt points.

Behavior:

The first interrupt point works as expected: execution pauses before the interrupted node and, when the graph is re-invoked, resumes and moves on.

However, when the 2nd interrupt (func_interrupt_second_time) is triggered and the graph is re-invoked, it pauses again before the interrupt node instead of progressing forward. This leads to an infinite loop where the flow never reaches the subsequent nodes.

What I Have Tried
I attempted to retrieve the subgraph state history, as described in the documentation.

The parent state is retrieved successfully.
However, the subgraph state history comes back empty, which prevents me from resuming execution at the right node.
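
Roughly what I ran to inspect the state, following the persistence docs (a sketch; config is the same thread-scoped config used in the endpoint code below):

# parent state and history: both come back as expected
parent_state = graph.get_state(config)
parent_history = list(graph.get_state_history(config))

# subgraph config taken from the interrupted task, as in the endpoint code
snapshot = graph.get_state(config, subgraphs=True)
subgraph_config = snapshot.tasks[0].state.config

# subgraph state history: this comes back empty
subgraph_history = list(graph.get_state_history(subgraph_config))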

Expected Behavior
When the interrupt before func_interrupt_second_time is triggered and the graph is re-invoked, it should resume execution from this node and progress to func_validate, instead of reverting to the point before func_interrupt_second_time and interrupting again.
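
My understanding from the docs is that re-invoking with None as input should run the node the graph is paused before and then continue (a sketch, using the same parent config as in the endpoint code below):

# after the second interrupt, this resume should run func_interrupt_second_time
# and then func_validate, finishing the subgraph
output = graph.invoke(None, config=config, subgraphs=True)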

I am creating this issue to understand whether I am doing something wrong in my implementation, whether this is a known issue, or whether someone can help with handling the resume after an interrupt properly.
Thank you in advance!

Client Invocation (REST Endpoint):

@router.post("/api/chat", response_model=ChatResponse)
async def chat(request: ChatRequest, memory: MemorySaver = Depends(get_memory_saver)):
    config: RunnableConfig = {"configurable": {"thread_id": request.threadId}}
    graph = get_main_graph(memory)
    prompt = starting_system_prompt()
    state = graph.get_state(config)
    start_time = time.time()
    logger.info("Message received")

    if not state.values:
        input_messages = [SystemMessage(prompt), HumanMessage(request.message)]
        output = graph.invoke(
            {"messages": input_messages},
            config=config,
            subgraphs=True,
        )
    else:
        # get the interrupted graph state (including subgraph tasks)
        interrupted_state_snapshot = graph.get_state(config, subgraphs=True)
        subgraph_config = interrupted_state_snapshot.tasks[0].state.config

        # update graph state messages with the new human message
        current_messages = interrupted_state_snapshot.values.get("messages")
        current_messages.append(HumanMessage(request.message))
        graph.update_state(config, {"messages": current_messages})

        # update subgraph state messages with the new human message
        graph.update_state(subgraph_config, {"messages": current_messages})

        # re-invoke the graph from the interrupted node
        if interrupted_state_snapshot.next:
            output = graph.invoke(
                None,
                config=subgraph_config,
                subgraphs=True,
            )
            # update graph state messages with the new AI message
            interrupted_state_snapshot = graph.get_state(config, subgraphs=True)
            current_messages = interrupted_state_snapshot.values.get("messages")
            current_messages.append(AIMessage(output[1]["messages"][-1].content))
            graph.update_state(config, {"messages": current_messages})
            # update subgraph state messages with the new AI message
            graph.update_state(subgraph_config, {"messages": current_messages})

    end_time = time.time()
    logger.info(f"Time taken to respond: {end_time - start_time:.2f} seconds")
    return ChatResponse(response=output[1]["messages"][-1].content)
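
For context, the interaction across HTTP requests looks roughly like this (a sketch; the base URL is illustrative, and ChatRequest takes threadId and message as used above):

import requests

BASE = "http://localhost:8000"  # illustrative

# 1st request: starts the graph; subgraph2 prints "Node func" and pauses before func_interrupt
requests.post(f"{BASE}/api/chat", json={"threadId": "t1", "message": "hello"})

# 2nd request: resumes; func_interrupt and func_2 run, then it pauses before func_interrupt_second_time
requests.post(f"{BASE}/api/chat", json={"threadId": "t1", "message": "go on"})

# 3rd request: expected to run func_interrupt_second_time and func_validate,
# but it pauses before func_interrupt_second_time again (the loop described above)
requests.post(f"{BASE}/api/chat", json={"threadId": "t1", "message": "go on"})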

Main graph example:

def get_main_graph(memory: MemorySaver):
    graph = StateGraph(state_schema=ConversationState)
    # add nodes
    graph.add_node("subgraph1", subgraph1)
    graph.add_node("subgraph2", subgraph2)
    graph.add_node("subgraph3", subgraph3)
    graph.add_node("subgraph4", subgraph4)
    graph.add_edge(START, "subgraph1")
    graph.add_edge("subgraph1", "subgraph2")
    graph.add_edge("subgraph2", "subgraph3")
    graph.add_edge("subgraph3", "subgraph4")
    graph.add_edge("subgraph2", END)
    return graph.compile(checkpointer=memory)

Subgraph that I have the issue with:

def func(state: ConversationState):
    print("Node", "func")
    return {"messages": AIMessage("1st interrupt after this one")}


def func_interrupt(state: ConversationState):
    print("1st interrupt before this one")
    return


def func_2(state: ConversationState):
    print("Node", "2 again")
    return {"messages": AIMessage("next node after 1st interrupt.")}


def func_interrupt_second_time(state: ConversationState):
    print("2nd interrupt before this one")
    return


def func_validate(state: ConversationState):
    print("Node", "Final node") # this node is never reached
    return


subgraph2_builder = StateGraph(ConversationState)
subgraph2_builder.add_node("func", func)
subgraph2_builder.add_node("func_interrupt", func_interrupt)
subgraph2_builder.add_node("func_2", func_2)
subgraph2_builder.add_node("func_interrupt_second_time", func_interrupt_second_time)
subgraph2_builder.add_node("func_validate", func_validate)
subgraph2_builder.add_edge(START, "func")
subgraph2_builder.add_edge("func", "func_interrupt")
subgraph2_builder.add_edge("func_interrupt", "func_2")
subgraph2_builder.add_edge("func_2", "func_interrupt_second_time")
subgraph2_builder.add_edge("func_interrupt_second_time", "func_validate")
subgraph2 = subgraph2_builder.compile(checkpointer=True, interrupt_before=["func_interrupt", "func_interrupt_second_time"])
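
For reference, the sequence I expect when driving the compiled graph directly, across one run and two resumes (a sketch; thread id and inputs are arbitrary):

from langchain_core.messages import HumanMessage
from langgraph.checkpoint.memory import MemorySaver

memory = MemorySaver()
graph = get_main_graph(memory)
config = {"configurable": {"thread_id": "demo"}}

# 1st run: subgraph2 prints "Node func", then pauses before func_interrupt
graph.invoke({"messages": [HumanMessage("hi")]}, config=config, subgraphs=True)

# 1st resume: prints "1st interrupt before this one" and "Node 2 again",
# then pauses before func_interrupt_second_time
graph.invoke(None, config=config, subgraphs=True)

# 2nd resume: expected to print "2nd interrupt before this one" and "Node Final node";
# in my setup func_validate is never reached
graph.invoke(None, config=config, subgraphs=True)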

Error Message and Stack Trace (if applicable)

No error messages or stack trace.


System Info

System Information

OS: Linux
OS Version: #1 SMP Mon Feb 24 16:35:16 UTC 2025
Python Version: 3.12.9 (main, Feb 25 2025, 08:58:51) [GCC 12.2.0]

Package Information

langchain_core: 0.3.43
langchain: 0.3.20
langchain_community: 0.3.19
langsmith: 0.3.13
langchain_huggingface: 0.1.2
langchain_ollama: 0.2.3
langchain_postgres: 0.0.13
langchain_text_splitters: 0.3.6
langgraph_sdk: 0.1.55

Optional packages not installed

langserve

Other Dependencies

aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
httpx: 0.28.1
httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
huggingface-hub: 0.29.3
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-core<1.0.0,>=0.3.41: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langchain<1.0.0,>=0.3.20: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy: 2.2.3
numpy<3,>=1.26.2: Installed. No version info available.
ollama: 0.4.7
orjson: 3.10.15
packaging: 24.2
packaging<25,>=23.2: Installed. No version info available.
pgvector: 0.3.6
psycopg: 3.2.5
psycopg-pool: 3.2.6
pydantic: 2.10.6
pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
pytest: 8.3.5
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
rich: Installed. No version info available.
sentence-transformers: 3.4.1
sqlalchemy: 2.0.38
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tokenizers: 0.21.0
transformers: 4.49.0
typing-extensions>=4.7: Installed. No version info available.
zstandard: 0.23.0
