
File extraction error: 'ChatOllama' object has no attribute 'model_name' #1285

Closed

Description

@ram0x7bc

When processing an uploaded PDF in LLM Graph Builder version 0.8.1 with Ollama on the backend, extraction fails with the following error. No documents can be processed:

2025-05-13 03:06:44,087 - File Failed in extraction: 'ChatOllama' object has no attribute 'model_name'
Traceback (most recent call last):
  File "/code/score.py", line 243, in extract_knowledge_graph_from_file
    uri_latency, result = await extract_graph_from_file_local_file(uri, userName, password, database, model, merged_file_path, file_name, allowedNodes, allowedRelationship, token_chunk_size, chunk_overlap, chunks_to_combine, retry_condition, additional_instructions)
  File "/code/src/main.py", line 244, in extract_graph_from_file_local_file
    return await processing_source(uri, userName, password, database, model, fileName, [], allowedNodes, allowedRelationship, token_chunk_size, chunk_overlap, chunks_to_combine, True, merged_file_path, retry_condition, additional_instructions=additional_instructions)
  File "/code/src/main.py", line 389, in processing_source
    node_count,rel_count,latency_processed_chunk = await processing_chunks(selected_chunks,graph,uri, userName, password, database,file_name,model,allowedNodes,allowedRelationship,chunks_to_combine,node_count, rel_count, additional_instructions)
  File "/code/src/main.py", line 484, in processing_chunks
    graph_documents =  await get_graph_from_llm(model, chunkId_chunkDoc_list, allowedNodes, allowedRelationship, chunks_to_combine, additional_instructions)
  File "/code/src/llm.py", line 214, in get_graph_from_llm
    graph_document_list = await get_graph_document_list(
  File "/code/src/llm.py", line 184, in get_graph_document_list
    model_name = llm.model_name.lower()
  File "/usr/local/lib/python3.10/site-packages/pydantic/main.py", line 891, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'ChatOllama' object has no attribute 'model_name'. Did you mean: 'model_dump'?

I ran the application outside of the container and added some additional logging to get a better look at the flow, but it is pretty simple:

llm = ChatOllama(base_url=base_url, model=model_name)

The 'model' attribute is assigned; there is no 'model_name' attribute. The LangChain documentation says to use the 'model' attribute.
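A quick check outside the app confirms this; the base_url and model values here are just placeholders:

from langchain_ollama import ChatOllama

llm = ChatOllama(base_url="http://localhost:11434", model="llama3")
print(llm.model)  # prints "llama3" -- the identifier lives on 'model'
llm.model_name    # raises AttributeError, exactly as in the traceback above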

Calling model_name = llm.model_name.lower() causes the error because it reads the nonexistent 'model_name' attribute instead of the 'model' attribute.
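Since different LangChain chat model classes expose the identifier under different names ('model_name' on ChatOpenAI, 'model' on ChatOllama), one possible fix is a defensive lookup at src/llm.py line 184. This is only a sketch, not a tested patch:

# sketch of a possible fix in get_graph_document_list:
# fall back from 'model_name' to 'model' rather than assuming one of them;
# getattr() returns the default when pydantic raises AttributeError
model_name = (getattr(llm, "model_name", None) or getattr(llm, "model", "")).lower()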

Metadata

Assignees
No one assigned

Labels
bug (Something isn't working)

Milestone
No milestone

Development
No branches or pull requests