--cache-ram exception #10631

@alexheretic

Description

Custom Node Testing

Expected Behavior

Running ComfyUI with --cache-ram 32 should work without errors.

This relates to #10454 (cc @rattus128)

Actual Behavior

An exception (Exception in thread Thread-2 (prompt_worker)) is raised when running a Wan workflow.

Steps to Reproduce

Start ComfyUI with --cache-ram 32 and run a Wan 14B workflow.

Debug Logs

Total VRAM 16368 MB, total RAM 64218 MB                                                                                                                                   
pytorch version: 2.9.0+rocm6.4                                                                                                                                            
AMD arch: gfx1100                                                                                                                                                         
ROCm version: (6, 4)                                                                                                                                                      
Set vram state to: NORMAL_VRAM                                                                                                                                            
Device: cuda:0 AMD Radeon RX 7900 GRE : native                                                                                                                            
Using Flash Attention                                                                                                                                                     
Python version: 3.13.7 (main, Aug 15 2025, 12:34:02) [GCC 15.2.1 20250813]                                                                                                
ComfyUI version: 0.3.67                                                                                                                                                   
ComfyUI frontend version: 1.28.8

...

Exception in thread Thread-2 (prompt_worker):
Traceback (most recent call last):
  File "/usr/lib/python3.13/threading.py", line 1043, in _bootstrap_inner
    self.run()
    ~~~~~~~~^^
  File "/usr/lib/python3.13/threading.py", line 994, in run
    self._target(*self._args, **self._kwargs)
    ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alex/ComfyUI/main.py", line 202, in prompt_worker
    e.execute(item[2], prompt_id, extra_data, item[4])
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alex/ComfyUI/execution.py", line 664, in execute
    asyncio.run(self.execute_async(prompt, prompt_id, extra_data, execute_outputs))
    ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/base_events.py", line 725, in run_until_complete
    return future.result()
           ~~~~~~~~~~~~~^^
  File "/home/alex/ComfyUI/execution.py", line 720, in execute_async
    self.caches.outputs.poll(ram_headroom=self.cache_args["ram"])
    ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alex/ComfyUI/comfy_execution/caching.py", line 411, in poll
    scan_list_for_ram_usage(outputs)
    ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
  File "/home/alex/ComfyUI/comfy_execution/caching.py", line 402, in scan_list_for_ram_usage
    for output in outputs:
                  ^^^^^^^
TypeError: 'NoneType' object is not iterable
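The final frame shows the failure: scan_list_for_ram_usage receives a None outputs entry from the cache and tries to iterate it. Below is a minimal, hypothetical reconstruction of that loop plus a defensive None guard; the real helper in comfy_execution/caching.py differs, and everything beyond the iteration pattern visible in the traceback is an assumption for illustration.

```python
def scan_list_for_ram_usage(outputs):
    # Hypothetical stand-in for the helper in comfy_execution/caching.py.
    # Only the iteration pattern matches the traceback; the per-output
    # accounting is a placeholder.
    total = 0
    for output in outputs:  # TypeError raised here when outputs is None
        total += 1
    return total


def scan_list_for_ram_usage_guarded(outputs):
    # Same loop with a None guard: a None cache entry contributes nothing.
    total = 0
    for output in outputs or ():
        total += 1
    return total


# The cache apparently holds a None entry for this workflow:
try:
    scan_list_for_ram_usage(None)
except TypeError as e:
    print(e)  # 'NoneType' object is not iterable

print(scan_list_for_ram_usage_guarded(None))       # 0
print(scan_list_for_ram_usage_guarded([1, 2, 3]))  # 3
```

A guard like this (or skipping None entries before calling the helper in poll) would be one way to avoid the crash; where the None entry comes from in the Wan 14B workflow is a separate question.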

Other

No response
