When I use pdb.set_trace() in the latest code, it crashes with the traceback below.
How do I use pdb in the latest vllm?
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/opt/vllm/vllm/worker/model_runner_base.py", line 116, in _wrapper
    return func(*args, **kwargs)
  File "/opt/vllm/vllm/worker/model_runner.py", line 1590, in execute_model
    hidden_or_intermediate_states = model_executable(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/vllm/vllm/model_executor/models/llama.py", line 479, in forward
    model_output = self.model(input_ids, positions, kv_caches,
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/vllm/vllm/model_executor/models/llama.py", line 355, in forward
    hidden_states, residual = layer(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/vllm/vllm/model_executor/models/llama.py", line 268, in forward
    hidden_states = self.self_attn(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/vllm/vllm/model_executor/models/llama.py", line 191, in forward
    q, k, v = qkv.split([self.q_size, self.kv_size, self.kv_size], dim=-1)
File "/opt/vllm/vllm/model_executor/models/llama.py", line 191, in forward
q, k, v = qkv.split([self.q_size, self.kv_size, self.kv_size], dim=-1)
File "/usr/lib/python3.10/bdb.py", line 90, in trace_dispatch
return self.dispatch_line(frame)
File "/usr/lib/python3.10/bdb.py", line 115, in dispatch_line
if self.quitting: raise BdbQuit
bdb.BdbQuit
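
My guess at what is going on (an assumption on my side, not anything from the vllm docs): the model executes inside a worker started as SpawnProcess-1, and a multiprocessing child started with the spawn method has its stdin redirected to /dev/null, so pdb reads EOF right away, marks itself as quitting, and raises bdb.BdbQuit on the next traced line — the bdb.py frames at the bottom of the traceback. The standalone sketch below reproduces that failure mode without any vllm code:

```python
# Minimal, self-contained sketch of the suspected failure mode (an assumption
# about the cause, not vLLM code): pdb.set_trace() inside a "spawn"-started
# worker has no interactive stdin, so pdb reads EOF and raises bdb.BdbQuit.
import multiprocessing as mp
import pdb


def worker():
    x = 1 + 1
    pdb.set_trace()  # stdin is /dev/null in the child -> pdb reads EOF and quits
    print(x)         # next traced line -> bdb raises BdbQuit, as in the traceback above


if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # same start method as SpawnProcess-1 in the log
    p = ctx.Process(target=worker)
    p.start()
    p.join()
```

If that is the cause, a breakpoint inside worker code would need a debugger that does not rely on the worker's stdin (for example a socket-based one such as the third-party remote-pdb package), or a setup that keeps model execution in the process that owns the terminal. I mention these only as things I plan to try, not as documented vllm behavior.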