
Executing llama on Windows causes a ModuleNotFoundError for termios #726

Open
1 of 2 tasks
briancabbott opened this issue Jan 5, 2025 · 3 comments

@briancabbott
System Info

:128: RuntimeWarning: 'torch.utils.collect_env' found in sys.modules after import of package 'torch.utils', but prior to execution of 'torch.utils.collect_env'; this may result in unpredictable behaviour
Collecting environment information...
PyTorch version: 2.5.1+cu118
Is debug build: False
CUDA used to build PyTorch: 11.8
ROCM used to build PyTorch: N/A

OS: Microsoft Windows 11 Home
GCC version: Could not collect
Clang version: Could not collect
CMake version: version 3.30.2
Libc version: N/A

Python version: 3.12.8 (tags/v3.12.8:2dc476b, Dec 3 2024, 19:30:04) [MSC v.1942 64 bit (AMD64)] (64-bit runtime)
Python platform: Windows-11-10.0.22631-SP0
Is CUDA available: True
CUDA runtime version: Could not collect
CUDA_MODULE_LOADING set to: LAZY
GPU models and configuration: GPU 0: NVIDIA GeForce RTX 3050 6GB Laptop GPU
Nvidia driver version: 561.00
cuDNN version: Could not collect
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Architecture=9
CurrentClockSpeed=2300
DeviceID=CPU0
Family=1
L2CacheSize=18432
L2CacheSpeed=
Manufacturer=GenuineIntel
MaxClockSpeed=2500
Name=Intel(R) Core(TM) Ultra 9 185H
ProcessorType=3
Revision=

Versions of relevant libraries:
[pip3] numpy==2.2.1
[pip3] torch==2.5.1+cu118
[pip3] torchaudio==2.5.1+cu118
[pip3] torchvision==0.20.1+cu118
[conda] Could not collect

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

When executing the llama command, a ModuleNotFoundError is thrown for termios.
The termios module is only available on Unix-like systems; msvcrt should be used on Windows.
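A minimal sketch of the platform split described above. The `terminal_backend` helper is hypothetical, purely for illustration, and not part of llama-stack: it shows that termios only imports on POSIX systems, while Windows provides msvcrt instead.

```python
import sys


def terminal_backend() -> str:
    """Return the name of the terminal-control module on this platform.

    Hypothetical helper for illustration: termios exists only on
    Unix-like systems, while Windows ships msvcrt instead.
    """
    if sys.platform == "win32":
        import msvcrt  # Windows console I/O primitives
        return msvcrt.__name__
    import termios  # POSIX terminal control
    return termios.__name__
```

Any code path that unconditionally imports termios (directly, or transitively via pty/tty as in the traceback below) will raise ModuleNotFoundError on Windows.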

Error logs

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\brian\python\Scripts\llama.exe\__main__.py", line 4, in <module>
  File "C:\Users\brian\dev_space\Agentic Capital\llama-stack-orig\llama_stack\__init__.py", line 7, in <module>
    from llama_stack.distribution.library_client import (  # noqa: F401
  File "C:\Users\brian\dev_space\Agentic Capital\llama-stack-orig\llama_stack\distribution\library_client.py", line 34, in <module>
    from llama_stack.distribution.build import print_pip_install_help
  File "C:\Users\brian\dev_space\Agentic Capital\llama-stack-orig\llama_stack\distribution\build.py", line 23, in <module>
    from llama_stack.distribution.utils.exec import run_with_pty
  File "C:\Users\brian\dev_space\Agentic Capital\llama-stack-orig\llama_stack\distribution\utils\exec.py", line 10, in <module>
    import pty
  File "C:\Users\brian\python\Lib\pty.py", line 12, in <module>
    import tty
  File "C:\Users\brian\python\Lib\tty.py", line 5, in <module>
    from termios import *
ModuleNotFoundError: No module named 'termios'
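The traceback shows the root cause is a module-level `import pty` in `llama_stack/distribution/utils/exec.py`, so merely importing the CLI fails on Windows. One possible shape of a fix (my sketch under that assumption, not the project's actual patch) is to defer the POSIX-only import into the function that needs it and fall back to a plain subprocess on Windows:

```python
import subprocess
import sys


def run_with_pty(command):
    """Sketch of a Windows-tolerant run_with_pty.

    On POSIX, the real implementation would allocate a pseudo-terminal via
    the pty module, imported lazily so that simply importing this file does
    not crash on Windows. On Windows, fall back to an ordinary subprocess.
    """
    if sys.platform != "win32":
        import pty  # POSIX-only; lazy import keeps this module importable on Windows
        # ... the pty-based execution path would go here in a real implementation ...
    return subprocess.run(command, check=False).returncode
```

The key point is moving the import inside the function: Python only evaluates it when a PTY is actually requested, so the `llama` entry point can at least load and print its help text on Windows.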

Expected behavior

The llama stack command should execute normally, displaying the basic help text:

PS C:\Users\brian\dev_space\Agentic Capital\llama-stack> llama stack
usage: llama [-h] {model,stack,download,verify-download} ...

Welcome to the Llama CLI

options:
-h, --help show this help message and exit

subcommands:
{model,stack,download,verify-download}
PS C:\Users\brian\dev_space\Agentic Capital\llama-stack>

@abhishek-syno

I encountered the same issue; the old llama config YAML was causing problems. You can check the latest sample file for the required parameters.

@briancabbott
Author

I encountered the same issue; the old llama config YAML was causing problems. You can check the latest sample file for the required parameters.

So, is it a non-issue then? I committed a resolution.

@0-Vanes-0
Copy link

I followed the official installation instructions and encountered the same error.
