
[SO WIN10] [Llama3.2-11B-Vision-Instruct ] "llama stack run" ModuleNotFoundError: No module named 'termios' #186

Open
RookieReverse opened this issue Oct 23, 2024 · 2 comments


@RookieReverse

Executing: llama stack run C:\Users\central\.llama\checkpoints\Llama3.2-11B-Vision-Instruct\config.yml

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\WINDOWS\system32\llama-env\Scripts\llama.exe\__main__.py", line 7, in <module>
  File "C:\WINDOWS\system32\llama-env\Lib\site-packages\llama_stack\cli\llama.py", line 44, in main
    parser.run(args)
  File "C:\WINDOWS\system32\llama-env\Lib\site-packages\llama_stack\cli\llama.py", line 38, in run
    args.func(args)
  File "C:\WINDOWS\system32\llama-env\Lib\site-packages\llama_stack\cli\stack\run.py", line 50, in _run_stack_run_cmd
    from llama_stack.distribution.build import ImageType
  File "C:\WINDOWS\system32\llama-env\Lib\site-packages\llama_stack\distribution\build.py", line 12, in <module>
    from llama_stack.distribution.utils.exec import run_with_pty
  File "C:\WINDOWS\system32\llama-env\Lib\site-packages\llama_stack\distribution\utils\exec.py", line 9, in <module>
    import pty
  File "C:\Program Files\Python\Lib\pty.py", line 12, in <module>
    import tty
  File "C:\Program Files\Python\Lib\tty.py", line 5, in <module>
    from termios import *
ModuleNotFoundError: No module named 'termios'
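For context, this failure is expected on Windows: `termios`, `tty`, and `pty` are POSIX-only modules in the Python standard library, so any code path that reaches `import pty` will raise this error on Windows no matter which third-party packages (windows-curses, pyreadline, etc.) are installed. A small sketch to confirm which of these modules are absent on a given platform (the `missing_posix_modules` helper name is ours, for illustration only):

```python
import importlib.util
import sys

# termios, tty, and pty are POSIX-only stdlib modules; on Windows,
# importlib.util.find_spec() returns None for them, which is why the
# "import pty" inside llama_stack's exec.py fails there.
POSIX_ONLY = ("termios", "tty", "pty")

def missing_posix_modules():
    """Return the POSIX-only modules that cannot be found on this platform."""
    return [name for name in POSIX_ONLY if importlib.util.find_spec(name) is None]

if __name__ == "__main__":
    # On Windows this prints all three names; on Linux/macOS it prints [].
    print(f"{sys.platform}: missing {missing_posix_modules()}")
```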

Packages already installed:

C:\Users\central>pip install windows-curses
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: windows-curses in c:\program files\python\lib\site-packages (2.4.0)

C:\Users\central\.llama\checkpoints>pip show llama-stack
Name: llama_stack
Version: 0.0.43
Summary: Llama Stack
Home-page: https://github.com/meta-llama/llama-stack
Author: Meta Llama
Author-email: [email protected]
License:
Location: C:\Program Files\Python\Lib\site-packages
Requires: blobfile, fire, httpx, huggingface-hub, llama-models, prompt-toolkit, pydantic, python-dotenv, requests, rich, setuptools, termcolor
Required-by:


Cannot run llama-stack on Windows.

@RookieReverse
Author

Installing pyreadline does not help either:

(llama-env) C:\WINDOWS\system32>pip install pyreadline
Collecting pyreadline
Downloading pyreadline-2.1.zip (109 kB)
---------------------------------------- 109.2/109.2 kB 791.6 kB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: pyreadline
Building wheel for pyreadline (pyproject.toml) ... done
Created wheel for pyreadline: filename=pyreadline-2.1-py3-none-any.whl size=93855 sha256=5407c7ad21c5aae9476e939ebd33df5f5f54747f6dbde9ce1e78cc8737e7b8f2
Stored in directory: c:\users\central\appdata\local\pip\cache\wheels\17\3c\0e\500cdf5314c51559ab97f43fbb32fdfd1051a165fbca6552a2
Successfully built pyreadline
Installing collected packages: pyreadline
Successfully installed pyreadline-2.1

[notice] A new release of pip is available: 24.0 -> 24.2
[notice] To update, run: python.exe -m pip install --upgrade pip

(llama-env) C:\WINDOWS\system32>llama stack run C:\Users\central\.llama\checkpoints\Llama3.2-11B-Vision-Instruct\config.yml

(Same traceback as before, again ending in:)
ModuleNotFoundError: No module named 'termios'

@ashwinb
Contributor

ashwinb commented Oct 26, 2024

This has been a long-standing concern. We will prioritize and address this more urgently in the next few days.
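Until an official fix lands, a common workaround pattern is to guard the POSIX-only import and fall back to a plain subprocess on Windows. A minimal sketch, assuming a hypothetical `run_command` helper (this is not the actual llama-stack patch):

```python
import subprocess
import sys

def run_command(cmd):
    """Run cmd with a pseudo-terminal on POSIX, plain subprocess on Windows.

    Hypothetical cross-platform guard, for illustration only.
    """
    if sys.platform == "win32":
        # No pty/termios on Windows: fall back to an ordinary subprocess.
        return subprocess.run(cmd, check=False).returncode
    # Import lazily so the POSIX-only module is never evaluated on Windows.
    import pty
    # pty.spawn returns the child's waitpid() exit status.
    return pty.spawn(cmd)
```

The key point is the lazy `import pty`: moving the import inside the POSIX branch means merely loading the module no longer crashes on Windows, which is what the traceback above shows happening at import time in exec.py.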
