Pr 5175 nowarmup verboseprompt #5733
base: micn/embedded-llama-server
Conversation
So I am in a quandary. I think my PR adding --verbose-prompt and --no-warmup should be closed and deleted. I noticed that the llama-embedded branch was updated so that stdout and stderr now go to null, so there is no need for --verbose or --verbose-prompt. And if I add --no-warmup to a new PR for not spawning a window for llama-server when running the GUI on Windows, the PR I just created is no longer needed.

I just tested not spawning a window (which never happened in the CLI): with that change, llama-server no longer spawns a window in the GUI under Windows, and with --no-warmup both the CLI and the GUI start the larger gpt-oss in time to respond to the first prompt.

Problem: the GUI version of goose still doesn't kill the llama-server process, at least under Windows. The CLI did, and still does, stop llama-server when the goose CLI exits. But with no window, the user would need to open Task Manager or similar to kill the llama-server process.

I can create a PR with the change to not spawn a window for llama-server under Windows, but making sure the GUI version of goose kills the llama-server process should probably happen first.
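For context, here is a minimal sketch of the two pieces being discussed, assuming a Rust-side launcher; the `LlamaServerGuard` and `spawn_llama_server` names are hypothetical, not goose's actual code. It spawns llama-server with `CREATE_NO_WINDOW` on Windows so no console appears, and holds the child in a guard that kills the process on drop, which is the kind of teardown the GUI shutdown path would need.

```rust
use std::process::{Child, Command};

/// Hypothetical guard: stops the embedded llama-server when dropped, so a GUI
/// shutdown path tears the process down even though it has no visible window.
struct LlamaServerGuard {
    child: Child,
}

impl Drop for LlamaServerGuard {
    fn drop(&mut self) {
        // Best-effort kill; ignore errors if the process already exited.
        let _ = self.child.kill();
        let _ = self.child.wait();
    }
}

fn spawn_llama_server(server_path: &str, model_path: &str) -> std::io::Result<LlamaServerGuard> {
    let mut cmd = Command::new(server_path);
    cmd.arg("-m")
        .arg(model_path)
        // Skip the warmup pass so large models are ready for the first prompt sooner.
        .arg("--no-warmup");

    // On Windows, suppress the extra console window when launched from the GUI.
    #[cfg(windows)]
    {
        use std::os::windows::process::CommandExt;
        const CREATE_NO_WINDOW: u32 = 0x0800_0000;
        cmd.creation_flags(CREATE_NO_WINDOW);
    }

    let child = cmd.spawn()?;
    Ok(LlamaServerGuard { child })
}
```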
Summary
--no-warmup loads larger models faster; --verbose-prompt shows the actual prompts sent to the model.
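As a rough illustration of what the change amounts to (the helper name and `verbose_prompt` parameter are hypothetical, not the actual goose code), the flags would be appended when building the embedded llama-server invocation:

```rust
use std::process::Command;

/// Hypothetical helper: where the new flags would be appended when building
/// the embedded llama-server command line.
fn apply_launch_flags(cmd: &mut Command, verbose_prompt: bool) {
    // Skip the warmup pass so large models answer the first prompt sooner.
    cmd.arg("--no-warmup");

    // Optionally have llama-server log the exact prompts it receives.
    if verbose_prompt {
        cmd.arg("--verbose-prompt");
    }
}
```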
Type of Change
AI Assistance
Testing
Manual
Related Issues
Relates to #5175