[jinja] GPT4All 3.6.1 TheBloke/Wizard-Vicuna-13B-Uncensored-GGUF chat template issues #3365

ThiloteE opened this issue Jan 5, 2025

Meta-issue: #3340

Bug Report

The model does not work out of the box.

Steps to Reproduce

  1. Download the GGUF
  2. Sideload it in GPT4All-Chat
  3. Start chatting

Expected Behavior

The model works out of the box.

Your Environment

Jinja template out of the box in GPT4All-Chat 3.6.1 (does not work)

The model does not ship a Jinja chat template in its tokenizer_config.json, so GPT4All-Chat 3.6.1 has no template to apply out of the box.
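
For reference, one quick way to confirm this is to look for a `chat_template` key in the upstream repository's tokenizer_config.json. A minimal sketch, assuming the file has already been downloaded locally (the path is illustrative):

```python
import json

# Path is an assumption; point it at the tokenizer_config.json from the
# original model repository (not the GGUF file itself).
with open("tokenizer_config.json") as f:
    config = json.load(f)

# Converters and chat UIs typically read the template from this key.
print("chat_template" in config)  # expected: False for this model
```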

Jinja template made by ThiloteE (maybe works?)

{{- bos_token }}
{%- for message in messages %}
    {{- message['role'] + message['content'] }}
{%- endfor %}
{%- if add_generation_prompt %}
    {{- 'ASSISTANT:' }}
{%- else %}
    {{- eos_token }}
{%- endif %}
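
As a sanity check (not part of the issue itself), the proposed template can be rendered locally with the `jinja2` Python package to see what prompt it produces. The special tokens, role strings, and message content below are illustrative assumptions; note that the template concatenates `role` and `content` directly, so any separator has to come from those strings.

```python
from jinja2 import Template

# The proposed template, collapsed into a single string for rendering.
PROPOSED_TEMPLATE = (
    "{{- bos_token }}"
    "{%- for message in messages %}"
    "{{- message['role'] + message['content'] }}"
    "{%- endfor %}"
    "{%- if add_generation_prompt %}"
    "{{- 'ASSISTANT:' }}"
    "{%- else %}"
    "{{- eos_token }}"
    "{%- endif %}"
)

# Illustrative inputs only; the role string carries its own separator because
# the template joins role and content with no extra whitespace.
messages = [
    {"role": "USER: ", "content": "Hello!\n"},
]

prompt = Template(PROPOSED_TEMPLATE).render(
    bos_token="<s>",
    eos_token="</s>",
    messages=messages,
    add_generation_prompt=True,
)
print(repr(prompt))  # '<s>USER: Hello!\nASSISTANT:'
```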

Recommended system prompt:

The model was apparently fine-tuned with this system prompt: "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions."
