
GPT4All V3.6.1 -> Jinja error #3348

Open · 1015g opened this issue Dec 22, 2024 · 3 comments

Labels: bug-unconfirmed, chat (gpt4all-chat issues)

Comments


1015g commented Dec 22, 2024

jq -r ".chat_template" tokenizer_config.json

This is the Jinja chat template that GPT4All picks up out of the box:

{% for message in messages %}{{'<|im_start|>' + message['role'] + '
' + message['content']}}{% if not loop.last or (loop.last and message['role'] != 'assistant') %}{{'<|im_end|>
'}}{% endif %}{% endfor %}{% if add_generation_prompt and messages[-1]['role'] != 'assistant' %}{{ '<|im_start|>assistant
' }}{% endif %}

[Screenshot of the error attached]
(#3340)

1015g added the bug-unconfirmed and chat (gpt4all-chat issues) labels on Dec 22, 2024
1015g (Author) commented Dec 22, 2024

Sorry, I don't know how to modify it based on the prompts, because I don't understand any of this. And this is just one of many local models I have that show this problem, which is terrible! I can't believe an update could cause this!

ThiloteE (Collaborator) commented Dec 23, 2024

As long as you do not provide a link to the model you are using, nobody can reproduce the issue or test an improved chat template. But just from looking at the Jinja, the following could work. Please try:

{%- for message in messages %}
    {{- '<|im_start|>' + message['role'] + '\n' + message['content'] }}
    {%- if not loop.last or (loop.last and message['role'] != 'assistant') %}
        {{- '<|im_end|>\n' }}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt and messages[-1]['role'] != 'assistant' %}
    {{- '<|im_start|>assistant\n' }}
{%- endif %}
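
For anyone who wants to sanity-check the template before pasting it into GPT4All, here is a minimal sketch that renders it with the Python jinja2 package (my assumption as a test harness; GPT4All ships its own Jinja engine, so rendering cleanly here does not guarantee GPT4All will accept it):

from jinja2 import Environment

# The suggested template, verbatim (raw string so the '\n' escapes reach Jinja unchanged).
TEMPLATE = r"""{%- for message in messages %}
    {{- '<|im_start|>' + message['role'] + '\n' + message['content'] }}
    {%- if not loop.last or (loop.last and message['role'] != 'assistant') %}
        {{- '<|im_end|>\n' }}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt and messages[-1]['role'] != 'assistant' %}
    {{- '<|im_start|>assistant\n' }}
{%- endif %}"""

# A small example conversation to render.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

rendered = Environment().from_string(TEMPLATE).render(
    messages=messages, add_generation_prompt=True
)
print(rendered)

The output should be a ChatML-style prompt that ends with '<|im_start|>assistant' and a newline, ready for the model to complete.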

1015g (Author) commented Dec 23, 2024


Thank you very much for your help! Here is the model link: https://huggingface.co/tastypear/CausalLM-7B-DPO-alpha-GGUF
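
If it helps with reproduction, here is a quick sketch for pulling a GGUF file from that repo with huggingface_hub (the filename below is a placeholder I made up; use whichever quantization the repo actually lists):

from huggingface_hub import hf_hub_download

# Download one quantization of the model for local testing in GPT4All.
path = hf_hub_download(
    repo_id="tastypear/CausalLM-7B-DPO-alpha-GGUF",
    filename="causallm_7b-dpo-alpha.Q4_K_M.gguf",  # placeholder filename, check the repo's file list
)
print(path)  # local path to the downloaded .gguf file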
