How do I use the system prompt and prompt template effectively? #2037
Replies: 4 comments 2 replies
-
I'll assume you're using the GPT4All Chat UI and not the bindings. It depends on the model you are using. If you got it from TheBloke, his README will have an example of what the prompt template (and system prompt, if applicable) are supposed to look like. ChatML and similar formats should generally only be used with models that have the required special tokens in their vocabulary; otherwise, they will indeed look like plain text to the model. A recent change as of v2.7.1 is that prompt templates should contain …
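For example, a typical Alpaca-style template (illustrative only, not copied from any particular README) looks like this, using the chat UI's placeholder convention:

```
### Instruction:
%1

### Response:
%2
```

Here `%1` is replaced with the user's message; `%2`, where the version supports it, marks where the model's reply is slotted into the conversation history.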
-
Ah, got it. Which models are from TheBloke, though? I'm using Hermes 13b for this particular instance.
-
How do I use a prompt template from Python?
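For context, I'm imagining something like the sketch below. In the Python bindings the template is (depending on version) supplied to `GPT4All.chat_session()`, but the substitution itself is just string replacement of the `%1` placeholder. `apply_template` and the Alpaca-style template string here are illustrative, not part of the gpt4all API, and no model is loaded:

```python
def apply_template(template: str, user_prompt: str) -> str:
    """Substitute the user's message into a GPT4All-style prompt template,
    where %1 marks the user-prompt placeholder."""
    return template.replace("%1", user_prompt)

# Hypothetical Alpaca-style template; real templates come from the model's README.
alpaca_template = "### Instruction:\n%1\n\n### Response:\n"

prompt = apply_template(alpaca_template, "Summarize this paragraph.")
print(prompt)
```

Is that roughly what the bindings do under the hood, and which parameter do I pass the template to?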
-
The template uses Jinja now. What is an example template?
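Something in the style of the HuggingFace chat templates, I assume? For instance, a ChatML-style Jinja template might look like this (the variable names `messages` and `add_generation_prompt` follow the HuggingFace convention and may differ here):

```jinja
{%- for message in messages %}
<|im_start|>{{ message['role'] }}
{{ message['content'] }}<|im_end|>
{%- endfor %}
{%- if add_generation_prompt %}
<|im_start|>assistant
{%- endif %}
```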
-
I hope this is the right place to ask for this.
What is the specific syntax or markup language that is used in these fields? And what's the best practice for what to put in these?
I have made an attempt using the Llama-2 prompt format and the ChatML markup, but both seem to be parsed by the model as plain text rather than as special instructions. Or perhaps I'm missing something?
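For concreteness, the ChatML markup I tried looks roughly like this; my understanding is that a model without `<|im_start|>`/`<|im_end|>` as special tokens in its vocabulary will read these as ordinary text:

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Hello!<|im_end|>
<|im_start|>assistant
```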