This repository has been archived by the owner on Sep 12, 2024. It is now read-only.
I use the Llama.cpp backend and tried to use the stopSequence parameter, which is a string, but it does not seem to work as expected. Maybe I am misusing it. Question: is this parameter supposed to work like the stop one in the Python Llama.cpp lib? It is an array of strings in that lib; quoting their doc:
A list of strings to stop generation when encountered.
Example: I have a prompt that ends like this:
### Response: ```json
and I would like the LM to stop inference when it produces the ` token. I tried setting the stopSequence parameter to the ` or ``` string, but inference does not stop. How do I use the stopSequence parameter?
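For clarity, here is a minimal sketch of the behavior I expect from a stop-sequence parameter (matching the Python llama-cpp `stop` semantics): the output is truncated at the earliest occurrence of any stop string. This is a hypothetical illustration, not the library's actual implementation:

```python
def apply_stop_sequences(generated: str, stop: list[str]) -> str:
    """Truncate `generated` at the earliest occurrence of any stop string.

    Hypothetical sketch: a real backend checks stop strings incrementally
    during token generation rather than post-processing the full text.
    """
    cut = len(generated)
    for s in stop:
        idx = generated.find(s)
        if idx != -1:
            cut = min(cut, idx)
    return generated[:cut]

# With my prompt ending in ```json, I expect generation to halt
# once the closing fence appears:
print(apply_stop_sequences('{"answer": 42}\n``` trailing text', ["```"]))
# → {"answer": 42}
```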