LLaMa is kinda insane #215
Mr-NI started this conversation in Show and tell
-
Yes, LLaMA was not fine-tuned and contains all kinds of hostile and NSFW content scraped from the web.
-
You can already run a fine-tuned 7B model with llama.cpp using https://github.com/tloen/alpaca-lora. Follow the install instructions and run export_state_dict_checkpoint.py; you will get a .pth checkpoint that can be converted and quantized with llama.cpp (rough sketch below).
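For anyone following along, here is a minimal sketch of that pipeline in Python, just wrapping the command-line steps with subprocess. The script names, model paths, and quantization arguments are assumptions based on the alpaca-lora and early llama.cpp repos of that time, so adjust them to your checkout.

```python
# Rough sketch of the export -> convert -> quantize pipeline described above.
# All paths, script names, and arguments are assumptions; adapt to your setup.
import subprocess

# 1. In the alpaca-lora checkout: merge the LoRA weights into a plain
#    PyTorch state dict (.pth), as described in the alpaca-lora README.
subprocess.run(["python", "export_state_dict_checkpoint.py"], check=True)

# 2. In the llama.cpp checkout: convert the .pth checkpoint to ggml fp16
#    (convert-pth-to-ggml.py shipped with llama.cpp at the time; "1" = fp16).
subprocess.run(["python", "convert-pth-to-ggml.py", "models/7B/", "1"], check=True)

# 3. Quantize the fp16 model to 4-bit (q4_0) with the quantize tool.
subprocess.run(
    ["./quantize", "models/7B/ggml-model-f16.bin",
     "models/7B/ggml-model-q4_0.bin", "2"],
    check=True,
)
```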
-
Is this normal?