Experimenting with building a memory system for conversational AI.
Requires a gpt4all model in a models subfolder. Currently using a gpt4all-lora-unfiltered-quantized.bin model that has been converted.
You can obtain the unfiltered model from here:
You will need to convert it per the instructions found here: https://github.com/nomic-ai/pyllamacpp#gpt4all
Place the model in the models/ folder and name it gpt4all-lora-unfiltered-quantized-converted.bin.
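As a quick sanity check after conversion, a small Python snippet can confirm the model file landed where the code expects it. The helper name below is hypothetical, not part of this project; only the path and filename come from the steps above.

```python
from pathlib import Path

# Expected location per the setup steps above
MODEL_PATH = Path("models") / "gpt4all-lora-unfiltered-quantized-converted.bin"

def model_available(path: Path = MODEL_PATH) -> bool:
    """Return True if the converted model file is in place."""
    return path.is_file()

if not model_available():
    print(f"Model not found at {MODEL_PATH}; see the conversion instructions above.")
```

Running this before starting the app gives a clearer error than a failed model load.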