RuneLite plugin integration with Ollama through a Python backend to respond to chat messages in Old School RuneScape with generative AI
Demo: chatpippreview.mp4
Requirements:
- Your custom RuneLite plugin
- Your deployed Ollama model
- The Python backend running
Open source Old School RuneScape client
https://github.com/runelite/runelite
RuneLite developer wiki: https://github.com/runelite/runelite/wiki
Set up your own plugin, then incorporate the code from ExamplePlugin.java into your plugin.
Essentially, on each chat message, the plugin sends a POST request to the Python backend.
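While wiring things up, you can simulate the plugin's request from a short script. This is only a test sketch: the endpoint path (`/chat`) and the field names below are assumptions and must match whatever app.py actually expects.

```python
import requests

# Hypothetical payload: the route ("/chat") and field names ("message",
# "sender", "fc") are assumptions -- match them to your app.py.
payload = {
    "message": "hello there!",
    "sender": "SomePlayer",
    "fc": "Friend Chat",
}
resp = requests.post("http://127.0.0.1:5000/chat", json=payload, timeout=30)
print(resp.status_code, resp.text)
```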
Free, with no API key needed and no call limits. It runs locally, which consumes computing resources, and cannot compete with extremely large language models like GPT-4, but Llama 3 works pretty well.
https://ollama.com/
https://github.com/ollama/ollama
Set up your own model. The Python backend calls the llama3 model, but you may use any other available model.
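As a rough illustration, calling Ollama from Python through its local REST API looks like the sketch below; the exact request app.py makes may differ, and port 11434 assumes a default Ollama install.

```python
import requests

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    r = requests.post(
        "http://127.0.0.1:11434/api/generate",  # Ollama's default local port
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["response"]

print(ask_ollama("Reply in a cute uwu way. Give short response. Player says: hi!"))
```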
This Python backend (app.py) opens a Flask server, accepts the POST request from the RuneLite plugin, and sends the message to Ollama for a response.
After a response is received, the pyautogui library types it out. A queue is implemented, and a message is only processed once it reaches the front of the queue.
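Putting those pieces together, a minimal sketch of such a backend could look like the following. The route path, payload field name, and typing delay are assumptions, not a copy of app.py; adjust them to your setup.

```python
# Minimal sketch: Flask receives the POST, a queue serializes messages,
# a worker thread asks Ollama, and pyautogui types the reply into the
# focused game client.
import queue
import threading

import pyautogui
import requests
from flask import Flask, request

app = Flask(__name__)
messages = queue.Queue()

PROMPT = "Reply in a cute uwu way. Give short response."

def ask_ollama(text: str, model: str = "llama3") -> str:
    r = requests.post(
        "http://127.0.0.1:11434/api/generate",
        json={"model": model, "prompt": f"{PROMPT} {text}", "stream": False},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["response"]

def worker():
    # Messages are handled strictly one at a time, in arrival order.
    while True:
        text = messages.get()
        reply = ask_ollama(text)
        pyautogui.typewrite(reply, interval=0.05)  # type into the game client
        pyautogui.press("enter")
        messages.task_done()

@app.route("/chat", methods=["POST"])  # hypothetical route name
def chat():
    messages.put(request.json["message"])  # queued until it reaches the front
    return "", 204

if __name__ == "__main__":
    threading.Thread(target=worker, daemon=True).start()
    app.run(port=5000)
```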
Prompt used: "Reply in a cute uwu way. Give short response."
Modify `model = "llama3"` if needed.
Modify `if fc == "Friend Chat":` and `elif fc == "Clan Chat":` to match the names of your chats.
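For orientation, that check might sit in a small helper like the hypothetical sketch below; replace the quoted strings with your own friends chat and clan chat names.

```python
# Hypothetical helper around the checks above: "fc" is the chat channel
# name the plugin reports with each message.
def should_reply(fc: str) -> bool:
    if fc == "Friend Chat":   # your friends chat name
        return True
    elif fc == "Clan Chat":   # your clan chat name
        return True
    return False
```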