Redis for AI with Gaia Node Examples

This repository contains two examples demonstrating how to integrate a **Gaia Node** (providing an OpenAI-compatible API for local LLMs) with **Redis** to enhance your AI applications.

Prerequisites

  • Gaia Node: You must have a Gaia Node running locally or accessible over the network. Note down its API URL (e.g., http://localhost:8080/v1) and the model name it serves (e.g., llama3b), and generate an API key.
  • Redis: You need a Redis server running. Standard Redis Open Source is sufficient for these examples.
  • Python Environment: Ensure you have Python installed. It's recommended to use a virtual environment.

Setup

  1. Clone or Download: Obtain the example files (basic.py, intermediate.py) and the requirements.txt file.
  2. Create .env File: In the same directory as the scripts, create a file named .env and add your Gaia Node credentials (the scripts read these values at startup; see the sketch after this list):
    GAIA_NODE_URL=http://your-gaia-node-address:port/v1 # e.g., http://localhost:8080/v1
    GAIA_API_KEY=your_actual_gaia_api_key
    GAIA_MODEL_NAME=your_model_name # e.g., llama3b
    # REDIS_URL=redis://localhost:6379 # Optional, defaults to local Redis
  3. Install Dependencies: Install the required Python packages: pip install -r requirements.txt
  4. Run an Example: Execute either Streamlit application:
    • streamlit run basic.py
    • streamlit run intermediate.py
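
For reference, here is a minimal sketch of how a script can pick up these settings. It assumes the python-dotenv package is available (add it to requirements.txt if it is not already listed) and uses the variable names from the .env example above.

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Read the .env file from the current directory into environment variables
load_dotenv()

GAIA_NODE_URL = os.getenv("GAIA_NODE_URL")      # e.g., http://localhost:8080/v1
GAIA_API_KEY = os.getenv("GAIA_API_KEY")
GAIA_MODEL_NAME = os.getenv("GAIA_MODEL_NAME")  # e.g., llama3b
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379")  # optional; defaults to local Redis
```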

Example 1: Basic - Gaia Node & Redis Integration

What it does:

This example sets up the fundamental connection between your application, a Gaia Node, and Redis.

  • Gaia Node: Sends chat messages to your Gaia Node using the langchain-openai library and displays the AI's response.
  • Redis: Demonstrates storing and retrieving simple data (such as a message or a setting) using Redis as a fast key-value store.
  • UI: Provides a chat interface for the Gaia Node and a simple input section in the sidebar to interact with Redis.

Key Concepts:

  • Connecting to your Gaia Node's API.
  • Connecting to a Redis instance.
  • Performing basic SET and GET operations in Redis (illustrated in the sketch below).
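
A minimal sketch of these two connections might look like the following. It assumes the environment variables from the Setup section are set; the message text and key name are purely illustrative rather than taken from basic.py.

```python
import os

import redis
from langchain_openai import ChatOpenAI

# Chat client pointed at the Gaia Node's OpenAI-compatible endpoint
llm = ChatOpenAI(
    base_url=os.getenv("GAIA_NODE_URL"),   # e.g., http://localhost:8080/v1
    api_key=os.getenv("GAIA_API_KEY"),
    model=os.getenv("GAIA_MODEL_NAME"),    # e.g., llama3b
)

# Redis connection (defaults to a local instance)
r = redis.Redis.from_url(
    os.getenv("REDIS_URL", "redis://localhost:6379"),
    decode_responses=True,
)

# Send a chat message to the Gaia Node and print the reply
reply = llm.invoke("Explain what Redis is in one sentence.")
print(reply.content)

# Basic SET and GET with an illustrative key name
r.set("demo:note", "hello from the sidebar")
print(r.get("demo:note"))
```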

Example 2: Intermediate - Caching LLM Responses with Redis

What it does:

This example builds on the basic one by adding a powerful feature: caching. It prevents the application from asking the Gaia Node the same question repeatedly by storing previous answers in Redis.

  • Gaia Node: The chat functionality remains, but now the application checks Redis before asking the Gaia Node a question.
  • Redis (Caching):
    • Check: Before calling the Gaia Node, it calculates a unique key for the current question/conversation history and checks if an answer for this key exists in Redis.
    • Cache Hit: If found, it displays the answer directly from Redis, skipping the Gaia Node call (much faster!).
    • Cache Miss: If not found, it calls the Gaia Node, gets the answer, stores the answer in Redis under the calculated key (for future use), and then displays it.
  • UI: The chat works the same, but it now indicates whether a response came from the cache and shows how many items are currently cached in Redis.

Key Concepts:

  • Using Redis as a fast cache to store and retrieve LLM responses.
  • Improving application speed and reducing load on the Gaia Node by avoiding duplicate requests.
  • Generating unique keys for cache entries based on input (see the sketch below).
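
The caching flow can be sketched roughly as below. The key derivation (a SHA-256 hash of the serialized conversation history) and the one-hour expiry are illustrative assumptions, not necessarily the exact choices made in intermediate.py.

```python
import hashlib
import json
import os

import redis
from langchain_openai import ChatOpenAI

r = redis.Redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379"), decode_responses=True)
llm = ChatOpenAI(
    base_url=os.getenv("GAIA_NODE_URL"),
    api_key=os.getenv("GAIA_API_KEY"),
    model=os.getenv("GAIA_MODEL_NAME"),
)

def cached_chat(messages):
    """Answer `messages` (OpenAI-style role/content dicts), using Redis as a cache."""
    # Deterministic key derived from the full conversation history
    key = "llm_cache:" + hashlib.sha256(
        json.dumps(messages, sort_keys=True).encode("utf-8")
    ).hexdigest()

    cached = r.get(key)
    if cached is not None:
        return cached, True            # cache hit: no Gaia Node call needed

    # Cache miss: ask the Gaia Node, then store the answer for next time
    answer = llm.invoke(messages).content
    r.setex(key, 3600, answer)         # keep the entry for one hour (illustrative TTL)
    return answer, False

# Example usage
answer, was_cached = cached_chat([{"role": "user", "content": "What is Redis?"}])
print(("cached" if was_cached else "fresh") + ": " + answer)
```

A hit returns almost instantly from Redis, while a miss pays the model latency once and then stores the result, which is where the speed-up and the reduced load on the Gaia Node come from.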

How They Work Together

Redis acts as a high-speed data layer alongside your Gaia Node. The basic example shows simple data storage, while the intermediate example shows how this fast storage can be used strategically (caching) to make your AI application more efficient and responsive.
