Conduit AI Pipelines

Real-world examples of AI-powered data pipelines built with Conduit, demonstrating how to integrate modern AI services into production data workflows.

Overview

This repository showcases practical AI pipeline implementations using Conduit's data streaming capabilities. Each example demonstrates different aspects of building intelligent, real-time data processing systems that integrate with popular AI services like OpenAI, cloud storage, and vector databases.
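
Conduit pipelines are typically declared in a YAML configuration file that wires a source connector, optional processors, and a destination connector. The skeleton below is a minimal sketch of that shape only; the schema version, plugin names, and settings keys shown here are assumptions rather than values taken from this repository, so treat each example's own pipeline file as the authoritative configuration.

    # Illustrative skeleton of a Conduit pipeline configuration.
    # The schema version, plugin names, and settings keys are placeholders;
    # each example's own pipeline file is the authoritative configuration.
    version: "2.2"
    pipelines:
      - id: example-pipeline
        status: running
        description: source -> AI processing -> destination
        connectors:
          - id: source-db
            type: source
            plugin: builtin:postgres        # where records come from
            settings:
              url: postgres://user:pass@localhost:5432/app
          - id: destination-api
            type: destination
            plugin: builtin:http            # where enriched records go
            settings:
              url: https://example.com/webhook
        processors:
          - id: ai-enrichment
            plugin: webhook.http            # e.g. call an AI service per record
            settings:
              request.url: https://api.openai.com/v1/chat/completions

The example-specific sketches further down reuse this shape and show only the connectors and processors that matter for each scenario.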

Examples

Real-time customer support automation

  • Source: PostgreSQL (support tickets)
  • AI Processing: OpenAI GPT-4 summarization
  • Destination: Slack webhooks
  • Use Case: Automatically summarize new support tickets and notify teams (a configuration sketch follows below)
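
A condensed, hypothetical wiring for this example is sketched below; connector and processor names, plugin identifiers, and settings keys are illustrative assumptions, and the example's README and pipeline file define the real configuration.

    # Hypothetical sketch: support tickets -> GPT-4 summary -> Slack webhook.
    connectors:
      - id: support-tickets
        type: source
        plugin: builtin:postgres        # reads newly inserted support tickets
        settings:
          tables: support_tickets
      - id: slack-notify
        type: destination
        plugin: builtin:http            # posts the summary to a Slack incoming webhook
        settings:
          url: https://hooks.slack.com/services/<your-webhook-path>
    processors:
      - id: summarize-ticket
        plugin: webhook.http            # calls the OpenAI chat completions endpoint
        settings:
          request.url: https://api.openai.com/v1/chat/completions
          request.method: POST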

Real-time customer feedback monitoring

  • Source: PostgreSQL (customer reviews)
  • AI Processing: OpenAI GPT-4 sentiment classification
  • Destination: Slack webhooks
  • Use Case: Automatically classify customer review sentiment and notify teams (a configuration sketch follows below)
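
Structurally this pipeline mirrors the summarization example above; only the AI step changes. The hypothetical fragment below shows just that processor, with the prompt and settings keys as placeholders.

    # Hypothetical fragment: classify review sentiment instead of summarizing.
    processors:
      - id: classify-sentiment
        plugin: webhook.http            # same HTTP call pattern, different prompt
        settings:
          request.url: https://api.openai.com/v1/chat/completions
          request.method: POST
          # Illustrative prompt sent in the request body, e.g.:
          # "Classify the sentiment of this review as positive, neutral, or negative."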

Real-time document ingestion, processing, and intelligent search

  • Source: AWS S3 (documents)
  • AI Processing: Document parsing + OpenAI embeddings + vector search
  • Destination: PostgreSQL with pgvector
  • Use Case: Build searchable knowledge bases from document collections
  • Includes: Document parsing service + RAG query API (a configuration sketch follows below)
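
A hypothetical end-to-end sketch of this ingest pipeline is shown below. Plugin names, settings keys, and the parsing-service URL are illustrative assumptions; the example's pipeline file, parsing service, and RAG query API define the real wiring.

    # Hypothetical sketch: S3 documents -> parse -> embed -> Postgres/pgvector.
    connectors:
      - id: documents
        type: source
        plugin: builtin:s3              # watches a bucket for new documents
        settings:
          aws.bucket: my-documents-bucket
      - id: vector-store
        type: destination
        plugin: builtin:postgres        # table with a pgvector embedding column
        settings:
          table: document_chunks
    processors:
      - id: parse-document
        plugin: webhook.http            # sends the raw document to the parsing service
        settings:
          request.url: http://localhost:8080/parse
      - id: embed-text
        plugin: webhook.http            # requests embeddings from the OpenAI API
        settings:
          request.url: https://api.openai.com/v1/embeddings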

Real-time document ingestion and processing with a locally running chatbot UI

  • Source: AWS S3 (documents)
  • AI Processing: Document parsing + Ollama embeddings + vector search
  • Destination: Supabase
  • Use Case: Build a chatbot UI backed by a knowledge base built from document collections, using local and open-source software
  • Includes: Document parsing service (a configuration sketch follows below)
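
This pipeline has the same shape as the previous example but swaps the hosted pieces for local and open-source ones. The hypothetical fragment below shows only those swapped parts: embeddings come from a locally running Ollama server, and Supabase is written to over its standard Postgres connection. All names and keys are illustrative.

    # Hypothetical fragment: local embeddings via Ollama, storage in Supabase Postgres.
    connectors:
      - id: supabase-store
        type: destination
        plugin: builtin:postgres        # Supabase exposes a standard Postgres connection
        settings:
          url: postgres://postgres:<password>@db.<project-ref>.supabase.co:5432/postgres
    processors:
      - id: embed-locally
        plugin: webhook.http            # calls the local Ollama embeddings endpoint
        settings:
          request.url: http://localhost:11434/api/embeddings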

Quick Start

  1. Install Conduit

    # Download latest release
    curl -sSL https://get.conduit.io | sh
    
    # Or use Homebrew
    brew install conduit
  2. Choose an Example

    cd examples/summarize 
  3. Follow Setup Instructions: Each example includes detailed setup instructions and environment configuration in its respective README

Prerequisites

  • Conduit (latest version)
  • OpenAI API key for AI processing
  • Database access (PostgreSQL recommended)
  • Cloud storage credentials (AWS S3 for RAG example)

Features Demonstrated

  • AI Integration: OpenAI GPT models, embeddings, and text processing
  • Real-time Processing: Stream processing with immediate AI-powered transformations
  • Vector Databases: pgvector integration for similarity search and retrieval
  • API Orchestration: HTTP processors for external service integration
  • Data Transformation: Complex multi-step processing workflows
  • Custom Processing: JavaScript processors for domain-specific logic (see the sketch after this list)
  • Multiple Connectors: S3, PostgreSQL, HTTP webhooks, and more
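
As a concrete illustration of the custom-processing bullet above, the fragment below sketches a JavaScript processor embedded in a pipeline configuration. The custom.javascript plugin name and the process(rec) script contract are stated here as assumptions about Conduit's built-in JavaScript processor and may differ between versions, so verify them against the documentation for your release.

    # Hypothetical fragment: a JavaScript processor for domain-specific logic.
    processors:
      - id: tag-record
        plugin: custom.javascript       # plugin name may vary by Conduit version
        settings:
          script: |
            // Entry point expected by the JavaScript processor; the exact
            // contract (argument and return value) is an assumption here.
            function process(rec) {
              // Attach a metadata flag that downstream steps can read.
              rec.Metadata["ai-reviewed"] = "false";
              return rec;
            }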

Getting Help

Related Projects

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
