A simple demonstration of how to set up an AI agent using Hono and Vercel AI SDK with streaming support on Cloudflare Workers. This backend is designed to work with React frontend applications using the @ai-sdk/react library.
## Features

- **Hono Framework**: Lightweight web framework for Cloudflare Workers
- **Vercel AI SDK**: Streaming AI responses with Google Gemini
- **React Compatible**: Designed for use with the `@ai-sdk/react` library
- **Real-time Streaming**: Support for streaming AI responses
## Endpoints

- `GET /` - Health check endpoint
- `POST /chat` - Chat endpoint for AI conversations
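A Worker implementing these two endpoints might be wired up as follows. This is a sketch, not the project's actual source: it assumes the `ai` and `@ai-sdk/google` packages, and the model name `gemini-1.5-flash` is a placeholder for whichever Gemini model you use.

```typescript
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { streamText } from 'ai';
import { createGoogleGenerativeAI } from '@ai-sdk/google';

type Bindings = {
  GOOGLE_GENERATIVE_AI_API_KEY: string;
};

const app = new Hono<{ Bindings: Bindings }>();

// Allow a React dev server on another origin to call this API.
app.use('*', cors());

// GET / - health check
app.get('/', (c) => c.text('OK'));

// POST /chat - stream a Gemini response in the AI SDK data format
app.post('/chat', async (c) => {
  const { messages } = await c.req.json();

  const google = createGoogleGenerativeAI({
    apiKey: c.env.GOOGLE_GENERATIVE_AI_API_KEY,
  });

  const result = streamText({
    // Placeholder model name; substitute your configured model.
    model: google('gemini-1.5-flash'),
    messages,
  });

  return result.toDataStreamResponse();
});

export default app;
```

The `cors()` middleware matters here because the React example later in this README calls `http://localhost:8787/chat` from a different origin.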
## Usage

Send a `POST` request to `/chat` with the following structure:
```json
{
  "messages": [
    {
      "role": "user",
      "content": "Hello, how are you?"
    }
  ]
}
```

The response is a streaming data format compatible with the Vercel AI SDK.
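The request body above can be described with a small TypeScript type and checked at runtime before it reaches the model. The names `ChatMessage`, `ChatRequest`, and `isChatRequest` below are illustrative, not part of any SDK:

```typescript
// Illustrative types mirroring the /chat request body shown above.
type ChatMessage = {
  role: 'user' | 'assistant' | 'system';
  content: string;
};

type ChatRequest = {
  messages: ChatMessage[];
};

// Runtime guard for the JSON shape. It is deliberately looser than
// the type: any string role passes at runtime.
function isChatRequest(body: unknown): body is ChatRequest {
  if (typeof body !== 'object' || body === null) return false;
  const maybe = body as { messages?: unknown };
  return (
    Array.isArray(maybe.messages) &&
    maybe.messages.every(
      (m) =>
        typeof m === 'object' &&
        m !== null &&
        typeof (m as ChatMessage).role === 'string' &&
        typeof (m as ChatMessage).content === 'string',
    )
  );
}
```

For example, `isChatRequest({ messages: [{ role: 'user', content: 'Hello' }] })` returns `true`, while a body whose `messages` is not an array returns `false`.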
## Setup

1. Install dependencies:

   ```bash
   pnpm install
   ```

2. Set up your environment variables:

   ```bash
   # Add to your Cloudflare Workers environment or .dev.vars file
   GOOGLE_GENERATIVE_AI_API_KEY=your_api_key_here
   ```

3. Run the development server:

   ```bash
   pnpm dev
   ```

## Frontend Integration

This backend works seamlessly with React applications using `@ai-sdk/react`:
```tsx
import { useChat } from '@ai-sdk/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: 'http://localhost:8787/chat',
  });

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```

## Deployment

Deploy to Cloudflare Workers:

```bash
pnpm run deploy
```

Note: `pnpm run deploy` (rather than `pnpm deploy`) is needed because `deploy` is a built-in pnpm command that would otherwise shadow the package script.
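If the project does not already include one, a minimal Wrangler configuration along these lines is typical; every value here is a placeholder to adapt to your project:

```toml
# wrangler.toml - hypothetical minimal configuration
name = "ai-agent-backend"
main = "src/index.ts"
compatibility_date = "2024-01-01"
```

Secrets such as `GOOGLE_GENERATIVE_AI_API_KEY` should not go in this file; set them with `wrangler secret put` for production and `.dev.vars` for local development.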