V5 #1 (Merged)
25 changes: 25 additions & 0 deletions AGENTS.md
@@ -0,0 +1,25 @@
# Repository Guidelines

## Project Structure & Module Organization

Core provider code lives in `src/`, with `src/models` covering chat and embedding adapters and `src/utils` housing shared helpers. Jest suites reside in `tests/`, including `tests/setup.ts` for shared configuration. Example agents and integration playgrounds are under `examples/`, while generated TypeDoc content is stored in `docs/`. Build artifacts land in `dist/`; regenerate them via the build pipeline instead of editing them directly. Supporting scripts (such as CommonJS fixes) are in `scripts/`.
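
The layout above, roughly sketched (paths are the ones named in this section):

```
src/
  models/        # chat and embedding adapters
  utils/         # shared helpers
tests/
  setup.ts       # shared Jest configuration
examples/        # example agents and integration playgrounds
docs/            # generated TypeDoc output
dist/            # build artifacts (regenerate via the build pipeline)
scripts/         # supporting scripts, e.g. CommonJS fixes
```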

## Build, Test, and Development Commands

Install dependencies with `pnpm install`. Use `pnpm build` to clean and emit both ESM and CJS bundles, and `pnpm docs` to refresh the API reference. Run `pnpm test` for the default Jest suite, `pnpm test:watch` during active development, and `pnpm test:coverage` before releases. `pnpm lint` checks the codebase with ESLint, while `pnpm format` applies Prettier. Try `pnpm example:tool-loop` to exercise the agent tool-call loop locally.

## Coding Style & Naming Conventions

This project targets modern TypeScript and enforces 2-space indentation via Prettier and ESLint (`eslint.config.mjs`). Prefer named exports from modules and keep filenames lowercase with hyphens or descriptive nouns (e.g., `chat.ts`). Use `camelCase` for functions and variables, `PascalCase` for types and classes, and reserve UPPER_SNAKE_CASE for environment constants. Avoid ambient `any`; annotate public APIs explicitly so generated declarations stay accurate.
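
A small illustrative sketch of these conventions (the module and names below are hypothetical, not taken from the codebase):

```typescript
// src/models/chat.ts: lowercase, descriptive filename (hypothetical example)
// UPPER_SNAKE_CASE is reserved for environment constants
const INFERENCE_URL = process.env.INFERENCE_URL ?? "https://us.inference.heroku.com";

// PascalCase for types; public APIs get explicit annotations
export interface ChatRequestOptions {
  model: string;
  maxTokens?: number;
}

// camelCase named export with an explicit return type
export function buildChatRequestBody(options: ChatRequestOptions): string {
  return JSON.stringify({ url: INFERENCE_URL, ...options });
}
```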

## Testing Guidelines

All automated tests run through Jest with `ts-jest`. Place new specs alongside peers in `tests/**` using the `*.test.ts` suffix and mirror the source folder structure for clarity. Reuse helpers from `tests/setup.ts` when configuring shared mocks. Maintain or improve coverage when touching public features—verify locally with `pnpm test:coverage` and include regression cases for bug fixes.
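
As a sketch, a new spec might look like the following (the tested module and its export are hypothetical; only the location and naming follow the rules above):

```typescript
// tests/models/chat.test.ts: mirrors src/models/chat.ts and uses the *.test.ts suffix
import { buildChatRequestBody } from "../../src/models/chat"; // hypothetical export

describe("buildChatRequestBody", () => {
  it("serializes the requested model", () => {
    const body = buildChatRequestBody({ model: "claude-4-sonnet" });
    expect(JSON.parse(body).model).toBe("claude-4-sonnet");
  });
});
```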

## Commit & Pull Request Guidelines

Follow Conventional Commit prefixes (`feat:`, `fix:`, `build:`) as seen in the existing history to keep automated checks healthy. Before opening a PR, run `pnpm lint` and `pnpm test` so Husky hooks pass cleanly. PRs should summarize the change, reference related issues, and attach logs or screenshots for behavioral updates. Call out breaking changes or configuration impacts explicitly to aid downstream consumers.

## Security & Configuration Tips

Never commit secrets; load keys such as `INFERENCE_KEY` and `EMBEDDING_KEY` via `.env` or your shell and document placeholders in examples. When sharing reproduction steps, redact tokens and avoid echoing full request bodies that include credentials. For manual testing, prefer scoped API keys and revoke them once the investigation wraps.
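
For example, a script or test can read the keys defensively instead of hard-coding them (a minimal sketch, assuming the `INFERENCE_KEY` and `EMBEDDING_KEY` variables described above):

```typescript
// Load keys from the environment (for local work, a .env file kept out of version control)
const inferenceKey = process.env.INFERENCE_KEY;
const embeddingKey = process.env.EMBEDDING_KEY;

if (!inferenceKey || !embeddingKey) {
  // Fail fast with a message that never echoes the key values themselves
  throw new Error("Set INFERENCE_KEY and EMBEDDING_KEY before running this example.");
}
```
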
126 changes: 57 additions & 69 deletions README.md
@@ -41,26 +41,31 @@ Set your Heroku AI API keys as environment variables:

```bash
# For chat completions
HEROKU_INFERENCE_KEY=your_inference_api_key
INFERENCE_KEY=your_inference_api_key

# For embeddings
HEROKU_EMBEDDING_KEY=your_embedding_api_key
EMBEDDING_KEY=your_embedding_api_key

# Optional: Custom API endpoints
HEROKU_INFERENCE_URL=https://us.inference.heroku.com
HEROKU_EMBEDDING_URL=https://us.inference.heroku.com
INFERENCE_URL=https://us.inference.heroku.com
EMBEDDING_URL=https://us.inference.heroku.com

```

### Basic Configuration

```typescript
import { createHerokuProvider } from "heroku-ai-provider";
import { heroku } from "heroku-ai-provider";

const model = heroku.chat("claude-4-sonnet");
```

#### Custom Configuration

// Using environment variables (recommended)
const heroku = createHerokuProvider();
```typescript
import { createHerokuAI } from "heroku-ai-provider";

// Or with explicit configuration
const heroku = createHerokuProvider({
const client = createHerokuAI({
chatApiKey: "your_inference_api_key",
embeddingsApiKey: "your_embedding_api_key",
chatBaseUrl: "https://us.inference.heroku.com/v1/chat/completions",
@@ -76,12 +81,10 @@

```typescript
import { generateText } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { heroku } from "heroku-ai-provider";

const { text } = await generateText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
prompt: "What is the capital of France?",
});

@@ -91,10 +94,8 @@ console.log(text); // "The capital of France is Paris."
#### Streaming Chat

```typescript
import { streamText } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { streamText, stepCountIs } from "ai";
import { heroku } from "heroku-ai-provider";

const { textStream } = await streamText({
model: heroku.chat("claude-3-haiku"),
@@ -110,12 +111,10 @@

```typescript
import { generateText } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { heroku } from "heroku-ai-provider";

const { text } = await generateText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
system: "You are a helpful assistant that explains complex topics simply.",
prompt: "Explain quantum computing",
});
@@ -124,14 +123,12 @@ const { text } = await generateText({
### Tool/Function Calling

```typescript
import { generateText, tool } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";
import { generateText, tool, stepCountIs } from "ai";
import { heroku } from "heroku-ai-provider";
import { z } from "zod";

const heroku = createHerokuProvider();

const { text } = await generateText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
prompt: "What is the weather like in New York?",
tools: {
getWeather: tool({
@@ -149,21 +146,19 @@ const { text } = await generateText({
},
}),
},
maxSteps: 5, // Allow multi-step tool conversations
stopWhen: stepCountIs(5),
});
```

#### Advanced Tool Usage with Multiple Steps

```typescript
import { generateText, tool } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";
import { generateText, tool, stepCountIs } from "ai";
import { heroku } from "heroku-ai-provider";
import { z } from "zod";

const heroku = createHerokuProvider();

const { text, steps } = await generateText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
prompt:
"Check the weather in New York and then suggest appropriate clothing.",
tools: {
@@ -183,7 +178,7 @@ const { text, steps } = await generateText({
}),
suggestClothing: tool({
description: "Suggest appropriate clothing based on weather conditions",
parameters: z.object({
inputSchema: z.object({
temperature: z.number().describe("Temperature in Fahrenheit"),
condition: z.string().describe("Weather condition"),
humidity: z.number().optional().describe("Humidity percentage"),
@@ -201,7 +196,7 @@ const { text, steps } = await generateText({
},
}),
},
maxSteps: 5,
stopWhen: stepCountIs(5),
});

console.log("Final response:", text);
@@ -214,9 +209,7 @@ console.log("Tool execution steps:", steps.length);

```typescript
import { embed } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { heroku } from "heroku-ai-provider";

const { embedding } = await embed({
model: heroku.embedding("cohere-embed-multilingual"),
@@ -230,9 +223,7 @@ console.log(embedding); // [0.1, 0.2, -0.3, ...]

```typescript
import { embedMany } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { heroku } from "heroku-ai-provider";

const { embeddings } = await embedMany({
model: heroku.embedding("cohere-embed-multilingual"),
@@ -249,7 +240,7 @@ import { createEmbedFunction } from "heroku-ai-provider";

// Create a reusable embed function
const embedText = createEmbedFunction({
apiKey: process.env.HEROKU_EMBEDDING_KEY!,
apiKey: process.env.EMBEDDING_KEY!,
model: "cohere-embed-multilingual",
});

@@ -264,24 +255,27 @@ console.log(embedding); // [0.1, 0.2, -0.3, ...]
```typescript
interface HerokuProviderSettings {
// API keys (falls back to environment variables)
chatApiKey?: string; // HEROKU_INFERENCE_KEY
embeddingsApiKey?: string; // HEROKU_EMBEDDING_KEY
chatApiKey?: string; // INFERENCE_KEY
embeddingsApiKey?: string; // EMBEDDING_KEY

// Base URLs (falls back to environment variables or defaults)
chatBaseUrl?: string; // HEROKU_INFERENCE_URL
embeddingsBaseUrl?: string; // HEROKU_EMBEDDING_URL
chatBaseUrl?: string; // INFERENCE_URL
embeddingsBaseUrl?: string; // EMBEDDING_URL
}
```

### Supported Models

#### Chat Models

- `claude-3-5-sonnet-latest` - Latest Claude 3.5 Sonnet (recommended)
- `claude-3-haiku` - Fast and efficient Claude 3 Haiku
- `claude-4-sonnet` - Claude 4 Sonnet (when available)
- `claude-3-7-sonnet` - Claude 3.7 Sonnet
- `claude-3-5-haiku` - Claude 3.5 Haiku
- `claude-4-sonnet` - Latest Claude 4 Sonnet by Anthropic
- `claude-3-haiku` - Claude 3 Haiku by Anthropic
- `claude-3-7-sonnet` - Claude 3.7 Sonnet by Anthropic
- `claude-3-5-haiku` - Claude 3.5 Haiku by Anthropic
- `claude-3-5-sonnet-latest` - Claude 3.5 Sonnet by Anthropic
- `gpt-oss-120b` - gpt-oss-120b by OpenAI
- `nova-lite` - Nova Lite by Amazon
- `nova-pro` - Nova Pro by Amazon

#### Embedding Models

@@ -293,18 +287,16 @@ interface HerokuProviderSettings {

```typescript
// app/api/chat/route.ts
import { streamText } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";

const heroku = createHerokuProvider();
import { streamText, stepCountIs } from "ai";
import { heroku } from "heroku-ai-provider";

export async function POST(req: Request) {
const { messages } = await req.json();

const result = await streamText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
messages,
maxSteps: 5, // Enable multi-step tool conversations
stopWhen: stepCountIs(5), // Enable multi-step tool conversations
});

return result.toDataStreamResponse();
@@ -316,21 +308,19 @@ export async function POST(req: Request) {
```typescript
// app/api/chat/route.ts
import { streamText, tool } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";
import { heroku } from "heroku-ai-provider";
import { z } from "zod";

const heroku = createHerokuProvider();

export async function POST(req: Request) {
const { messages } = await req.json();

const result = await streamText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
messages,
tools: {
getTime: tool({
description: "Get the current time",
parameters: z.object({
inputSchema: z.object({
timezone: z
.string()
.optional()
Expand All @@ -344,7 +334,7 @@ export async function POST(req: Request) {
},
}),
},
maxSteps: 5,
stopWhen: stepCountIs(5),
});

return result.toDataStreamResponse();
Expand All @@ -356,10 +346,9 @@ export async function POST(req: Request) {
```typescript
import express from "express";
import { generateText } from "ai";
import { createHerokuProvider } from "heroku-ai-provider";
import { heroku } from "heroku-ai-provider";

const app = express();
const heroku = createHerokuProvider();

app.post("/chat", async (req, res) => {
const { prompt } = req.body;
Expand All @@ -379,15 +368,14 @@ The provider includes comprehensive error handling with user-friendly error mess

```typescript
import {
createHerokuProvider,
createHerokuAI,
isConfigurationError,
isTemporaryServiceError,
} from "heroku-ai-provider";

try {
const heroku = createHerokuProvider();
const result = await generateText({
model: heroku.chat("claude-3-5-sonnet-latest"),
model: heroku.chat("claude-4-sonnet"),
prompt: "Hello!",
});
} catch (error) {
Expand All @@ -410,7 +398,7 @@ try {
#### Authentication Errors

- **Issue**: "Chat API key is required" or "Embeddings API key is required"
- **Solution**: Ensure your API keys are set in environment variables or passed directly to `createHerokuProvider()`
- **Solution**: Ensure your API keys are set in environment variables or passed directly to `createHerokuAI()`

#### Model Not Found

Expand All @@ -430,7 +418,7 @@ try {
#### Tool Execution Issues

- **Issue**: Tools are called but AI doesn't provide final response
- **Solution**: Ensure you're using `maxSteps: 5` or higher to allow multi-step tool conversations
- **Solution**: Ensure you configure `stopWhen` (for example, `stopWhen: stepCountIs(5)`) so the model can complete multi-step tool conversations

#### Schema Validation Errors
