A flexible, type-safe execution engine for JSON-RPC-based workflows. The engine lets you define complex flows of operations, including requests, transformations, conditions, and loops, with full support for data dependencies, parallel execution, and error handling.
- 🔄 JSON-RPC Request Handling: Execute JSON-RPC 2.0 requests with automatic request ID management and error handling
- 🔀 Flow Control: Support for conditional execution, loops, and parallel processing with proper variable scoping
- 🔄 Data Transformation: Transform data between steps using map, filter, reduce, and other operations
- 📊 Expression Evaluation: Dynamic expression evaluation with support for template literals and object paths
- 🔗 Dependency Resolution: Automatic handling of data dependencies between steps
- 🎯 Type Safety: Written in TypeScript with comprehensive type definitions
- ⚡ Parallel Execution: Automatic parallel execution of independent steps
- 🔍 Error Handling: Detailed error reporting, validation, and graceful error recovery
- 🌍 Context Management: Global context available to all steps with proper scoping
- 📦 Batch Processing: Support for processing data in configurable batch sizes
Process team members with nested operations and dynamic notifications:
const teamFlow: Flow = {
name: 'team-member-processing',
description: 'Process team members and send notifications',
context: {
notificationTypes: {
welcome: 'WELCOME',
update: 'UPDATE',
},
},
steps: [
{
name: 'getTeams',
request: {
method: 'teams.list',
params: { active: true },
},
},
{
name: 'processTeams',
loop: {
over: '${getTeams.result}',
as: 'team',
step: {
name: 'processMembers',
loop: {
over: '${team.members}',
as: 'member',
step: {
name: 'processMember',
condition: {
if: '${member.active}',
then: {
name: 'notifyMember',
request: {
method: 'notify',
params: {
teamId: '${team.id}',
memberId: '${member.id}',
type: '${context.notificationTypes.welcome}',
data: {
teamName: '${team.name}',
memberRole: '${member.role}',
},
},
},
},
},
},
},
},
},
},
],
};
Process data with validation, transformation, and error handling:
const dataPipelineFlow: Flow = {
name: 'data-pipeline',
description: 'Process and transform data with error handling',
context: {
batchSize: 2,
minValue: 10,
retryCount: 3,
},
steps: [
{
name: 'getData',
request: {
method: 'data.fetch',
params: { source: 'test' },
},
},
{
name: 'validateData',
condition: {
if: '${getData.error}',
then: {
name: 'retryData',
loop: {
over: 'Array.from({ length: ${context.retryCount} })',
as: 'attempt',
step: {
name: 'retryFetch',
request: {
method: 'data.fetch',
params: {
source: 'test',
attempt: '${metadata.current.index + 1}',
},
},
},
},
},
else: {
name: 'processData',
transform: {
input: '${getData.result}',
operations: [
{
type: 'filter',
using: '${item.value > context.minValue}',
},
{
type: 'map',
using: '{ ...item, processed: true }',
},
],
},
},
},
},
{
name: 'processBatches',
loop: {
over: '${validateData.result.result}',
as: 'batch',
step: {
name: 'processBatch',
request: {
method: 'batch.process',
params: {
data: '${batch}',
index: '${metadata.current.index}',
},
},
},
},
},
],
};
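To actually run this flow you need a JSON-RPC handler that serves the `data.fetch` and `batch.process` methods it calls. A minimal in-memory stub could look like the following — the response payloads are invented for illustration; a real handler would call your backend:

```typescript
// Hypothetical stub handler for the data-pipeline flow above.
// Payloads are made up; a real handler would dispatch to a backend service.
type RpcRequest = { method: string; params?: Record<string, unknown> };

const stubHandler = async (request: RpcRequest) => {
  switch (request.method) {
    case 'data.fetch':
      // Sample rows; only values above context.minValue (10) survive the filter.
      return { result: [{ value: 5 }, { value: 25 }, { value: 42 }] };
    case 'batch.process':
      return { result: { processed: true } };
    default:
      // Standard JSON-RPC "method not found" error.
      return { error: { code: -32601, message: 'Method not found' } };
  }
};
```

Passing this handler to `new FlowExecutor(dataPipelineFlow, stubHandler)` is enough to exercise the retry, transform, and batch branches end to end.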
Aggregate data from multiple API endpoints with parallel processing:
const apiAggregationFlow: Flow = {
name: 'api-aggregation',
description: 'Aggregate data from multiple APIs',
steps: [
{
name: 'fetchUsers',
request: {
method: 'users.list',
params: { status: 'active' },
},
},
{
name: 'fetchUserDetails',
loop: {
over: '${fetchUsers.result}',
as: 'user',
step: {
name: 'userDetails',
transform: {
input: '${user}',
operations: [
{
type: 'parallel',
operations: [
{
name: 'profile',
request: {
method: 'user.profile',
params: { userId: '${user.id}' },
},
},
{
name: 'activity',
request: {
method: 'user.activity',
params: { userId: '${user.id}' },
},
},
],
},
{
type: 'map',
using: `{
...user,
profile: \${profile.result},
recentActivity: \${activity.result}
}`,
},
],
},
},
},
},
{
name: 'aggregateData',
transform: {
input: '${fetchUserDetails.result.value}',
operations: [
{
type: 'group',
using: 'item.profile.department',
},
{
type: 'map',
using: `{
department: key,
userCount: items.length,
activeUsers: items.filter(u => u.recentActivity.length > 0).length
}`,
},
],
},
},
],
};
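The closing `group` + `map` pair computes per-department rollups: `group` buckets items by the expression's value, then `map` runs once per bucket with `key` bound to the group key and `items` to its members. In plain TypeScript terms (the sample `details` data is invented for illustration):

```typescript
// Plain-TS equivalent of the group → map aggregation above.
// The sample data is hypothetical.
interface UserDetail {
  profile: { department: string };
  recentActivity: unknown[];
}

const details: UserDetail[] = [
  { profile: { department: 'eng' }, recentActivity: [1] },
  { profile: { department: 'eng' }, recentActivity: [] },
  { profile: { department: 'sales' }, recentActivity: [1, 2] },
];

// type: 'group', using: 'item.profile.department'
const groups = new Map<string, UserDetail[]>();
for (const item of details) {
  const key = item.profile.department;
  groups.set(key, [...(groups.get(key) ?? []), item]);
}

// type: 'map' over each group, with `key` and `items` in scope
const summary = [...groups.entries()].map(([key, items]) => ({
  department: key,
  userCount: items.length,
  activeUsers: items.filter((u) => u.recentActivity.length > 0).length,
}));
```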
npm install @open-rpc/flow
Here's a simple example of defining and executing a flow:
import { FlowExecutor, Flow } from '@open-rpc/flow';
// Define your JSON-RPC handler
const jsonRpcHandler = async (request) => {
// Implement your JSON-RPC handling logic
return { result: 'Success' };
};
// Define a flow with data processing and error handling
const flow: Flow = {
name: 'Data Processing Flow',
description: 'Process and transform data with error handling',
context: {
minValue: 10,
},
steps: [
{
name: 'getData',
request: {
method: 'data.fetch',
params: { source: 'api' },
},
},
{
name: 'validateData',
condition: {
if: '${getData.result.length > 0}',
then: {
name: 'processData',
transform: {
input: '${getData.result}',
operations: [
{
type: 'filter',
using: '${item.value > context.minValue}',
},
{
type: 'map',
using: '{ ...item, processed: true }',
},
],
},
},
else: {
name: 'handleError',
request: {
method: 'error.log',
params: { message: 'No data found' },
},
},
},
},
],
};
// Execute the flow
const executor = new FlowExecutor(flow, jsonRpcHandler);
const results = await executor.execute();
A flow consists of a series of steps that can include:
Execute JSON-RPC requests with error handling:
{
name: 'getUser',
request: {
method: 'user.get',
params: { id: 1 }
}
}
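A request step is ultimately handed to your JSON-RPC handler as a standard JSON-RPC 2.0 envelope. As a rough sketch of what that looks like for the step above — the envelope shape follows the JSON-RPC 2.0 spec, but the `nextId` counter here is an illustrative assumption, not the engine's actual implementation:

```typescript
// Sketch: turning a request step into a JSON-RPC 2.0 request object.
// `nextId` is a hypothetical detail; the engine manages request IDs internally.
interface JsonRpcRequest {
  jsonrpc: '2.0';
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

let nextId = 0;

function toJsonRpcRequest(step: {
  name: string;
  request: { method: string; params?: Record<string, unknown> };
}): JsonRpcRequest {
  return {
    jsonrpc: '2.0',
    id: ++nextId, // automatic request ID management
    method: step.request.method,
    params: step.request.params,
  };
}

const req = toJsonRpcRequest({
  name: 'getUser',
  request: { method: 'user.get', params: { id: 1 } },
});
```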
Transform data using operations like map, filter, reduce:
{
name: 'processUsers',
transform: {
input: '${getUser.result}',
operations: [
{
type: 'filter',
using: '${item.active === true}',
},
{
type: 'map',
using: '{ id: item.id, name: item.name }',
},
{
type: 'reduce',
using: '[...acc, item.id]',
initial: [],
}
]
}
}
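In plain JavaScript terms, the three operations above behave like chained `Array.prototype` calls, with `item` bound to each element and `acc` to the reduce accumulator. A minimal sketch of that equivalence — the `users` sample data is invented for illustration:

```typescript
// Plain-TS equivalent of the filter → map → reduce pipeline above.
// The `users` sample data is hypothetical.
const users = [
  { id: 1, name: 'Ada', active: true },
  { id: 2, name: 'Bob', active: false },
  { id: 3, name: 'Cy', active: true },
];

const ids = users
  .filter((item) => item.active === true)             // type: 'filter'
  .map((item) => ({ id: item.id, name: item.name }))  // type: 'map'
  .reduce<number[]>((acc, item) => [...acc, item.id], []); // type: 'reduce', initial: []
```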
Execute steps based on conditions with error handling:
{
name: 'validateUser',
condition: {
if: '${getUser.error}',
then: {
name: 'handleError',
request: {
method: 'error.log',
params: { message: '${getUser.error.message}' }
}
},
else: {
name: 'processUser',
transform: {
input: '${getUser.result}',
operations: [
{
type: 'map',
using: '{ ...item, validated: true }'
}
]
}
}
}
}
Iterate over collections with batch processing:
{
name: 'processItems',
loop: {
over: '${getItems.result}',
as: 'item',
maxIterations: 100,
step: {
name: 'processItem',
request: {
method: 'item.process',
params: {
id: '${item.id}',
batchIndex: '${metadata.current.index}'
}
}
}
}
}
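Conceptually, a loop step is an indexed iteration over the `over` collection, capped at `maxIterations`, with the current position exposed via `${metadata.current.index}`. A simplified model of that behavior — the `step` callback here stands in for the nested step, and the function is an illustration rather than the engine's real loop executor:

```typescript
// Simplified model of loop execution: iterate `over`, bind each element,
// stop at `maxIterations`, and expose the index as loop metadata.
async function runLoop<T, R>(
  items: T[],
  maxIterations: number,
  step: (item: T, metadata: { current: { index: number } }) => Promise<R>,
): Promise<R[]> {
  const results: R[] = [];
  for (let index = 0; index < items.length && index < maxIterations; index++) {
    results.push(await step(items[index], { current: { index } }));
  }
  return results;
}
```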
The engine supports dynamic expressions using the `${...}` syntax:
- Simple references: `${stepName}`
- Property access: `${stepName.property}`
- Array access: `${stepName[0]}`
- Nested properties: `${stepName.nested.property}`
- Template literals: `` `Value: ${stepName.value}` ``
- Comparisons: `${value > 10}`
- Object literals: `{ id: ${item.id}, name: ${item.name} }`
- Error handling: `${stepName.error.message}`
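To make the path-style references concrete, here is a minimal sketch of resolving a dotted/indexed path against step results. This is an illustrative assumption, not the engine's actual evaluator, which also handles comparisons, template literals, and object literals:

```typescript
// Illustrative resolver for ${stepName.nested.property} / ${stepName[0]}
// style references. Not the engine's real expression evaluator.
function resolvePath(context: Record<string, unknown>, expr: string): unknown {
  // Strip the ${...} wrapper, then split on dots and [n] indices.
  const path = expr.replace(/^\$\{|\}$/g, '');
  const parts = path.split(/\.|\[|\]/).filter((p) => p.length > 0);
  return parts.reduce<unknown>((value, key) => {
    if (value === null || typeof value !== 'object') return undefined;
    return (value as Record<string, unknown>)[key];
  }, context);
}

// Hypothetical step-results context for demonstration.
const ctx = { getUser: { result: { name: 'Ada', tags: ['admin'] } } };
```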
The engine provides detailed error information and recovery options:
try {
await executor.execute();
} catch (error) {
if (error instanceof JsonRpcRequestError) {
// Handle JSON-RPC specific errors
console.error('RPC Error:', error.error);
} else {
// Handle other execution errors
console.error('Execution Error:', error.message);
}
}
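For flows that should survive transient failures, you can wrap execution in a small retry helper. This is a hedged sketch: the attempt count and policy are arbitrary example choices, not part of the library:

```typescript
// Illustrative retry wrapper around any async operation,
// e.g. () => executor.execute(). Attempt count is an example value.
async function executeWithRetry<T>(
  run: () => Promise<T>,
  attempts = 3,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await run();
    } catch (error) {
      lastError = error; // remember the failure and try again
    }
  }
  throw lastError; // all attempts exhausted
}
```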
The engine is written in TypeScript and provides comprehensive type definitions:
interface Flow {
name: string;
description?: string;
context?: Record<string, any>;
steps: Step[];
}
type Step = RequestStep | TransformStep | ConditionStep | LoopStep;
interface RequestStep {
name: string;
request: {
method: string;
params?: Record<string, any>;
};
}
// More type definitions available in the source
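The remaining step shapes can be inferred from the examples in this README. The following is a rough sketch, not the library's authoritative definitions — consult the package's own type declarations for those:

```typescript
// Approximate shapes inferred from the examples above; the package's
// .d.ts files are authoritative. RequestStep is repeated for completeness.
interface RequestStep {
  name: string;
  request: { method: string; params?: Record<string, unknown> };
}

interface TransformStep {
  name: string;
  transform: {
    input: string;
    operations: Array<{
      type: 'map' | 'filter' | 'reduce' | 'group' | 'parallel';
      using?: string;
      initial?: unknown;
    }>;
  };
}

interface ConditionStep {
  name: string;
  condition: { if: string; then: Step; else?: Step };
}

interface LoopStep {
  name: string;
  loop: { over: string; as: string; maxIterations?: number; step: Step };
}

type Step = RequestStep | TransformStep | ConditionStep | LoopStep;
```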
Contributions are welcome! Please read our Contributing Guide for details on our code of conduct and the process for submitting pull requests.
This project is licensed under the MIT License - see the LICENSE file for details.