This is an Express-based Node.js service for retrieving REI Network's block, transaction, and unique address (DAU) statistics from the past 24 hours.
- 📊 Get the number of blocks produced in the past 24 hours
- 💰 Count transactions in the past 24 hours
- 👥 Calculate unique address count (DAU - Daily Active Users)
- 🚀 High-performance batch block data retrieval with caching
- ⏰ Automated scheduled processing every 30 minutes via worker process
- 💾 Persistent cache with incremental updates
- 📚 Historical statistics storage and retrieval
- 🔄 Worker process architecture for non-blocking operations
- 💚 Health check endpoint with processing status
Install dependencies:

```bash
npm install
```

Copy the environment configuration file:

```bash
cp env.example .env
```

Edit the `.env` file to configure the REI Network RPC endpoint:

```
REI_RPC_URL=https://rpc.rei.network
PORT=3000
```

Production environment:

```bash
npm start
```

Development environment (auto-restart):

```bash
npm run dev
```

You can also run the worker process independently for testing:

```bash
# Run worker once
npm run worker

# Run worker in development mode (auto-restart)
npm run worker:dev

# Test worker process functionality
npm run test:worker
```

GET /api/stats/24h
Returns cached statistics for the past 24 hours (updated every 30 minutes).
Response Example:
```json
{
  "success": true,
  "data": {
    "totalBlocks": 28800,
    "totalTransactions": 150000,
    "uniqueAddresses": ["0x123...", "0x456..."],
    "uniqueAddressCount": 5000,
    "blockRange": {
      "start": 1000000,
      "end": 1028800,
      "startTimestamp": "2024-01-14T10:30:00.000Z",
      "endTimestamp": "2024-01-15T10:30:00.000Z"
    },
    "recordCount": 24,
    "timestamp": "2024-01-15T10:30:00.000Z"
  },
  "message": "Successfully retrieved 24-hour statistics"
}
```

Field Descriptions:

- `totalBlocks`: Total number of blocks from the last 24 records (24 hours)
- `totalTransactions`: Total number of transactions from the last 24 records
- `uniqueAddresses`: Array of unique addresses (for debugging)
- `uniqueAddressCount`: Number of unique addresses (DAU)
- `blockRange`: Block range of the 24 records, with timestamps
  - `start`: Starting block number
  - `end`: Ending block number
  - `startTimestamp`: Timestamp of the first block
  - `endTimestamp`: Timestamp of the last block
- `recordCount`: Number of records merged (should be 24 for 24 hours)
- `timestamp`: Data retrieval timestamp
GET /api/stats/records
Returns the last 24 records (24 hours worth of data).
Response Example:
```json
{
  "success": true,
  "data": {
    "records": [
      {
        "blockRange": {
          "start": 1000000,
          "end": 1000100,
          "startTimestamp": "2024-01-15T10:00:00.000Z",
          "endTimestamp": "2024-01-15T10:05:00.000Z"
        },
        "stats": {
          "totalBlocks": 100,
          "totalTransactions": 500,
          "uniqueAddresses": ["0x123...", "0x456..."],
          "uniqueAddressCount": 20
        },
        "timestamp": "2024-01-15T10:00:00.000Z"
      }
    ],
    "count": 24,
    "totalRecords": 168
  },
  "message": "Successfully retrieved recent records"
}
```

GET /api/stats/history
Returns all historical statistics (limited to 7 days retention).
Response Example:
```json
{
  "success": true,
  "data": [
    {
      "blockRange": {
        "start": 1000000,
        "end": 1000100,
        "startTimestamp": "2024-01-15T10:00:00.000Z",
        "endTimestamp": "2024-01-15T10:05:00.000Z"
      },
      "stats": {
        "totalBlocks": 100,
        "totalTransactions": 500,
        "uniqueAddresses": ["0x123...", "0x456..."],
        "uniqueAddressCount": 20
      },
      "timestamp": "2024-01-15T10:00:00.000Z"
    }
  ],
  "message": "Successfully retrieved historical statistics"
}
```

GET /health
Check service status.
Response Example:
```json
{
  "status": "healthy",
  "timestamp": "2024-01-15T10:30:00.000Z",
  "service": "REI Network DAU Stats",
  "lastProcessedBlock": 1028800,
  "lastProcessedBlockTimestamp": "2024-01-15T10:30:00.000Z",
  "isProcessing": false
}
```

- Worker Process Architecture: Scheduled processing runs in a separate worker process
- Non-blocking Operations: Main Express server remains responsive during data processing
- Incremental Updates: Only process new blocks since last run
- Record-based Storage: Generate one record every 1200 blocks (1 hour)
- 24-hour Aggregation: Merge last 24 records for 24-hour statistics
- Data Retention: Automatically clean up data older than 7 days
- Parallel Retrieval: Use `Promise.all` to retrieve block data in parallel
- Address Deduplication: Use a `Set` to ensure address uniqueness (see the sketch after this list)
- Persistent Cache: File-based caching for fast API responses
- Worker Process: Data processing runs in separate process, not blocking main server
- Cached Responses: API returns instantly from cache
- Incremental Processing: Only process new blocks, not entire history
- Batch Processing: Process blocks in configurable batches
- Parallel Requests: Concurrent block data retrieval
- Error Handling: Comprehensive error handling and retry mechanisms
- Memory Optimization: Efficient address deduplication and storage
- Process Isolation: Worker crashes don't affect main server
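The parallel retrieval and Set-based deduplication described above can be sketched roughly as follows, using raw `eth_getBlockByNumber` JSON-RPC calls over axios. The function names, and the choice to count both `from` and `to` addresses, are illustrative assumptions rather than the service's actual internals:

```javascript
const axios = require('axios');

const RPC_URL = process.env.REI_RPC_URL || 'https://rpc.rei.network';

// Fetch one block (with full transaction objects) via JSON-RPC.
async function getBlock(blockNumber) {
  const response = await axios.post(RPC_URL, {
    jsonrpc: '2.0',
    id: blockNumber,
    method: 'eth_getBlockByNumber',
    params: ['0x' + blockNumber.toString(16), true],
  });
  return response.data.result;
}

// Retrieve a block range in parallel and aggregate basic statistics.
async function collectStats(startBlock, endBlock) {
  const numbers = [];
  for (let n = startBlock; n <= endBlock; n++) numbers.push(n);

  // Parallel retrieval with Promise.all.
  const blocks = await Promise.all(numbers.map(getBlock));

  const uniqueAddresses = new Set();
  let totalTransactions = 0;

  for (const block of blocks) {
    totalTransactions += block.transactions.length;
    for (const tx of block.transactions) {
      // A Set guarantees each address is counted once (DAU).
      if (tx.from) uniqueAddresses.add(tx.from.toLowerCase());
      if (tx.to) uniqueAddresses.add(tx.to.toLowerCase());
    }
  }

  return {
    totalBlocks: blocks.length,
    totalTransactions,
    uniqueAddressCount: uniqueAddresses.size,
  };
}
```

In practice the range would be split into the configurable batches mentioned above rather than issued as one large `Promise.all`.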
| Variable Name | Default Value | Description |
|---|---|---|
| `REI_RPC_URL` | `https://rpc.rei.network` | REI Network RPC endpoint |
| `PORT` | `3000` | Server port |

If you need to use a different REI Network RPC endpoint, modify the `REI_RPC_URL` variable in the `.env` file.
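For illustration, the configuration could be loaded along these lines, assuming the service reads `.env` with the `dotenv` package (the variable names and defaults come from the table above):

```javascript
// config.js (illustrative sketch): load .env and fall back to the documented defaults.
require('dotenv').config();

module.exports = {
  rpcUrl: process.env.REI_RPC_URL || 'https://rpc.rei.network',
  port: Number(process.env.PORT) || 3000,
};
```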
```bash
# Get 24-hour statistics
curl http://localhost:3000/api/stats/24h

# Health check
curl http://localhost:3000/health
```

```javascript
const axios = require('axios');

async function getStats() {
  try {
    const response = await axios.get('http://localhost:3000/api/stats/24h');
    console.log('24-hour statistics:', response.data);
  } catch (error) {
    console.error('Failed to retrieve data:', error.message);
  }
}

getStats();
```

- Network Connection: Ensure the server can access the REI Network RPC endpoint
- Data Accuracy: Statistics are based on block timestamps and may have slight time deviations
- Cache Directory: The service creates a `cache/` directory for storing statistics data
- Worker Process: Statistics are updated automatically every 30 minutes via the worker process
- Non-blocking: Main server remains responsive during data processing
- Record-based Storage: One record is generated every 1200 blocks (1 hour) for efficient storage
- 24-hour Aggregation: 24-hour statistics are calculated by merging the last 24 records (see the sketch after this list)
- Data Retention: Historical data older than 7 days is automatically cleaned up
- Incremental Updates: Only new blocks are processed, making the service very efficient
- Error Handling: The service includes comprehensive error handling and returns detailed error messages
- Data Persistence: Statistics are cached to disk and survive server restarts
- Process Isolation: Worker process crashes don't affect the main server
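As a rough illustration of the aggregation step, the last 24 hourly records can be merged by summing their counters and taking the union of their address sets. The record shape follows the `/api/stats/records` response above; the function name is an assumption:

```javascript
// Merge the most recent 24 hourly records into a single 24-hour statistic.
function mergeLast24(records) {
  const last24 = records.slice(-24);

  const uniqueAddresses = new Set();
  let totalBlocks = 0;
  let totalTransactions = 0;

  for (const record of last24) {
    totalBlocks += record.stats.totalBlocks;
    totalTransactions += record.stats.totalTransactions;
    // Union of per-record address lists gives the 24-hour DAU.
    for (const address of record.stats.uniqueAddresses) {
      uniqueAddresses.add(address);
    }
  }

  return {
    totalBlocks,
    totalTransactions,
    uniqueAddressCount: uniqueAddresses.size,
    recordCount: last24.length,
  };
}
```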
- Connection Timeout: Check network connectivity and RPC endpoint availability
- Inaccurate Data: Verify the data format returned by the RPC endpoint
- Insufficient Memory: For large datasets, consider increasing server memory or optimizing batch size
The service outputs detailed log information during operation, including:
- Block processing progress
- Cache operations
- Scheduled task execution
- Error messages
- Performance statistics
The service automatically manages cache files:
- `cache/statistics.json`: Current 24-hour statistics (see the read sketch below)
- `cache/history.json`: Historical processing data
- Cache files are created automatically on first run
- Cache is updated incrementally with each scheduled task
- No manual cache management required
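If you want to inspect the cache directly (useful for debugging, not needed in normal operation), a minimal read sketch could look like this; the exact layout of `cache/statistics.json` is assumed to mirror the API responses above:

```javascript
const fs = require('fs');
const path = require('path');

// Read the cached 24-hour statistics from disk, or return null if not yet created.
function loadCachedStats(cacheDir = 'cache') {
  const file = path.join(cacheDir, 'statistics.json');
  if (!fs.existsSync(file)) return null;
  return JSON.parse(fs.readFileSync(file, 'utf8'));
}

console.log(loadCachedStats());
```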
The service uses a worker process architecture for optimal performance.

Main process:
- Runs the Express API server
- Handles HTTP requests and responses
- Manages the worker process lifecycle
- Loads and serves cached statistics

Worker process:
- Runs scheduled data processing tasks
- Handles heavy computational work (block processing)
- Communicates with the main process via IPC
- Can be run independently for testing

Benefits:
- Non-blocking: Main server remains responsive during data processing
- Isolation: Worker crashes don't affect the main server
- Scalability: More worker processes can be added if needed
- Testing: The worker can be tested independently
IPC communication:
- The main process sends `run_task` messages to the worker
- The worker sends `task_completed` or `task_error` messages back
- All communication is asynchronous and non-blocking (see the sketch below)
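For illustration only, here is a minimal sketch of this message flow from the main process side, using Node's `child_process.fork`; the worker entry file name and message payload shapes are assumptions:

```javascript
const { fork } = require('child_process');

// Spawn the worker (entry file name assumed; adjust to the actual worker script).
const worker = fork('./worker.js');

// Ask the worker to run a scheduled processing task.
worker.send({ type: 'run_task' });

// Results come back asynchronously over the IPC channel.
worker.on('message', (message) => {
  if (message.type === 'task_completed') {
    console.log('Worker finished processing:', message.result);
  } else if (message.type === 'task_error') {
    console.error('Worker reported an error:', message.error);
  }
});

// Process isolation: a worker crash surfaces here instead of taking down the API server.
worker.on('exit', (code) => {
  console.log(`Worker exited with code ${code}`);
});
```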
MIT License