CodeCrusher: Remote Code Execution Engine

CodeCrusher is a distributed system that lets users submit code, executes it safely in isolated Docker containers, and serves the execution results asynchronously.

  • API + worker architecture
  • Queue-based background processing
  • Containerized code execution
  • Service orchestration with Docker Compose

Tech Stack

  • Node.js (API server + worker)
  • Express.js (HTTP API)
  • Redis + BullMQ (job queue)
  • MongoDB + Mongoose (result storage)
  • Docker + Docker Compose (runtime and orchestration)

Architecture Overview


Request lifecycle (high-level)

  1. The client sends code to the API.
  2. The API creates a job in Redis/BullMQ and stores submission metadata.
  3. A worker consumes the job and executes the code in a short-lived Docker container.
  4. The worker stores the execution status/response in MongoDB.
  5. The client polls the result endpoint using the jobId.
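Step 3 is the heart of the system: the worker turns a job into a short-lived container run. The sketch below illustrates one plausible way to build that invocation; the image names, flags, and function names are assumptions for illustration, not the repository's actual code.

```javascript
// Hypothetical mapping from job language to a containerized runtime.
// These images and flags are assumptions, not the repo's actual config.
const RUNTIMES = {
  python: { image: "python:3.12-alpine", cmd: (code) => ["python", "-c", code] },
  javascript: { image: "node:20-alpine", cmd: (code) => ["node", "-e", code] },
};

function dockerArgs({ code, language }) {
  const runtime = RUNTIMES[language];
  if (!runtime) throw new Error(`unsupported language: ${language}`);
  // --rm keeps the container short-lived; --network none and a memory cap
  // are typical isolation flags when running untrusted code.
  return [
    "run", "--rm", "--network", "none", "--memory", "128m",
    runtime.image, ...runtime.cmd(code),
  ];
}

console.log(dockerArgs({ code: "print(1)", language: "python" }).join(" "));
```

A worker would pass these arguments to `docker` via something like `child_process.spawn`, capture stdout/stderr, and write the outcome to MongoDB.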

Project Structure

  • server/ → HTTP API, queue producer, DB models, routes/controllers
  • worker/ → queue consumer and code execution logic
  • packages/ → shared internal packages (Logger, ErrorHandler)

Getting Started

Prerequisites

  • Docker and Docker Compose installed
  • Git installed

1) Clone the repository

git clone https://github.com/ShashaankS/Codecrusher.git
cd Codecrusher

2) Configure environment variables

cp .env.example .env

The default values in .env.example are sufficient for local Docker usage.

3) Start all services

Use either command below:

docker compose --env-file .env up -d --build

or

make start

4) Verify services

make status
make logs

Main containers:

  • code_crusher_server → API server (default port 3000)
  • code_crusher_worker → background code execution worker
  • code_crusher_redis → queue broker
  • code_crusher_mongo → persistence layer

API Usage

Base URL (local):

http://localhost:3000/api/v1

1) Submit code

Endpoint: POST /submit

Supported languages: python, javascript

Request body:

{
    "code": "print(1)",
    "language": "python"
}

Example cURL:

curl -X POST http://localhost:3000/api/v1/submit \
    -H "Content-Type: application/json" \
    -d '{"code":"print(1)","language":"python"}'

Success response:

{
    "success": true,
    "jobId": "123456"
}

2) Check execution result

Endpoint: GET /result?jobId=<jobId>

Example cURL:

curl "http://localhost:3000/api/v1/result?jobId=123456"

If the job is still processing:

{
    "success": false,
    "status": "Pending"
}

If the job has completed:

{
    "success": true,
    "status": "completed",
    "response": "1\n"
}
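A client typically wraps these two response shapes in a small polling loop. The sketch below injects the fetch step as a callback so the loop is easy to demonstrate without a running server; in a real client that callback would issue `GET /result?jobId=...`. Names and defaults are illustrative:

```javascript
// Hypothetical client-side polling loop over the result endpoint.
// `fetchResult` stands in for an HTTP call to GET /result?jobId=<jobId>.
async function pollUntilDone(fetchResult, { intervalMs = 500, maxAttempts = 20 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const body = await fetchResult();
    // Matches the shapes above: pending -> keep waiting, completed -> done.
    if (body.success && body.status === "completed") return body.response;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("timed out waiting for job result");
}

// Stubbed fetcher mimicking the two responses shown above:
// one "Pending" reply, then the completed result.
const responses = [
  { success: false, status: "Pending" },
  { success: true, status: "completed", response: "1\n" },
];
pollUntilDone(() => Promise.resolve(responses.shift()), { intervalMs: 10 })
  .then((output) => console.log(JSON.stringify(output))); // prints "1\n"
```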

Key Design Principles

  • This system is asynchronous: submission and result retrieval are separate steps.
  • The jobId returned by the submit endpoint is the key for the result endpoint.
  • Worker-side execution is isolated using Docker for safer runtime behavior.
  • Queue-based design improves scalability and decouples API response time from execution time.
