This guide covers the configuration and customization options for ConcourseGPT.

The following environment variables are required:

```bash
# Base URL for the LLM API endpoint
export LLM_API_BASE="https://your-llm-api.example.com"

# Model identifier to use
export LLM_MODEL="your-model-name"

# API token for authentication
export LLM_TOKEN="your-api-token"
```
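Since every API call depends on these variables, it can help to fail fast when one is missing. A minimal sketch (the `check_llm_env` helper is a hypothetical name, not part of ConcourseGPT):

```sh
#!/bin/sh
# check_llm_env: hypothetical pre-flight check that verifies the three
# required variables are non-empty before any API call is attempted.
check_llm_env() {
  missing=0
  for var in LLM_API_BASE LLM_MODEL LLM_TOKEN; do
    # Indirect expansion via eval keeps this POSIX-sh compatible.
    val=$(eval "printf '%s' \"\${$var}\"")
    if [ -z "$val" ]; then
      echo "Error: $var is not set" >&2
      missing=1
    fi
  done
  return "$missing"
}
```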
These can be added to your shell's rc file (e.g., `~/.bashrc` or `~/.zshrc`) or a local `env.sh`:
```bash
# env.sh
export LLM_API_BASE="..."
export LLM_MODEL="..."
export LLM_TOKEN="..."
```
Then source it:

```bash
source env.sh
```
The following environment variables are optional:

```bash
# Enable verbose curl output for debugging
export DEBUG_CURL=1

# Override the default columns setting for output formatting
export COLUMNS=120
```
The LLM API endpoint should:

- Accept POST requests to `/api/v1/chat/completions`
- Use the standard chat completion format
- Accept these parameters:

```json
{
  "model": "model-name",
  "messages": [
    {
      "role": "user",
      "content": "prompt-text"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 2000
}
```

- Return responses in this format:

```json
{
  "choices": [
    {
      "message": {
        "content": "response-text"
      }
    }
  ]
}
```
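Putting the pieces together, a request against such an endpoint could look like the sketch below. The `build_payload` helper is hypothetical, and the Bearer authorization header is an assumption; adjust both to match your API:

```sh
#!/bin/sh
# build_payload: hypothetical helper that assembles the request body shown
# above. The prompt is assumed to contain no characters needing JSON escaping.
build_payload() {
  model=$1
  prompt=$2
  printf '{"model":"%s","messages":[{"role":"user","content":"%s"}],"temperature":0.7,"max_tokens":2000}' \
    "$model" "$prompt"
}

# Example call (requires a live endpoint; the auth scheme is an assumption):
# curl -sS -X POST "$LLM_API_BASE/api/v1/chat/completions" \
#   -H "Authorization: Bearer $LLM_TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$(build_payload "$LLM_MODEL" "Summarize this pipeline")"
```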
The MkDocs site is configured with sensible defaults, but you can customize various aspects:
The default theme (Material) is configured in `mkdocs.yml`:

```yaml
theme:
  name: material
  custom_dir: overrides
  palette:
    - media: "(prefers-color-scheme: light)"
      scheme: default
      primary: "blue"
      accent: "blue"
      toggle:
        icon: material/weather-night
        name: Switch to dark mode
    - media: "(prefers-color-scheme: dark)"
      scheme: slate
      primary: "blue"
      accent: "blue"
      toggle:
        icon: material/weather-sunny
        name: Switch to light mode
```
Custom footer content can be modified in `overrides/partials/footer.html`:

```html
<p>This documentation was generated by an AI</p>
```
The navigation is automatically generated based on your `docs/` directory structure:

```yaml
nav:
  - Home: README.md
  - pipeline-name:
      - Overview: pipeline-name/README.md
      - Jobs: pipeline-name/jobs/
      - Resources: pipeline-name/resources/
      - Groups: pipeline-name/groups/
```
By default, documentation is generated in this structure:

```
docs/
└── pipeline-name/
    ├── index.md
    ├── jobs/
    ├── resources/
    └── groups/
```
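For reference, the same layout expressed as shell commands (hypothetical; ConcourseGPT creates these directories itself during generation):

```sh
#!/bin/sh
# Recreate the default output layout for a single pipeline by hand.
mkdir -p docs/pipeline-name/jobs docs/pipeline-name/resources docs/pipeline-name/groups
touch docs/pipeline-name/index.md
```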
The following Markdown extensions are enabled:

```yaml
markdown_extensions:
  - toc:
      permalink: true
      toc_depth: 3
```
For large pipelines:

- Default chunk size: 600 lines
- Configurable via `line_threshold` in `pipeline.sh`
- Affects memory usage and API request size
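The effect of a line threshold can be demonstrated with `split(1)`, which performs the same kind of line-based chunking (a simplification; `pipeline.sh`'s actual logic may differ):

```sh
#!/bin/sh
# Demonstrate line-based chunking with split(1), using the documented
# default of 600 lines. Generate a stand-in for a large pipeline file first.
line_threshold=600
seq 2000 > sample-pipeline.yml
# Produces four chunks (600 + 600 + 600 + 200 lines): chunk_aa .. chunk_ad
split -l "$line_threshold" sample-pipeline.yml chunk_
```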
API calls have built-in retry logic:
- Maximum attempts: 3
- Delay between retries: 2 seconds
- Configurable in `llm.sh`
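The documented behavior corresponds to a loop along these lines (a sketch only; the real implementation lives in `llm.sh`, and `retry_call` is a hypothetical name):

```sh
#!/bin/sh
# retry_call: run a command, retrying up to 3 total attempts with a
# 2-second pause between tries; fails if every attempt fails.
retry_call() {
  max_attempts=3
  delay=2
  attempt=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "Giving up after $max_attempts attempts" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    sleep "$delay"
  done
}
```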