A modern Django-based service built for the Ansible Automation Platform (AAP) ecosystem, featuring comprehensive task management, REST APIs, and automated background job processing.
- 🚀 Modern Django Architecture - Django 4.2+ with clean app-based structure
- 📊 Automated Task Management - Feature-flag controlled task groups with automatic routing
- ⚡ Smart Task Routing - Automatic submission to dispatcherd with no manual intervention
- 🔌 REST API - Versioned RESTful APIs with OpenAPI documentation
- 🔐 Authentication & Authorization - Django-Ansible-Base integration with RBAC
- 📈 Real-time Dashboard - Web-based task monitoring and management interface
- 🐳 Docker Ready - Simplified single-container deployment with PostgreSQL
- 🧪 Comprehensive Testing - Unit and integration tests with coverage reporting
- 📝 API Documentation - Interactive Swagger/OpenAPI documentation
- 🔧 Metrics Collection - Integrated metrics-utility for data collection
```bash
# Clone the repository
git clone <repository-url>
cd metrics-service

# Start all services
docker-compose up -d

# Create a superuser (optional)
docker-compose exec metrics-service python manage.py createsuperuser
```
Your service will be available at:
- Application: http://localhost:8000
- API Documentation: http://localhost:8000/api/docs/
- Admin Interface: http://localhost:8000/admin/
- Task Dashboard: http://localhost:8000/dashboard/
```bash
# Prerequisites: Python 3.11+, PostgreSQL 13+

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -e ".[dev]"

# Configure database (update as needed)
cp .env.example .env

# Set up database
python manage.py migrate
python manage.py metrics_service init-service-id
python manage.py metrics_service init-system-tasks
python manage.py createsuperuser

# Start complete service (Django + dispatcher + scheduler)
python manage.py metrics_service run
```
```
metrics-service/
├── apps/
│   ├── api/v1/            # REST API endpoints
│   ├── core/              # Core models and business logic
│   ├── dashboard/         # Web dashboard interface
│   └── tasks/             # Background task system
├── metrics_service/
│   ├── settings/          # Environment-specific settings
│   └── urls.py            # URL configuration
├── tests/                 # Test suite
├── config/                # Configuration files
└── docker-compose.yml     # Container orchestration
```
**Core Models** (`apps/core/models.py`)
- User management with Django-Ansible-Base
- Organization and team hierarchy
- RBAC permissions and roles
**Task System** (`apps/tasks/`)
- Feature-flag controlled task groups (System, Anonymized Data, Metrics Collection)
- Automatic task routing with Django signals
- APScheduler integration for cron-based scheduling
- Dispatcherd background task execution
- Task execution tracking and monitoring
- Built-in task functions and metrics collection
**API Layer** (`apps/api/v1/`)
- RESTful endpoints with filtering and pagination
- OpenAPI/Swagger documentation
- Authentication and permission controls
**Dashboard** (`apps/dashboard/`)
- Real-time task monitoring
- Task creation and management interface
- Live status updates every 5 seconds
The API supports multiple authentication methods:
- Session authentication (for web interface)
- Token authentication
- OAuth2 tokens (for third-party integrations)
```
# List all tasks
GET /api/v1/tasks/

# Create a new task
POST /api/v1/tasks/
{
  "name": "Data Cleanup",
  "function_name": "cleanup_old_data",
  "task_data": {"days_old": 30}
}

# Get running tasks
GET /api/v1/tasks/running/

# Retry a failed task
POST /api/v1/tasks/{id}/retry/

# Available task functions
GET /api/v1/tasks/available_functions/
```
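As a concrete starting point, the task-creation call above can be prepared from Python with nothing but the standard library. This is an illustrative sketch, not part of the service: it assumes a local dev server at `localhost:8000` and DRF-style token authentication (session or OAuth2 work too, per the list above).

```python
import json
import urllib.request

API_ROOT = "http://localhost:8000/api/v1"  # assumption: local dev server

def create_task_request(name, function_name, task_data, token):
    """Prepare (but do not send) a POST /api/v1/tasks/ request.

    The Authorization header follows the DRF token convention;
    adjust if your deployment uses session or OAuth2 auth instead.
    """
    body = json.dumps(
        {"name": name, "function_name": function_name, "task_data": task_data}
    ).encode()
    return urllib.request.Request(
        f"{API_ROOT}/tasks/",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
    )

req = create_task_request("Data Cleanup", "cleanup_old_data", {"days_old": 30}, "<token>")
# urllib.request.urlopen(req) would submit it against a running instance
```

Keeping request construction separate from sending makes the payload easy to inspect or log before it ever touches the network.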
System Tasks (always enabled):
- `cleanup_old_data` - Clean up old system data
- `cleanup_old_tasks` - Clean up completed/failed tasks
- `send_notification_email` - Send notification emails
- `process_user_data` - Process user data in background

Metrics Collection Tasks (feature flag controlled):
- `collect_anonymous_metrics` - Collect anonymous system metrics
- `collect_config_metrics` - Collect configuration information
- `collect_job_host_summary` - Collect job execution statistics
- `collect_host_metrics` - Collect host performance data
- `collect_all_metrics` - Run multiple collectors in sequence
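To illustrate the shape of a cleanup-style task, the sketch below shows the cutoff logic a function like `cleanup_old_data` typically implements. It is a standalone, hypothetical example — the service's real implementation lives in `apps/tasks/` and would use an ORM delete rather than a list filter.

```python
from datetime import datetime, timedelta, timezone

def cutoff_for(days_old: int) -> datetime:
    """Timestamp before which records are considered stale."""
    return datetime.now(timezone.utc) - timedelta(days=days_old)

def cleanup_old_data(records, days_old: int = 30):
    """Drop records whose 'created' timestamp is older than the cutoff.

    In a real task this would be an ORM call, e.g.
    Model.objects.filter(created__lt=cutoff).delete().
    """
    cutoff = cutoff_for(days_old)
    kept = [r for r in records if r["created"] >= cutoff]
    return kept, len(records) - len(kept)

now = datetime.now(timezone.utc)
sample = [
    {"id": 1, "created": now - timedelta(days=45)},
    {"id": 2, "created": now - timedelta(days=5)},
]
kept, removed = cleanup_old_data(sample, days_old=30)
# only the 5-day-old record survives
```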
The service includes an automated background task system with intelligent routing:
```bash
# Start complete service (Django + dispatcher + scheduler)
python manage.py metrics_service run

# Start with custom configuration
python manage.py metrics_service run --workers 4 --log-level DEBUG

# Individual components (for development)
python manage.py run_dispatcherd --workers 2
python manage.py metrics_service cron start
```
Tasks are automatically routed based on their properties:
- Immediate tasks → Direct to dispatcherd
- Scheduled tasks → APScheduler with DateTrigger
- Recurring tasks → APScheduler with CronTrigger
No manual intervention required - create a task and it's automatically processed!
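The routing table above boils down to a small decision function. This sketch uses illustrative names (`TaskSpec`, `route`), not the service's actual code, to show the logic the signal handler applies when a task is saved:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class TaskSpec:
    name: str
    run_at: Optional[datetime] = None  # one-shot future execution
    cron: Optional[str] = None         # recurring schedule, e.g. "0 * * * *"

def route(task: TaskSpec) -> str:
    """Mirror the routing rules: recurring -> CronTrigger,
    scheduled -> DateTrigger, otherwise straight to dispatcherd."""
    if task.cron:
        return "apscheduler:cron"
    if task.run_at:
        return "apscheduler:date"
    return "dispatcherd"
```

Because the decision depends only on the task's own fields, it can run inside a Django `post_save` signal with no extra state.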
Control task execution with environment variables:
```bash
# Enable/disable anonymized data collection
METRICS_SERVICE_ANONYMIZED_DATA=true

# Enable/disable metrics collection
METRICS_SERVICE_METRICS_COLLECTION=false
```
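Reading such flags amounts to parsing a boolean out of the environment. A minimal helper (illustrative only; the service itself resolves settings through Dynaconf) might look like:

```python
import os

TRUTHY = {"1", "true", "yes", "on"}

def flag(name: str, default: bool = False) -> bool:
    """Parse a boolean feature flag from the environment."""
    raw = os.environ.get(name)
    return default if raw is None else raw.strip().lower() in TRUTHY

os.environ["METRICS_SERVICE_ANONYMIZED_DATA"] = "true"
anonymized = flag("METRICS_SERVICE_ANONYMIZED_DATA")
metrics = flag("METRICS_SERVICE_METRICS_COLLECTION", default=False)
```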
```bash
# Format code
black .

# Lint code
ruff check .

# Type checking
mypy .

# Sort imports
isort .
```
This project uses pre-commit hooks to ensure code quality and automatically sync requirements files:
```bash
# Install pre-commit hooks
pre-commit install

# Run hooks on all files
pre-commit run --all-files

# Run hooks manually
pre-commit run
```
The pre-commit configuration automatically:
- Syncs requirements files whenever `pyproject.toml` or `uv.lock` changes
- Ensures requirements files are always up-to-date before commits
```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=apps --cov=metrics_service --cov-report=html

# Run specific test categories
pytest -m unit         # Unit tests only
pytest -m integration  # Integration tests only
```
```bash
# Create migrations
python manage.py makemigrations

# Apply migrations
python manage.py migrate

# Initialize DAB ServiceID (required after first migration)
python manage.py metrics_service init-service-id

# Initialize system tasks
python manage.py metrics_service init-system-tasks
```
Key configuration options:
```bash
# Database
METRICS_SERVICE_DB_HOST=localhost
METRICS_SERVICE_DB_PORT=55432
METRICS_SERVICE_DB_USER=metrics_service
METRICS_SERVICE_DB_PASSWORD=metrics_service
METRICS_SERVICE_DB_NAME=metrics_service

# Django
METRICS_SERVICE_SECRET_KEY=your-secret-key
METRICS_SERVICE_DEBUG=false
METRICS_SERVICE_ALLOWED_HOSTS=localhost,yourdomain.com

# Task Feature Flags
METRICS_SERVICE_ANONYMIZED_DATA=true
METRICS_SERVICE_METRICS_COLLECTION=false
```
Configuration is managed through:
- **Environment variables** - Runtime configuration
- **`config/settings.yaml`** - Complex configuration via Dynaconf
```bash
# Build production image
docker build -t metrics-service .

# Run with production settings
docker run -p 8000:8000 \
  -e METRICS_SERVICE_ENV=production \
  -e METRICS_SERVICE_SECRET_KEY=your-secret-key \
  -e METRICS_SERVICE_DB_HOST=your-db-host \
  metrics-service
```
## Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/my-feature`
3. Make your changes with tests
4. Run the test suite: `pytest`
5. Run code quality checks: `ruff check . && black . && mypy .`
6. Submit a pull request
### Development Standards
- **Code Style**: Black formatting, 120 character line length
- **Type Hints**: Required for all new code
- **Documentation**: Docstrings for public APIs
- **Testing**: Test coverage for new features
- **Commits**: Conventional commit messages
## License
This project is licensed under the Apache License - see the [LICENSE](LICENSE) file for details.
## Support
- **Documentation**: Check the [CLAUDE.md](CLAUDE.md) file for detailed development guidance
- **Issues**: Report bugs and feature requests via GitHub issues
- **API Documentation**: Interactive docs available at `/api/docs/` when running