A Reddit bot for r/OpenUniversity that automatically responds to posts and comments mentioning Open University module codes with relevant module information.
This bot monitors the r/OpenUniversity subreddit and automatically replies when it detects OU module codes (e.g., TM112, M250). It provides module details including title, description, level, credits, and study time directly from the Open University website.
The system consists of three components:
- Submission Bot: Monitors new submissions for module codes
- Comment Bot: Monitors comments for module codes
- Module Scraper: Weekly scraper that updates the local database with current module information
Prerequisites:
- Docker and Docker Compose
- Reddit API credentials (client ID, client secret, username, password)
- Python 3.12 (for local development)
Create a .env file in the project root with the following variables:
```env
# Reddit API credentials
CLIENT_ID=your_reddit_client_id
CLIENT_SECRET=your_reddit_client_secret
REDDIT_USERNAME=your_bot_username
REDDIT_PASSWORD=your_bot_password
REDDIT_USER_AGENT=ou_module_bot/1.0

# Bot configuration
SUBREDDIT=OpenUniversity
MAX_RETRY_ATTEMPTS=5
MAX_CONCURRENT_WORKERS=10

# Template configuration
MAKO_TEMPLATE_DIR=src/ou_bot/reddit_bot/templates
MAKO_MODULE_DIR=data/mako_modules

# Data source
OU_MODULE_URL=https://enrolment.open.ac.uk/page-data/courses/qualifications/page-data.json
DATABASE_NAME=data/ou_modules.db
```

Start all services:

```shell
docker-compose up -d
```

This will start:
- Both Reddit bots (submission and comment scanners)
- Module scraper that runs weekly
- Ofelia scheduler to manage scraping intervals
View logs:
```shell
docker-compose logs -f bots
docker-compose logs -f scraper
```

Stop all services:

```shell
docker-compose down
```

Install dependencies using UV:

```shell
pip install uv
uv sync
```

Run individual components:

```shell
# Run submission bot
uv run submission_bot

# Run comment bot
uv run comment_bot

# Run module scraper
uv run module_scraper
```

Format code:

```shell
uv run black src/
```

Run tests:

```shell
uv run pytest tests/
```

- Bots continuously scan the configured subreddit for new submissions and comments
- When a post contains an OU module code pattern (e.g., TM112, M250), the bot detects it
- Bot queries the local SQLite database for module information
- Response is formatted using Mako templates
- Bot replies to the post/comment with formatted module details
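The detection step above can be sketched with a regular expression. The exact pattern used by the bot is an assumption here; this one matches the documented examples (one to three uppercase letters followed by two or three digits):

```python
import re

# Hypothetical pattern for OU module codes (e.g. TM112, M250);
# the bot's actual regex may be stricter or broader.
MODULE_CODE_RE = re.compile(r"\b([A-Z]{1,3}[0-9]{2,3})\b")

def find_module_codes(text: str) -> list[str]:
    """Return the unique module codes found in a post or comment,
    preserving first-seen order."""
    seen: dict[str, None] = {}
    for code in MODULE_CODE_RE.findall(text):
        seen.setdefault(code, None)
    return list(seen)
```

Deduplicating while preserving order means a post that mentions the same module twice still gets a single entry in the reply.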
- Fetches the complete list of OU modules, adhering to the website's robots.txt
- Scrapes detailed information for each module using concurrent requests
- Stores or updates module data in the SQLite database
- Runs weekly via Ofelia scheduler to keep data current
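The concurrent-request step can be sketched with the standard library. `fetch_module` here is a hypothetical callable that downloads and parses one module page; `max_workers` mirrors the `MAX_CONCURRENT_WORKERS` setting:

```python
from concurrent.futures import ThreadPoolExecutor

def scrape_modules(codes, fetch_module, max_workers=10):
    """Fetch and parse each module concurrently.

    `fetch_module` is a hypothetical callable taking a module code and
    returning its parsed record; results come back in input order.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_module, codes))
```

A thread pool suits this workload because the time is spent waiting on HTTP responses, not on CPU-bound parsing.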
- Retry logic with exponential backoff for Reddit API failures
- Rate limit detection and graceful handling
- Structured logging for monitoring and debugging
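The retry behaviour can be sketched as follows; the specific exceptions the real bot catches, and its base delay, are assumptions:

```python
import time

def with_retries(call, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Retry `call` with exponential backoff (1s, 2s, 4s, ...).

    `max_attempts` mirrors the MAX_RETRY_ATTEMPTS setting. Catching bare
    Exception is a simplification; the real bot would catch the Reddit
    API's specific error types.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            sleep(base_delay * 2 ** attempt)
```

Injecting `sleep` as a parameter keeps the backoff schedule testable without real waiting.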
The bot uses SQLite to store module information. Data is automatically created and managed by the scraper.
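A minimal sketch of how the scraper might create and refresh that table, using only the standard library; the schema and column names here are assumptions, not the project's actual schema:

```python
import sqlite3

def open_db(path="data/ou_modules.db"):
    conn = sqlite3.connect(path)
    # Hypothetical schema; the real scraper's columns may differ.
    conn.execute("""CREATE TABLE IF NOT EXISTS modules (
        code    TEXT PRIMARY KEY,
        title   TEXT,
        level   INTEGER,
        credits INTEGER)""")
    return conn

def upsert_module(conn, code, title, level, credits):
    """Insert a new module or refresh an existing row in place."""
    conn.execute(
        """INSERT INTO modules (code, title, level, credits)
           VALUES (?, ?, ?, ?)
           ON CONFLICT(code) DO UPDATE SET
               title = excluded.title,
               level = excluded.level,
               credits = excluded.credits""",
        (code, title, level, credits),
    )
    conn.commit()
```

The `ON CONFLICT ... DO UPDATE` upsert lets the weekly run overwrite stale rows without first checking whether a module already exists.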
Response templates are located in src/ou_bot/reddit_bot/templates/. The bot uses different templates for:
- Submission responses (detailed format)
- Comment responses (compact format)
- Module data tables
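A compact comment-style reply might look like the following Mako fragment. This is a hypothetical illustration; the real templates in src/ou_bot/reddit_bot/templates/ will differ:

```mako
**${module.code} – ${module.title}**

Level ${module.level} · ${module.credits} credits

${module.description}
```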
Bot not responding:
- Check Reddit API credentials in .env
- Verify bot account has sufficient karma
- Check logs for rate limiting errors
Scraper failing:
- Verify OU_MODULE_URL is accessible
- Check network connectivity
- Review logs for parsing errors
Database errors:
- Ensure data/ directory exists and is writable
- Check database file permissions
See LICENSE file for details.
- Fork the repository
- Create a feature branch
- Make your changes
- Run tests and formatting
- Submit a pull request