RexAI is an AI-powered implementation of the Chrome Dinosaur game using the NEAT (NeuroEvolution of Augmenting Topologies) algorithm. Watch as neural networks evolve to learn and master the game through generations of training!
This project demonstrates how evolutionary algorithms can be used to create intelligent agents that learn through simulation, without being explicitly programmed with game rules.
- 🦖 Chrome Dinosaur-inspired game implementation using PyGame
- 🧠 NEAT algorithm implementation for neural network evolution
- 📊 Real-time visualization of network structure
- 🔄 Save/load functionality for continuing training sessions
- 📈 Training management system with species configuration
- 📝 Command-line interface for controlling training parameters
- Python 3.7+
- PyGame
- NEAT-Python
- Colorama
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/rexai.git
  cd rexai
  ```

- Install dependencies:

  ```bash
  pip install pygame neat-python colorama
  ```

- Run the game:

  ```bash
  python -m rexai
  ```
RexAI comes with several command-line options to control the training process:
```bash
python -m rexai --config species.json --species "Rex 7 g-20.0" --generations 200
```
- `--config`, `-c`: Path to JSON configuration file
- `--species`, `-s`: Name of the species to load
- `--fresh`, `-f`: Start a new training session (ignores existing population)
- `--list`, `-l`: List all available species in the configuration
- `--generations`, `-g`: Number of generations to run (default: 100)
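The flags above map naturally onto Python's standard `argparse` module. The sketch below is a hypothetical reconstruction of how such a CLI could be wired up, not RexAI's actual entry-point code:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical parser mirroring the documented flags; RexAI's real
    # implementation may differ in defaults and help text.
    parser = argparse.ArgumentParser(prog="rexai")
    parser.add_argument("--config", "-c",
                        help="Path to JSON configuration file")
    parser.add_argument("--species", "-s",
                        help="Name of the species to load")
    parser.add_argument("--fresh", "-f", action="store_true",
                        help="Start a new training session")
    parser.add_argument("--list", "-l", action="store_true",
                        help="List all available species")
    parser.add_argument("--generations", "-g", type=int, default=100,
                        help="Number of generations to run")
    return parser

# Parse the example invocation from above (passed as a list for illustration).
args = build_parser().parse_args(
    ["--config", "species.json", "--species", "Rex 7 g-20.0",
     "--generations", "200"]
)
```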
RexAI uses NEAT (NeuroEvolution of Augmenting Topologies), an evolutionary algorithm that evolves neural networks. The system works through the following process:
- Initialization: Creates a population of simple neural networks
- Evaluation: Each network controls a dinosaur in the game
- Selection: Networks that perform better (survive longer) have higher fitness scores
- Reproduction: Top-performing networks reproduce, with mutations introducing variations
- Iteration: The process repeats, evolving more sophisticated behaviors over generations
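The five steps above can be sketched as a toy evolutionary loop. This is deliberately simplified: real NEAT also evolves network *topology* and groups genomes into species, whereas here a "genome" is just a list of weights and the fitness function is a stand-in for survival time:

```python
import random

random.seed(0)  # deterministic for the sake of the example

def fitness(genome):
    # Stand-in for "how long the dinosaur survived": closer to 0.5
    # on every weight scores higher (maximum fitness is 0).
    return -sum((w - 0.5) ** 2 for w in genome)

def evolve(pop_size=20, genome_len=4, generations=30):
    # Initialization: a population of simple random genomes.
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluation + Selection: rank by fitness, keep the top half.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Reproduction: copy survivors with small Gaussian mutations.
        children = [[w + random.gauss(0, 0.05)
                     for w in random.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]
        # Iteration: the next generation replaces the old one.
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```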
The AI receives several inputs about the game state:
- Distance to the next obstacle
- Type of obstacle (cactus or bird)
- Game speed
- Dinosaur state (jumping, ducking, running)
- Obstacle dimensions and position
The network decides between three possible actions:
- Jump
- Duck
- Run (continue running normally)
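The inputs and outputs above can be illustrated with a small encoding/decoding sketch. The field names and normalisation constants here are assumptions, not RexAI's actual encoding; with NEAT-Python, the `outputs` list would come from `net.activate(...)` on a `neat.nn.FeedForwardNetwork`:

```python
ACTIONS = ("jump", "duck", "run")

def encode_inputs(state: dict) -> list:
    # Normalise raw game values into a fixed-length feature vector.
    # Keys and scaling factors are illustrative assumptions.
    return [
        state["obstacle_distance"] / 600.0,            # distance to obstacle
        1.0 if state["obstacle_type"] == "bird" else 0.0,
        state["game_speed"] / 20.0,
        1.0 if state["dino_state"] == "jumping" else 0.0,
        1.0 if state["dino_state"] == "ducking" else 0.0,
        state["obstacle_width"] / 100.0,
        state["obstacle_height"] / 100.0,
    ]

def choose_action(outputs: list) -> str:
    # Pick whichever of the three output neurons fired strongest.
    return ACTIONS[outputs.index(max(outputs))]
```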
```
rexai/
├── __init__.py
├── main.py                  # Main entry point
├── controllers/
│   └── ai_controller.py     # NEAT controller implementation
├── ai/
│   └── networks/
│       └── network.py       # DinoNetwork class (neural network)
├── game/
│   ├── dino_game.py         # Game loop and rendering
│   ├── dino.py              # Dinosaur player entity
│   └── obstacle.py          # Game obstacles (cacti and birds)
├── utils/
│   ├── config_manager.py    # Handles configuration and species
│   └── training_manager.py  # Manages training process
├── config/
│   └── neat_config.txt      # NEAT algorithm parameters
├── tests/                   # Directory for saved populations/genomes
└── data/                    # Pre-trained models
```
RexAI automatically saves progress every 10 generations. You can load a previous training session using the species configuration file.
```bash
# List available species
python -m rexai --config species.json --list

# Load a specific species and continue training
python -m rexai --config species.json --species "Rex 7 g-20.0" --generations 200

# Start fresh training
python -m rexai --fresh --generations 300
```
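Periodic saving can be done in a few lines if the population is picklable. The sketch below is a minimal stand-in for RexAI's actual save format, which is not documented here; NEAT-Python users often rely on its built-in `neat.Checkpointer` reporter instead:

```python
import os
import pickle
import tempfile

def save_population(population, path):
    # Serialise the whole population to disk (assumes it is picklable).
    with open(path, "wb") as f:
        pickle.dump(population, f)

def load_population(path):
    with open(path, "rb") as f:
        return pickle.load(f)

# Inside a training loop, save every 10 generations:
population = {"generation": 20, "genomes": [[0.1, 0.2], [0.3, 0.4]]}
path = os.path.join(tempfile.mkdtemp(), "rex_checkpoint.pkl")
if population["generation"] % 10 == 0:
    save_population(population, path)

restored = load_population(path)
```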
The `neat_config.txt` file contains parameters for the NEAT algorithm, including:
- Population size
- Mutation rates
- Network structure parameters
- Species compatibility thresholds
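For reference, NEAT-Python config files use an INI-style layout. The fragment below is illustrative and abbreviated (a real `DefaultGenome` section needs many more keys); the values shown, including the input/output counts, are examples rather than RexAI's actual settings:

```ini
[NEAT]
fitness_criterion     = max
fitness_threshold     = 1000
pop_size              = 50
reset_on_extinction   = False

[DefaultGenome]
num_inputs            = 7
num_hidden            = 0
num_outputs           = 3
weight_mutate_rate    = 0.8
conn_add_prob         = 0.5
node_add_prob         = 0.2

[DefaultSpeciesSet]
compatibility_threshold = 3.0
```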
The `species.json` file manages different trained "species" of dinosaurs, allowing you to save and load different training runs.
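A species entry might look something like the following. This structure is purely hypothetical, shown only to convey the idea of naming a training run and pointing it at a saved population:

```json
{
  "species": [
    {
      "name": "Rex 7 g-20.0",
      "population_file": "data/rex7.pkl",
      "generations_trained": 200
    }
  ]
}
```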
- NEAT-Python for the NEAT implementation
- PyGame for the game engine
- The Chrome Dinosaur game for inspiration