PentAGI v0.3.0 - First Public Beta Release
🚀 Join the Community! Connect with security researchers, AI enthusiasts, and fellow ethical hackers. Get support, share insights, and stay updated with the latest PentAGI developments.
🎯 Major Features
🤖 Assistant Mode - Complete interactive AI assistant with streaming responses, persistent chat sessions, and intelligent agent delegation. Create multiple chat sessions and seamlessly switch between manual assistance and automated penetration testing workflows.
🧪 Professional Testing Suite - Three specialized testing utilities:
- ctester: Test LLM agent configurations with parallel execution and detailed reporting
- etester: Manage vector embeddings with provider testing and database optimization
- ftester: Debug individual functions and AI behaviors with interactive mock modes
🔍 Enhanced Search Capabilities - Integrated Perplexity AI and DuckDuckGo search engines alongside the existing providers, plus a multi-provider embedding system supporting OpenAI, Ollama, Mistral, Jina, HuggingFace, GoogleAI, and VoyageAI.
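A minimal sketch of selecting an embedding provider, assuming an environment-variable style of configuration; the variable names below are illustrative placeholders, not the documented PentAGI keys, so consult the README for the actual settings.

```
# Illustrative placeholder names only - not confirmed PentAGI variables.
# Choose one embedding provider from the supported set (OpenAI, Ollama,
# Mistral, Jina, HuggingFace, GoogleAI, VoyageAI) and supply its credentials.
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
EMBEDDING_API_KEY=sk-...
```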
🛡️ Custom Kali Linux Environment - Dedicated Docker image optimized for penetration testing with enhanced security tools and network admin capabilities. The open-source build configuration is available under MIT license with automated multi-platform builds and security attestations.
⚡ Enhanced LLM Integration - PentAGI now uses a custom fork of langchaingo that improves LLM provider compatibility, function calling, streaming responses, and external service integrations.
🚀 New Features
- Community Launch: Official Discord and Telegram channels for community support, knowledge sharing, and collaboration between security researchers and AI enthusiasts
- Flexible LLM Configuration: YAML/JSON configuration system for custom providers with per-agent model specifications (examples; an illustrative sketch follows this list)
- Advanced Report Generation: Comprehensive Markdown and PDF reports for flows, tasks, and subtasks
- Smart Context Management: Enhanced conversation summarization with configurable preservation settings
- Message Copy & Search: Copy messages in Markdown format with text highlighting across all interfaces
- Provider Management: Visual icons and improved status indicators for OpenAI, Anthropic, and custom providers
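A minimal sketch of what a per-agent provider configuration could look like, assuming a YAML layout; the field names (base_url, api_key_env, agents, model, temperature) are assumptions chosen for readability and may differ from the schema in the bundled examples:

```yaml
# Illustrative sketch only - field names are assumed, not the exact PentAGI schema.
# See the bundled example configs for the authoritative format.
name: my-openai-compatible         # identifier for the custom provider
base_url: https://api.example.com/v1
api_key_env: CUSTOM_LLM_API_KEY    # environment variable holding the API key
agents:
  researcher:                      # per-agent model specification
    model: gpt-4o
    temperature: 0.2
  pentester:
    model: gpt-4o-mini
    temperature: 0.1
```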
🎨 UI/UX Improvements
- Streamlined Assistant Interface: New tab with chat creation, management, and persistent state
- Enhanced Navigation: Improved breadcrumbs with status and provider information
- Better Authentication: Enhanced GitHub/Google OAuth with password change functionality
- Improved Flow Management: Better status handling with proper state transitions and input blocking
- Professional Tooltips: Fixed positioning and enhanced visual feedback
🐛 Key Fixes
- Flow Status Synchronization: Resolved issues with status updates when switching between flows
- Assistant Integration: Fixed problems launching assistants on new and completed flows
- Terminal Synchronization: Improved command execution display between automated and manual agents
- Message Chain Consistency: Enhanced restoration and context handling after interruptions
- Configuration Issues: Resolved Docker, environment variables, and provider setup problems
🔧 Infrastructure Improvements
- Enhanced Container Security: Improved isolation with controlled network capabilities
- Environment Flexibility: ASK_USER interactive mode, proxy support, and SSL/TLS enhancements (a sample .env fragment follows this list)
- Build Optimization: Golang 1.24 upgrade, dependency updates, and improved Docker builds
- Configuration Management: Pre-built provider configs for OpenRouter, DeepInfra, and DeepSeek
- Custom Docker Images: Open-source Kali Linux containers with automated builds, multi-platform support, and security attestations
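As a rough illustration of the environment options listed above, a possible .env fragment is shown below. Only ASK_USER is named in these notes; the proxy and TLS variable names are placeholders, not documented settings:

```
# Hypothetical .env fragment - only ASK_USER is named in the release notes;
# the proxy and TLS variable names are placeholders for illustration.
ASK_USER=true                      # let the agent pause and ask the operator for input
PROXY_URL=http://127.0.0.1:3128    # route outbound traffic through a local proxy
TLS_SKIP_VERIFY=false              # keep certificate verification enabled
```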
🔄 Performance & Architecture
- Agent System Refactoring: Major improvements to core execution logic with better modularity
- Memory Optimization: Enhanced context management and chain summarization for reduced footprint
- Database Performance: Optimized queries and improved vector storage operations
- Enhanced Prompt System: Unified templates with shared components and simplified handling
- LLM Library Improvements: Migration to custom langchaingo fork with enhanced streaming, function calling, and provider compatibility
📖 Documentation: For detailed setup instructions, visit the README and Quick Start Guide
New Contributors
- @dependabot made their first contribution in #4
- @hhktony made their first contribution in #32
Full Changelog: v0.2.0...v0.3.0