Release Notes: Experiment v1.0.0-rc.6
✨ New Features
Local Inference Support (Experimental)
- Added preliminary support for running inference locally on macOS devices
- Compatible with select MLX models including Mistral and Gemma variants
- Automatic backend activation when the local provider is selected
Timestamps for Messages
- Added an unobtrusive timestamp display showing when each chat message was sent
- Timestamps provide at-a-glance context for the flow of a conversation
Identity Support
- Added support for user and assistant identities in conversations
- New template example showcasing identity features
- Improved name and pronoun handling throughout the interface
🔄 Model Updates
- Added support for the latest models from OpenAI, Anthropic, and Mistral
- Added o3, o4-mini, and other new model options
- Improved model labeling system for better organization
🛠️ Technical Improvements
- Refactored experiment storage for better performance
- Enhanced message handling architecture
- Improved error handling throughout the application
- Updated dependencies to latest versions
🐛 Bug Fixes
- Fixed various UI rendering issues
- Improved mobile header display
- Enhanced dark mode contrast for better readability
- Fixed provider selection and token management
This release is a significant step forward in our experimental journey: a cleaner interface, expanded model support, and the groundwork for local inference.