- Multiple bugs in the UI.
- Multiple bugs in the LLM-based test generation algorithm.
- Major refactoring: removed most of the global services.
- Migration to Gradle IntelliJ Plugin 2.x.
- Full LLM-based Kotlin test generation for line, method/function, and class.
- Integration with HuggingFace.
- Support for IDEA 242.*.
- Several minor bugs related to JUnit 5 support and the default template.
- An error with saving settings data between plugin runs
- Major refactoring of the plugin, including separation of common code sections for use in both the IntelliJ plugin and the Fleet plugin
- Improved prompt structure
- Support for JUnit 5 test generation in the LLM-based algorithm
- Support for a test samples detector for the LLM
- Support for default structures for LLM requests in test cases
- Support for different LLM prompt templates
- Support for newer versions of IntelliJ IDEA
- EvoSuite's communication port is now changeable in the plugin's settings.
- A bug in predicting whether a prompt exceeds the maximum prompt size
- Test execution tasks now run in the background (test execution no longer freezes the IDE)
- Minor bug in displaying the number of passed tests
- Better default prompt
- Better handling of user requests
- Model selection for the JetBrains AI Assistant platform
- Windows compatibility issue
- Improve and refactor the test execution process.
- Minor bugs in new test generation UI
- Minor bugs in general test generation process
- Prompt editor for LLM-based test generation
- The usage guide window
- Support for build 233.*
- New UI for requesting a test generation process
- Minor bugs in LLM-based test generation process
- Code highlighting and auto-completion for generated tests
- Java code formatter for generated tests
- Direct user feedback to the LLM to modify each generated test
- New UI for test generation report
- Improve the test case execution process
- Improve prompt generation
- LLM-based test generation using the Grazie platform, selectable in the settings
- Fix compilation issue in LLM-based test generation
- Fix EvoSuite test generation freezing when an incorrect Java path is provided
- LLM-based test generation using OpenAI platform
- Visualizing the result of tests executions
- Plugin's logo
- Some bugs in parsing tests
- Improved user interaction with test cases
- Plugin's name
- LLM-based test generation for lines and methods
- Support mocking-related annotations in generated tests
- Smarter prompt generation to ensure that the number of tokens does not exceed the limit.
- Some performance and functional bugs in the interaction with JaCoCo
- Code refactoring
- LLM-based test generation using an LLM platform
- Test execution and coverage information collector for tests generated by LLM
- Feedback cycle between test compilation and the LLM to ensure that the generated tests compile
- Better error handling
- Some minor bugs regarding test handling
- Improved test generation by aborting incomplete project builds and ensuring successful project builds before test generation begins
- Catching EvoSuite errors related to unknown classes by checking for incorrect .class file input
- Automatic creation of test files with automatic addition of import and package lines
- Error checking for target class initialization
- Compatibility with IDEA 231.*
- Automated project build
- Compiled classes detector