Releases · m5stack/StackFlow
v1.6.0
- Modify llm-yolo to support asynchronous inference
- Add yolo.boxV2 return structure to llm-yolo
- Modify llm-llm to support asynchronous inference
- Optimize inference speed for llm-llm
- Optimize image data output for llm-camera
- Add openai-api module to provide OpenAI API services (see the client sketch after this list)
- Add qwen2.5-1.5B-Int4-ax630c Int4 quantized model
- Add qwen2.5-0.5B-Int4-ax630c Int4 quantized model
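
Since the openai-api module exposes an OpenAI-compatible endpoint, a standard OpenAI client library can talk to the on-device models. Below is a minimal sketch from a host machine; the device address, port, API key placeholder, and model identifier are assumptions for illustration and should be replaced with the values your device actually exposes.

```python
# Minimal sketch: query the module's OpenAI-compatible endpoint from a host.
# base_url, api_key, and model name are assumptions, not documented values.
from openai import OpenAI

client = OpenAI(
    base_url="http://192.168.1.100:8000/v1",  # assumed device address and port
    api_key="not-needed",                     # local service; placeholder key
)

resp = client.chat.completions.create(
    model="qwen2.5-0.5B-Int4-ax630c",         # assumed model identifier
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    stream=False,
)
print(resp.choices[0].message.content)
```
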
v1.5.0
- Added experimental support for Axera VIN in llm-camera.
- Added experimental support for BSON in llm-sys (see the encoding sketch after this release's notes).
- Added new models to llm-llm.
- Optimized LLM post-processing algorithms.
- Fixed the issue of incorrect token truncation in LLM output.
- Updated the internVL2.5-MPO model.
- Added the Deepseek-r1-distill-qwen2.5-0.5b model.
- Added a new LLM model variant with a 256-token prefill.
Warning: This upgrade requires installing lib-llm_1.6-m5stack1_arm64.deb before installing any other model packages.
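
The release note does not document the BSON message schema used by llm-sys, so the sketch below only illustrates what host-side BSON encoding of a message could look like. The payload field names are assumptions, not the unit's documented schema; the example uses the standalone `bson` module shipped with the pymongo package.

```python
# Illustrative only: encode/decode a message with BSON on the host side.
# The payload fields below are assumptions, not llm-sys's documented schema.
# Requires the pymongo package, which provides the `bson` module.
import bson

message = {
    "request_id": "demo_001",        # assumed field names for illustration
    "action": "inference",
    "data": {"prompt": "hello"},
}

encoded = bson.encode(message)       # bytes suitable for a binary transport
decoded = bson.decode(encoded)       # back to a Python dict

assert decoded == message
print(f"BSON payload size: {len(encoded)} bytes")
```
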
v1.4.0: Merge pull request #4 from m5stack/dev
- Add whisper unit
- Add whisper-tiny model
- Add vad unit
- Add silero-vad model
- Add Qwen2.5-1.5B model
- Add depth_anything unit
- Add depth-anything model
- Optimize the initialization speed of the kws unit
- Optimize the initialization speed of the tts unit
LLM V1.3.0 Firmware Release Note
- Add yolo unit
- Add experimental vlm unit
- Add camera unit
- Add llama3.2-1B-prefill-ax630c LLM model
- Add openbuddy-llama3.2-1B-ax630c LLM model
- Add qwen2.5-coder-0.5B-ax630c LLM model
- Add tokenizer server to llm unit
- Fix English support in melotts unit
- Add experimental internvl2-1B-ax630c model for the vlm unit
Rename project to StackFlow
v1.2.0: Update compile tool link