Releases: BerriAI/litellm

v1.67.5-nightly

30 Apr 05:16
839878f

What's Changed

  • [Docs] v1.67.4-stable by @ishaan-jaff in #10338
  • Prisma Migrate - support setting custom migration dir by @krrishdholakia in #10336
  • Fix: Prevent cache token overwrite by last chunk in streaming usage by @mdonaj in #10284
  • [UI] Fixes for sessions on UI - ensure errors have a session and use 1 session for test key by @ishaan-jaff in #10342
  • [UI QA Bug Fix] - Fix SSO Sign in flow by @ishaan-jaff in #10344
  • [UI] Fix infinite Scroll on Models on Test Key Page by @ishaan-jaff in #10343
  • [UI QA Fix] Fix width of the model_id on Models Page by @ishaan-jaff in #10345
  • Fix - support azure dall e custom pricing by @krrishdholakia in #10339
  • [Bug Fix] UI QA - Fix wildcard model test connection not working by @ishaan-jaff in #10347
  • Litellm UI improvements 04 26 2025 p1 by @krrishdholakia in #10346
  • [QA] Allow managing sessions with litellm_session_id by @ishaan-jaff in #10348
  • Handle more gemini tool calling edge cases + support bedrock 'stable-image-core' by @krrishdholakia in #10351
  • [Feat] Add logging callback support for /moderations API by @ishaan-jaff in #10390
  • [Reliability fix] Redis transaction buffer - ensure all redis queues are periodically flushed by @ishaan-jaff in #10393
  • [Bug Fix] Responses API - fix for handling multiturn responses API sessions by @ishaan-jaff in #10415
  • build(deps): bump axios, @docusaurus/core, @docusaurus/plugin-google-gtag, @docusaurus/plugin-ideal-image and @docusaurus/preset-classic in /docs/my-website by @dependabot in #10419
  • docs: Fix link formatting in GitHub PR template by @user202729 in #10417
  • docs: Improve documentation of phoenix logging by @user202729 in #10416
  • [Feat Security] - Allow blocking web crawlers by @ishaan-jaff in #10420
  • [Feat] Add support for using Bedrock Knowledge Bases with LiteLLM /chat/completions requests by @ishaan-jaff in #10413
  • Revert "build(deps): bump axios, @docusaurus/core, @docusaurus/plugin-google-gtag, @docusaurus/plugin-ideal-image and @docusaurus/preset-classic in /docs/my-website" by @ishaan-jaff in #10421
  • fix google studio url by @nonZero in #10095
  • [New model] Add openai/computer-use-preview cost tracking / pricing by @ishaan-jaff in #10422
  • fix(langsmith.py): respect langsmith batch size param by @krrishdholakia in #10411
  • Support x-litellm-api-key header param + allow key at max budget to call non-llm api endpoints by @krrishdholakia in #10392 (see the request sketch after this list)
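
As a rough illustration of the x-litellm-api-key header support from #10392 (last item above): the sketch below sends a /chat/completions request to a locally running proxy on port 4000, matching the Docker command further down. The key sk-1234 and the model gpt-4o are placeholders, and treating this header as a drop-in alternative to the usual Authorization header is an assumption made for illustration, not confirmed documentation.

curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "x-litellm-api-key: sk-1234" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from the proxy"}]
  }'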

New Contributors

Full Changelog: v1.67.4-nightly...v1.67.5-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.5-nightly
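
Since STORE_MODEL_IN_DB=True persists model configuration in the proxy's database, a real deployment typically also needs a database connection string and a master key. A minimal sketch with placeholder values (the Postgres URL and sk-1234 are illustrative, not part of this release):

# Placeholder credentials: point DATABASE_URL at a reachable Postgres instance
docker run \
  -e STORE_MODEL_IN_DB=True \
  -e DATABASE_URL="postgresql://user:password@host:5432/litellm" \
  -e LITELLM_MASTER_KEY="sk-1234" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.67.5-nightly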

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 270.0 | 290.6566157251101 | 6.175800923917475 | 0.0 | 1848 | 0 | 232.84122400002616 | 2432.3238870000523 |
| Aggregated | Passed ✅ | 270.0 | 290.6566157251101 | 6.175800923917475 | 0.0 | 1848 | 0 | 232.84122400002616 | 2432.3238870000523 |

v1.67.4-stable

27 Apr 14:26

What's Changed

New Contributors

Full Changelog: v1.67.0-stable...v1.67.4-stable

v1.67.4-nightly

27 Apr 02:03

What's Changed

New Contributors

Full Changelog: v1.67.3.dev1...v1.67.4-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.4-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 220.0 | 247.38248619018765 | 6.061326672784343 | 0.0 | 1814 | 0 | 197.54123199999185 | 2435.6727050000018 |
| Aggregated | Passed ✅ | 220.0 | 247.38248619018765 | 6.061326672784343 | 0.0 | 1814 | 0 | 197.54123199999185 | 2435.6727050000018 |

v1.67.3.dev6

26 Apr 23:33

Full Changelog: v1.67.3.dev4...v1.67.3.dev6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.3.dev6

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 263.90902888665886 | 6.165203372220349 | 0.0 | 1844 | 0 | 210.97686299992802 | 2930.0805719999516 |
| Aggregated | Passed ✅ | 240.0 | 263.90902888665886 | 6.165203372220349 | 0.0 | 1844 | 0 | 210.97686299992802 | 2930.0805719999516 |

v1.67.3.dev4

26 Apr 22:30

What's Changed

New Contributors

Full Changelog: v1.67.3.dev1...v1.67.3.dev4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.3.dev4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 190.0 | 216.71376621493337 | 6.269037380681852 | 0.0 | 1875 | 0 | 164.64077799997767 | 4562.471842000036 |
| Aggregated | Passed ✅ | 190.0 | 216.71376621493337 | 6.269037380681852 | 0.0 | 1875 | 0 | 164.64077799997767 | 4562.471842000036 |

v1.67.3.dev1

24 Apr 06:01

What's Changed

Full Changelog: v1.67.2-nightly...v1.67.3.dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.3.dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 235.18092614107948 | 6.181088327781123 | 0.0 | 1850 | 0 | 192.45027600004505 | 4892.269687999942 |
| Aggregated | Passed ✅ | 210.0 | 235.18092614107948 | 6.181088327781123 | 0.0 | 1850 | 0 | 192.45027600004505 | 4892.269687999942 |

v1.67.2-nightly

24 Apr 05:56

What's Changed

New Contributors

Full Changelog: v1.67.1-nightly...v1.67.2-nightly

v1.67.1-nightly

22 Apr 22:55
a7db0df

What's Changed

Full Changelog: v1.67.0-nightly...v1.67.1-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.1-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 220.0 | 263.64700999835935 | 6.1132795166960605 | 0.0 | 1829 | 0 | 199.11094299999377 | 4358.182531000011 |
| Aggregated | Passed ✅ | 220.0 | 263.64700999835935 | 6.1132795166960605 | 0.0 | 1829 | 0 | 199.11094299999377 | 4358.182531000011 |

v1.67.0-stable

19 Apr 19:35
03b5399

What's Changed

New Contributors

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.0-stable

Full Changelog: v1.66.0-stable...v1.67.0-stable

v1.67.0-nightly

19 Apr 23:32

What's Changed

  • [Feat] Expose Responses API on LiteLLM UI Test Key Page by @ishaan-jaff in #10166
  • [Bug Fix] Spend Tracking Bug Fix, don't modify in memory default litellm params by @ishaan-jaff in #10167
  • Bug Fix - Responses API, Loosen restrictions on allowed environments for computer use tool by @ishaan-jaff in #10168

Full Changelog: v1.67.0-stable...v1.67.0-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.0-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 262.85419851041036 | 6.266552109647687 | 0.0 | 1873 | 0 | 202.24337799993464 | 5393.98836700002 |
| Aggregated | Passed ✅ | 230.0 | 262.85419851041036 | 6.266552109647687 | 0.0 | 1873 | 0 | 202.24337799993464 | 5393.98836700002 |