Releases: BerriAI/litellm

v1.76.0.dev2

28 Aug 23:08

Full Changelog: 1.76.0.dev1...v1.76.0.dev2

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.76.0.dev2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
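Once the container above is running, the proxy exposes an OpenAI-compatible `/chat/completions` endpoint on port 4000. A minimal sketch of the request body it expects — the model alias `"gpt-4o"` here is a placeholder, not something this release configures for you:

```python
import json

# Build an OpenAI-style request body for the proxy's /chat/completions
# endpoint (http://localhost:4000/chat/completions once the container is up).
# "gpt-4o" is a placeholder model alias; substitute whatever you configured.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}],
}
body = json.dumps(payload)
print(body)
```

You can POST this body with any HTTP client (e.g. `curl -X POST http://localhost:4000/chat/completions -H 'Content-Type: application/json' -d "$BODY"`), assuming the proxy is up and a model is configured.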

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 110.0 | 143.15 | 6.46 | 6.46 | 1933 | 1933 | 87.40 | 2923.38 |
| Aggregated | Failed ❌ | 110.0 | 143.15 | 6.46 | 6.46 | 1933 | 1933 | 87.40 | 2923.38 |
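The columns in this row are related: Requests/s and Failures/s are computed over the same run, so the run length and failure rate can be recovered from the counts and rates. A quick sanity check in Python, using the values from this run's `/chat/completions` row:

```python
# Values taken from this run's /chat/completions load-test row.
request_count = 1933
failure_count = 1933
requests_per_s = 6.459849558694928

# Throughput implies run length: count / rate, roughly a five-minute run.
duration_s = request_count / requests_per_s

# Every request failed in this run, so the failure rate is 100%.
failure_rate = failure_count / request_count
print(f"run length: {duration_s:.0f} s, failure rate: {failure_rate:.0%}")
```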

1.76.0.rc.1

27 Aug 16:52

What's Changed

New Contributors


1.76.0.dev1

27 Aug 22:36

Full Changelog: 1.76.0.rc.1...1.76.0.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-1.76.0.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 62 | 100.41 | 6.51 | 6.51 | 1948 | 1948 | 43.32 | 2949.63 |
| Aggregated | Failed ❌ | 62 | 100.41 | 6.51 | 6.51 | 1948 | 1948 | 43.32 | 2949.63 |

v1.76.0-nightly

24 Aug 06:03

What's Changed

Full Changelog: v1.76.0-stable-draft...v1.76.0-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.76.0-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 110.0 | 146.97 | 6.51 | 6.51 | 1948 | 1948 | 90.42 | 2697.18 |
| Aggregated | Failed ❌ | 110.0 | 146.97 | 6.51 | 6.51 | 1948 | 1948 | 90.42 | 2697.18 |

v1.76.0-stable-draft

23 Aug 20:33
Pre-release

What's Changed

New Contributors


v1.75.9-nightly

20 Aug 18:34

What's Changed

New Contributors

Full Changelog: v1.75.8-nightly...v1.75.9-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.75.9-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 98 | 143.26 | 6.44 | 0.0 | 1926 | 0 | 70.10 | 1988.13 |
| Aggregated | Passed ✅ | 98 | 143.26 | 6.44 | 0.0 | 1926 | 0 | 70.10 | 1988.13 |

v1.75.9.dev3

19 Aug 23:46

What's Changed

New Contributors

Full Changelog: v1.75.8-nightly...v1.75.9.dev3

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.75.9.dev3
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 130.0 | 167.24 | 6.37 | 0.0 | 1905 | 0 | 103.26 | 1092.40 |
| Aggregated | Passed ✅ | 130.0 | 167.24 | 6.37 | 0.0 | 1905 | 0 | 103.26 | 1092.40 |

v1.75.8-stable

25 Aug 15:38

What's Changed

Full Changelog: v1.75.8-nightly...v1.75.8-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.75.8-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 120.0 | 151.45 | 6.36 | 6.36 | 1905 | 1905 | 90.28 | 2454.09 |
| Aggregated | Failed ❌ | 120.0 | 151.45 | 6.36 | 6.36 | 1905 | 1905 | 90.28 | 2454.09 |

v1.75.5-stable

17 Aug 21:42

Full Changelog: v1.75.5.rc.1...v1.75.5-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.75.5-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

Name Status Median Response Time (ms) Average Response Time (ms) Requests/s Failures/s Request Count Failure Count Min Response Time (ms) Max Response Time (ms)
/chat/completions Passed ✅ 160.0 192.4000595502936 6.278691415340353 0.0 1879 0 117.62542199994641 1498.513451000008
Aggregated Passed ✅ 160.0 192.4000595502936 6.278691415340353 0.0 1879 0 117.62542199994641 1498.513451000008

v1.75.8-nightly

16 Aug 22:30

What's Changed

New Contributors

Full Changelog: v1.75.7-nightly...v1.75.8-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.75.8-nightly
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 140.0 | 192.19 | 6.31 | 0.0 | 1890 | 0 | 114.82 | 1309.54 |
| Aggregated | Passed ✅ | 140.0 | 192.19 | 6.31 | 0.0 | 1890 | 0 | 114.82 | 1309.54 |