chore(deps): update container image docker.io/localai/localai to v2.7.0@b689c91 by renovate (truecharts#17756)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.6.1` -> `v2.7.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

### [`v2.7.0`](https://togithub.com/mudler/LocalAI/releases/tag/v2.7.0)

[Compare Source](https://togithub.com/mudler/LocalAI/compare/v2.6.1...v2.7.0)

<!-- Release notes generated using configuration in .github/release.yml at master -->

This release adds LLM support to the transformers backend as well! For instance, you can now run codellama-7b with transformers:

```
docker run -ti -p 8080:8080 --gpus all localai/localai:v2.7.0-cublas-cuda12 codellama-7b
```

More examples are available in the quickstart: https://localai.io/basics/getting_started/#running-models. (A sample request against the resulting API is sketched below, after these release notes.)

Note: as llama.cpp is undergoing changes that could possibly cause breakage, this release does not include the changes from ggerganov/llama.cpp#5138 (future versions will).

#### What's Changed

##### Bug fixes 🐛

- fix(paths): automatically create paths by [@mudler](https://togithub.com/mudler) in [https://github.com/mudler/LocalAI/pull/1650](https://togithub.com/mudler/LocalAI/pull/1650)

##### Exciting New Features 🎉

- feat(transformers): support also text generation by [@mudler](https://togithub.com/mudler) in [https://github.com/mudler/LocalAI/pull/1630](https://togithub.com/mudler/LocalAI/pull/1630)
- transformers: correctly load automodels by [@mudler](https://togithub.com/mudler) in [https://github.com/mudler/LocalAI/pull/1643](https://togithub.com/mudler/LocalAI/pull/1643)
- feat(startup): fetch model definition remotely by [@mudler](https://togithub.com/mudler) in [https://github.com/mudler/LocalAI/pull/1654](https://togithub.com/mudler/LocalAI/pull/1654)

##### 👒 Dependencies

- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1642](https://togithub.com/mudler/LocalAI/pull/1642)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1644](https://togithub.com/mudler/LocalAI/pull/1644)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1652](https://togithub.com/mudler/LocalAI/pull/1652)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1655](https://togithub.com/mudler/LocalAI/pull/1655)

##### Other Changes

- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1632](https://togithub.com/mudler/LocalAI/pull/1632)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://togithub.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1631](https://togithub.com/mudler/LocalAI/pull/1631)

**Full Changelog**: mudler/LocalAI@v2.6.1...v2.7.0

</details>
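The release notes above show how to start the container with codellama-7b. As a minimal sketch of how such an instance is typically queried (not part of this update): LocalAI serves an OpenAI-compatible HTTP API on the published port, so a chat-completions request along these lines should work, assuming the model name matches the one passed at startup.

```sh
# Minimal sketch: query the container started in the release notes above.
# Assumes LocalAI is reachable on localhost:8080 and that "codellama-7b"
# matches the model name given at startup.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "codellama-7b",
        "messages": [{"role": "user", "content": "Write a hello world program in Go"}],
        "temperature": 0.2
      }'
```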
---

### Configuration

📅 **Schedule**: Branch creation - "before 10pm on monday" in timezone Europe/Amsterdam, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://togithub.com/renovatebot/renovate).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzNy4xNTYuNCIsInVwZGF0ZWRJblZlciI6IjM3LjE1Ni40IiwidGFyZ2V0QnJhbmNoIjoibWFzdGVyIn0=-->