Description
Thanks so much for your help and this amazing software!
LocalAI version:
commit d65214a (2024-04-24)
Environment, CPU architecture, OS, and Version:
Linux server 5.15.0-102-generic #112-Ubuntu SMP Tue Mar 5 16:50:32 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Describe the bug
Calling Chroma.from_documents() with either LocalAIEmbeddings or OpenAIEmbeddings from langchain as the embedding function causes:
ERR Server error error="rpc error: code = Unavailable desc = error reading from server: EOF" ip=192.168.X.XXX latency=2.232229359s method=POST status=500 url=/embeddings
To Reproduce
Run local-ai compiled with CUBLAS, then call Chroma.from_documents() with LocalAIEmbeddings or OpenAIEmbeddings as the embedding function.
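A minimal reproduction sketch of the failing call. This assumes a LocalAI server listening at http://localhost:8080 and the langchain-community package; the model name and server address here are placeholders, not the exact values from my setup:

```python
# Hypothetical minimal reproduction; requires a running LocalAI server
# and the langchain-community / chromadb packages.
from langchain_community.embeddings import LocalAIEmbeddings  # OpenAIEmbeddings fails the same way
from langchain_community.vectorstores import Chroma
from langchain_core.documents import Document

docs = [
    Document(page_content="LocalAI exposes an OpenAI-compatible /embeddings endpoint."),
    Document(page_content="Chroma stores the resulting vectors."),
]

embeddings = LocalAIEmbeddings(
    openai_api_base="http://localhost:8080/v1",  # placeholder server address
    model="bert-embeddings",                     # placeholder model name
)

# This is the step that triggers the 500 / "EOF" error on the server side:
db = Chroma.from_documents(docs, embeddings)
```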
Expected behavior
I expected this to create a vector database and properly return the embeddings.
Logs
embedding_log.txt
Additional context
So far only the bert-embeddings backend works properly for me, which makes me suspect an installation issue on my end. The sentence-transformers backend produces incorrect embeddings, and attempting to run a model in gguf format gives an error mentioned in another issue.
Additionally, sending a curl request or a basic request through OpenAI's Python package to the server works perfectly fine; the failure occurs only with the from_documents method. I suspect it is related to the two warning messages in the attached logs.
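One difference between the working and failing cases, as far as I can tell, is the request shape: a basic curl/client request sends a single string as input, while from_documents embeds every chunk and the langchain embedding classes send a batch (a list of inputs) to /embeddings. A sketch of the two payloads, assuming the standard OpenAI-compatible request format (the model name is a placeholder):

```python
import json

# Shape of a basic single-string request, as sent by curl or the OpenAI client:
single = {"model": "bert-embeddings", "input": "hello world"}

# Shape of the request issued when embedding documents in bulk:
# the input field becomes a list of strings rather than one string.
batched = {
    "model": "bert-embeddings",
    "input": ["chunk one", "chunk two", "chunk three"],
}

print(json.dumps(single))
print(json.dumps(batched))
```

If the backend only handles the single-string form, that could explain why simple requests succeed while from_documents hits the EOF error.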