Hello! I was trying to use your model to parse a reasonably large amount of text into SQL (a corpus similar in size to Spider). I used the "serve" mode and allotted 8 GB to the Docker container, plus 2 GB of swap. At the beginning everything ran fine, but as more and more text got parsed, the container's memory usage kept growing until it finally exited with error 137 (OOM).
What can I do to fix or avoid this issue? Any advice is appreciated. Thank you!
Hi @sythello,
Thanks for trying out our code! I'm sorry to hear that you are having issues with memory consumption. I have not run into that problem myself, but I'm also working under much looser constraints: I usually allocate 32 GB when I run this code.
Can you say which process in the container is responsible for the issue?
Torsten
Hi Torsten,
Thanks for the prompt reply! I just checked the container. The relevant processes I can see are "python" and "picard"; "python" is the one whose memory usage keeps increasing. Not sure whether this is helpful...
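For anyone else trying to confirm which process is growing: one way (a sketch, assuming you can add a few lines to the serving script or run a helper inside the container) is to log the process's peak resident set size with Python's standard `resource` module:

```python
import resource
import sys

def peak_rss_mb() -> float:
    """Return this process's peak resident set size in MiB."""
    ru_maxrss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in kilobytes on Linux but in bytes on macOS.
    in_bytes = ru_maxrss if sys.platform == "darwin" else ru_maxrss * 1024
    return in_bytes / (1024 * 1024)

# Example: print this after every batch of parsed queries; a value that
# climbs steadily without plateauing suggests a leak or unbounded cache.
print(f"peak RSS: {peak_rss_mb():.1f} MiB")
```

Logging this every N requests makes it easy to see whether memory grows linearly with the number of parsed inputs.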
Hi @sythello, that is helpful, thanks! Unfortunately, I don't have a solution for you at this time. The easiest "fix" would be to give the Docker daemon more memory.
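For anyone hitting the same limit, raising the container's cap uses Docker's standard memory flags. This is only an illustration; the image name and port below are placeholders, not the actual PICARD invocation (on Docker Desktop you would also need to raise the VM's memory limit in the settings):

```shell
# Allow the container up to 16 GiB of RAM and 4 GiB of additional swap.
# Note: --memory-swap is the TOTAL of memory + swap, so 16g + 4g = 20g.
docker run --memory=16g --memory-swap=20g -p 8000:8000 your-picard-image
```

This only postpones an OOM if the process leaks; it buys headroom rather than fixing the growth itself.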