
out of memory issue #15

Open
jjhesk opened this issue Jul 26, 2016 · 3 comments


jjhesk commented Jul 26, 2016

I have uploaded an image with this spec:
13.62MB
original dimensions: 15822 x 1897

(attached image: basemap-1469553899965)

The issue is that it runs out of memory every time. On the Heroku free dyno it gives me an R14 error on the first try, when the RAM exceeds 500MB. I then pulled the code down and ran it on 64-bit Windows with 8GB of RAM.

I use parallelLimit = 5, which is the default setting, and I found that it crashes every time on zoom level 6. I think a better design would have gm release the RAM when starting the next zoom level; that would squeeze some memory out of the process. I can also think of another way: using a serial async process.

My suggestions/ideas:

  1. parallelLimit could be calculated based on the number of tiles
  2. use a serial async process (see the sketch below)
Maybe it would be great to have another callback/calculation strategy before firing up the command on gm or im, because you don't know what the computation constraints are in different environments.
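A minimal sketch of the serial approach from point 2, assuming the async and gm modules already used by the project, and a hypothetical tile descriptor shape ({ x, y, width, height, dest }):

```js
var async = require('async')
var gm = require('gm')

// Render the tiles of one zoom level strictly one after another.
// Only a single gm child process is alive at a time, which trades
// speed for a flat memory profile.
function renderZoomLevelSerially (sourceImage, tiles, done) {
  async.eachSeries(tiles, function (tile, next) {
    gm(sourceImage)
      .crop(tile.width, tile.height, tile.x, tile.y)
      .write(tile.dest, next)
  }, done)
}
```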

@martinheidegger (Owner)

I am sorry that your computers run out of memory. I think it might be good to have different async limits based on the size of the current zoom level:
zoom level 1 (biggest size): 1 parallel task
zoom level 2: 4 parallel tasks
zoom level 3: 8 parallel tasks
zoom level 4: 16 parallel tasks
etc.
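A rough sketch of that schedule (zoom level 1 being the largest image); the numbers are the ones suggested above, not something the module implements today:

```js
// Parallel limit per zoom level, following the 1, 4, 8, 16, ... schedule.
function parallelLimitForZoom (zoomLevel) {
  if (zoomLevel <= 1) return 1    // biggest image: one task at a time
  return Math.pow(2, zoomLevel)   // level 2 -> 4, level 3 -> 8, level 4 -> 16, ...
}

// e.g. async.parallelLimit(tasksForLevel, parallelLimitForZoom(level), done)
```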

jjhesk (Author) commented Jul 28, 2016

Could that number be calculated based on the memory available in the system? For example, if you have found out that the current system has 512MB, 1GB, 2GB, 3GB, ...

@martinheidegger (Owner)

No, it's a logical step: the image processing system needs to be able to process the biggest image. With every zoom step the image should shrink to 1/4 of its size, which means the memory usage should stay roughly constant at the minimum required for that biggest image.
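To illustrate with the 15822 x 1897 image from the first post (the pixel count is what drives the per-image gm footprint), halving both dimensions per zoom step drops it by roughly 4x each level:

```js
// Approximate pixel count per zoom level for a 15822 x 1897 source image.
var width = 15822
var height = 1897
for (var level = 1; level <= 6; level++) {
  console.log('level ' + level + ': ~' + (width * height / 1e6).toFixed(2) + ' Mpx')
  width /= 2
  height /= 2
}
// level 1: ~30.01 Mpx ... level 6: ~0.03 Mpx, so level 1 dominates the memory needed.
```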
