If one image cluster of the novelty detection is very large (e.g. 4k images), the novelty detection can run out of disk space, because it first generates a (64 MB) novelty map for each image and only then post-processes the maps. This can't easily be avoided: all novelty maps are needed to determine the segmentation threshold, and the original maps are needed again afterwards to generate the actual segmentation. But maybe the novelty maps could be compressed to allow processing of larger image collections?
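A minimal sketch of how compression could fit the existing two-pass flow, assuming the maps are NumPy float32 arrays kept on disk between passes. `compute_novelty_map`, the file names, and the threshold criterion are placeholders, not the actual implementation:

```python
import numpy as np

# Hypothetical stand-in for the per-image novelty map computation;
# real maps are ~64 MB float arrays produced by the detection step.
def compute_novelty_map(image_index, shape=(512, 512)):
    rng = np.random.default_rng(image_index)
    return rng.random(shape, dtype=np.float32)

n_images = 8  # a real cluster may hold thousands of images
map_paths = []

# Pass 1: write each map compressed (zlib via savez_compressed)
# instead of as a raw array dump. Smooth real-world maps compress
# far better than this random demo data; casting to float16 first
# would roughly halve the size again at some loss of precision.
for i in range(n_images):
    novelty_map = compute_novelty_map(i)
    path = f"novelty_map_{i}.npz"
    np.savez_compressed(path, novelty=novelty_map)
    map_paths.append(path)

# Determine the global segmentation threshold from all maps
# (a simple maximum-based rule here; the real criterion may differ).
threshold = max(np.load(p)["novelty"].max() for p in map_paths) * 0.9

# Pass 2: reload each compressed map and derive the segmentation.
for p in map_paths:
    novelty_map = np.load(p)["novelty"]
    segmentation = novelty_map >= threshold
```

The trade-off is extra CPU time for compression and decompression in exchange for lower peak disk usage; since the maps are only written once and read twice, that cost may be acceptable for large clusters.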