Batch processing speeds things up somewhat at larger batch sizes, since we no longer create a new ONNX inference session for each image and only read the tag CSV once per batch.
The model itself does not use batched inference: the available ONNX models appear to have been compiled with batch size = 1, or I have not figured out how to pass the tensors to the input correctly. I looked at code from the kohya scripts, but retrieving the supported batch size that way has not worked inside the ComfyUI tagger (though that might again be down to my incorrect tensor passing).
Even without batched model inference, this still gives a speedup of about 18-20% in my tests with a batch size of 6.
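The gain comes purely from hoisting the per-image setup out of the loop. A minimal sketch of that restructuring (the `make_session` and `read_tag_csv` callables here are hypothetical stand-ins for `onnxruntime.InferenceSession(model_path)` and the CSV parse in the tagger, so the call counts can be shown with stubs; the 0.35 threshold is just an example):

```python
def tag_batch(image_tensors, make_session, read_tag_csv):
    """Tag a batch of preprocessed images, creating the ONNX session
    and reading the tag CSV once instead of once per image."""
    session = make_session()   # e.g. onnxruntime.InferenceSession(model_path)
    tags = read_tag_csv()      # parse the label CSV a single time
    results = []
    for tensor in image_tensors:
        # The model still runs with batch size 1, so images are fed one at a time.
        probs = session(tensor)
        results.append([t for t, p in zip(tags, probs) if p > 0.35])
    return results

# Stub session factory and CSV reader so the amortisation is visible.
calls = {"session": 0, "csv": 0}

def fake_session():
    calls["session"] += 1
    return lambda tensor: [0.9, 0.1, 0.6]  # fixed pseudo-probabilities

def fake_csv():
    calls["csv"] += 1
    return ["tag_a", "tag_b", "tag_c"]

out = tag_batch([[0], [1], [2]], fake_session, fake_csv)
print(calls)   # one session creation and one CSV read for three images
print(out[0])  # tags whose pseudo-probability exceeds the threshold
```

The per-image cost that remains is the single-image `session.run` call itself; everything else (session construction, CSV parsing) is paid once per batch.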